Abstract
Background: Patient portals are becoming ubiquitous. Previous research has documented substantial barriers, especially among vulnerable patient subgroups such as those with lower socioeconomic status or limited health literacy (LHL). We tested the effectiveness of delivering online, video-based portal training to patients in a safety net setting.
Methods: We created an online video curriculum about accessing the San Francisco Health Network portal, and then randomized 93 English-speaking patients with 1+ chronic diseases to receive 1) an in-person tutorial with a research assistant, or 2) a link to view the videos on their own. We also examined a third, nonrandomized usual care comparison group. The primary outcome was portal log-in (yes/no) 3 to 6 months post-training, assessed via the electronic health record. Secondary outcomes were self-reported attitudes and skills collected via baseline and follow-up surveys.
Results: Mean age was 54 years, 51% had LHL, 60% were nonwhite, 52% were female, 45% reported fair/poor health, and 76% reported daily Internet use. At follow-up, 21% logged into the portal, with no differences by arm (P = .41), but this was higher than the overall clinic rate of 9% (P < .01) during the same time period. We found significant prepost improvements in self-rated portal skills (P = .03) and eHealth literacy (P < .01). Those with LHL were less likely to log in post-training (P < .01).
Conclusions: The two training modalities performed comparably, and neither enabled a majority of vulnerable patients to use the portal, especially those with LHL. This suggests that portal training will need to be more intensive, or portal usability substantially improved, to meaningfully increase use among diverse patients.
- Chronic Disease
- Electronic Health Records
- Health Literacy
- Information Technology
- Primary Health Care
- Telemedicine
Online patient portal use has expanded exponentially because of financial incentives of the Meaningful Use program.1 These Web sites allow patients online access to medical records through features such as lab results and secure messaging2—representing patient-centered efforts to improve efficiency and convenience of care. Some studies suggest portal Web sites may help improve patient outcomes,3–5 such as improving processes that ultimately affect health behaviors (eg, simpler refill processes leading to better medication adherence).6–8
However, one of the biggest challenges for widespread portal use has been the lack of uptake among diverse patient populations.9 Across the overwhelming majority of studies, patients from racial/ethnic minority backgrounds and those with limited health literacy have been significantly less likely to be portal users,10–15 even when accounting for Internet use in everyday life. Furthermore, usability studies of existing portal interfaces have shown considerable barriers, especially among those with limited health literacy.16–19
Therefore, there remains a large implementation gap in research and clinical practice to engage a broader set of patients in portal use. Increasing portal use among diverse patients is relevant both for improving access to care and for potentially mitigating future health care disparities as systems continue to deliver more care electronically.20 Because previous research specifically identified training and additional support as a primary means to reduce barriers to use,21,22 we designed and evaluated an online portal training program for patients in a safety net health care setting. In a randomized trial, we evaluated the impact of implementing different modes of training on subsequent portal use rates.
Methods
Study Setting
The San Francisco Health Network provides primary care to over 63,000 patients/year as the city's only public health care delivery system. The majority of patients are nonwhite and on Medicaid/uninsured.23 Launched in January 2015, the online patient portal (called MYSFHEALTH) allows patients to access their visit summaries, lab results, and health education materials online. To register, patients initiate signup in person at their clinic and receive an email to activate their account on their own. At the time of the study, 35% of primary care patients had initiated signup for MYSFHEALTH.
MYSFHEALTH Training Content
Informed by documented usability barriers to portal use in our setting,17 we worked in close consultation with the library at Zuckerberg San Francisco General Hospital as well as with 2 advisory boards to iterate both the content and format of our training curriculum. Our patient advisory board comprised 10 members who were primary care patients or caregivers, and our project advisory board included 9 experts from health technology research, adult learning and literacy, and clinical administration/operations. Over the course of 4 in-person meetings, we created a final portal training curriculum with simple instructions and 11 how-to videos for accessing MYSFHEALTH (patient story, getting started, signing up, signing in, creating a username, creating a password, accessing the homepage, accessing a visit summary, reviewing lab results, using the online health library, and message from a health care provider—all videos available on request). Participants could watch all videos sequentially, or skip to relevant topics as preferred. Our iterations simplified the content to better match the digital and health literacy levels of the population. In particular, these videos used audio (with captions) for explaining the portal functionality as well as screenshots of how to access each feature. When completed, we put the training content onto an online learning platform, LearnerWeb,24 which allowed us to track individual access of the videos during the study. At the conclusion of the study, we moved all video content to the Network's Web site for public access.
Trial Recruitment
From June to October 2016, we recruited participants from 2 primary care clinics, 1 based on the campus of Zuckerberg San Francisco General Hospital and 1 community-based clinic. Through a query of the electronic health record (EHR), we generated a list of patients who had clinic visits before the study (April to July 2016) meeting the following criteria: 1) English speaking (as the portal was only available in English in our setting), 2) age 18 years or older, and 3) diagnosed with a chronic condition (as previous studies have shown that portal use is highest among those managing a chronic illness25). Providers reviewed the lists and excluded individuals with cognitive or visual impairment, severe mental health conditions, or other barriers to enrollment. Through phone screening, we further excluded individuals without email addresses (as an email address was necessary for portal registration) and those who already self-reported using MYSFHEALTH.
Randomization
We randomized patients during an in-person session to receive: 1) an in-person tutorial with a trained research assistant versus 2) a link to access the online tutorial on their own. During this in-person enrollment session, all participants received an informational pamphlet that was disseminated within the general clinic population outlining key features of the patient portal, and were guided through the steps of signing up for a LearnerWeb account to access the training materials. For participants randomized to the in-person training arm, a trained research assistant prompted participants to log into the learning platform and guided them in accessing the training materials. The staff member provided further explanation or clarification if participants had questions about the training material. Participants in the take-home arm were given a printed handout with a link to the training materials and an outline of the steps for accessing the training curriculum. The research assistants delivering the intervention were blinded to the randomization allocation until after the consent process was complete.
The trial is registered at clinicaltrials.gov as NCT03354000.
Access of Training Curriculum
Using LearnerWeb, we measured whether participants accessed the training videos, categorized as yes/no and number of videos watched (from 0 to 11).
Portal Signup and Use
Portal outcomes were assessed via EHR chart review as 1) initiating signup for portal access (yes/no), and 2) logging into the portal after the sign-up process was complete (yes/no and total number of logins). Our primary outcome was the binary assessment of portal log-in during the follow-up period of 3 to 6 months post-training (mean, 112 days; range, 82 to 192 days), with an estimated sample size of 100 to detect a 25% difference in portal use (10% vs 35%) with 80% power. Because portal enrollment is tied to an in-person sign-up process, we examined portal use data after the participant's next visit whenever possible, using 3 to 6 months as a time frame in which most patients with chronic diseases were generally scheduled for a follow-up appointment.
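For reference, a sample size calculation of this kind (two independent proportions, 80% power, two-sided alpha of .05 assumed) can be reproduced with Stata's power command. The sketch below is illustrative only and is not the study's original code.

```
* Required sample size to detect a difference in portal log-in rates
* of 10% vs 35% between two arms, with 80% power and two-sided alpha = .05
power twoproportions 0.10 0.35, power(0.8) alpha(0.05)
* Stata reports the required number of participants per group and in total
```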
Clinic Comparison Group
To obtain comparison outcomes in a third usual care cohort, we performed an additional, nonrandomized EHR data pull of all patients who had visited the 2 primary care clinics during the recruitment time period (April to July 2016) and followed them through the same follow-up period (July to December 2016). We used the same portal use outcome ascertainment in this group.
Survey Measures
We also collected patient survey measures at baseline (in-person) and follow-up (via phone). Participants could receive in-person support from the research assistant at any time to complete their written baseline survey. Demographics included age, gender, race/ethnicity, and highest education completed. We assessed health literacy status using a screener about confidence in filling out medical forms independently.26 We categorized participants noting any lack of confidence filling out medical forms as having limited health literacy based on previous work.10,11,17,27 We asked participants how well they spoke English and categorized less than “very well” as limited English proficiency. Participants also reported how often they used the Internet/email (daily, weekly, or less), as well as their specific chronic condition(s).
To assess portal interest/attitudes and skills, we also asked a series of self-reported items at both baseline and follow-up. Interest and attitude items included 1) usefulness of the Internet for making health decisions (5-point Likert from not at all to very useful), 2) importance of getting medical information electronically (3-point Likert from not at all to very important),28 3) confidence in safeguards for online medical records (3-point Likert from not at all to very confident),28 4) interest in using the MYSFHEALTH portal Web site to see their medical record (5-point Likert from no interest to high interest), and 5) interest in using specific potential portal features (4-point Likert scales from not at all to very interested28). Next, we assessed their self-reported skills by asking about 1) self-rated skills to use a Web site to manage health care (5-point Likert from strongly disagree to strongly agree), 2) confidence in logging into MYSFHEALTH without help (scored 1 to 10), 3) confidence in using MYSFHEALTH to improve their health (scored 1 to 10), and 4) self-reported eHealth literacy using 4 items of a validated scale (such as “I know how to use the Internet to answer my health questions”29).
Next, we assessed patient perceptions of their health care that we hypothesized could be impacted by portal use, either because of increased convenience or access to specific health care information. This included the Patient Assessment of Chronic Illness Care30 and a modified version of the Self-Efficacy for Managing Chronic Disease Scale31 (adjusted from 11th to 4th grade Flesch-Kincaid readability level).
The follow-up surveys also included an open-ended question about the reason(s) for portal nonuse among those without any log-ins documented in the EHR.
Participants received $25 gift cards to a major retailer for completing the baseline and follow-up portions of the study.
Analysis
We summarized participant baseline characteristics by trial arm. Next, we examined the portal use outcomes using χ2 tests by study arm. We completed intent-to-treat analyses among all participants, as well as a per-protocol examination comparing participants exposed versus not exposed to the online training materials.
Because there were no statistical differences or consistent patterns by study arm, we collapsed all participants for the remaining secondary statistical analyses. We first compared the overall portal sign-up and use in the trial to the total rates in usual care during the same time period, using 2-sample tests of proportions. We also examined changes over time in self-reported survey measures, using paired t-tests for continuous variables and McNemar's test for categorical assessments. In addition, we looked at unadjusted differences in portal use in the follow-up period by participant characteristics using χ2 tests. We performed all analyses using Stata 14.2 (StataCorp, College Station, TX).
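A minimal sketch of how these comparisons could be run in Stata follows; the variable names (arm, login, cohort, ehealth_base, ehealth_fu, skills_base, skills_fu) are hypothetical placeholders rather than the study's actual dataset or code.

```
* Chi-squared test of portal log-in (yes/no) by randomized arm
tabulate arm login, chi2

* Two-sample test of proportions: trial participants vs usual care comparison group
prtest login, by(cohort)

* Paired t-test for change in a continuous measure (eg, eHealth literacy score)
ttest ehealth_fu == ehealth_base

* McNemar's test for change in a paired binary measure (eg, reporting sufficient portal skills)
mcc skills_fu skills_base
```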
Finally, to analyze participant-reported reasons for portal nonuse at follow-up, 2 members of the analytic team categorized the reasons into distinct categories, meeting in person to establish consensus.
Results
Figure 1 displays the recruitment and follow-up of the trial sample.32 We enrolled 93 participants in the trial, collecting portal use outcomes on 88 individuals (95% follow-up overall; 94% take-home and 95% in-person). Follow-up surveys were collected on 75 individuals (81% follow-up overall; 78% take-home and 84% in-person).
Figure 1. Trial recruitment flowchart.
Participant Characteristics
The mean age of the sample was 54 years, and 51% had self-reported limited health literacy; 61% were nonwhite, 52% were female, 45% were in fair/poor health (Table 1). A quarter of the sample reported limited English proficiency, despite completing the phone screening process in English. A majority used the Internet and email daily (76% and 65%, respectively). Participants were balanced between trial arms, with the exception of anxiety.
Table 1. Baseline Participant Characteristics, Overall and by Trial Arm
Randomized Trial Findings: Portal Signup and Use
The main trial results are shown in Table 2. By design, everyone in the in-person training arm watched the videos at least once, compared with 43% in the take-home arm (P < .001). The average number of video lessons viewed (out of 11) was 3.4 (SD = 3.6; median, 2; range, 0 to 11), with 1.3 average views in the take-home arm and 5.7 average views in the in-person arm. The most watched lesson topics were for accessing lab results (46%) and signing up (43%), while the least-watched video topics were creating usernames (16%) and passwords (15%).
Table 2. Primary Trial Outcomes: Portal Initiation and Use at 3- to 6-Month Follow-Up, Assessed via Electronic Health Record
Overall, 18 participants (21%) logged in to the portal at least once during the follow-up period and another 17 (20%) were newly Web-enabled during a clinic visit to initiate the portal sign-up process. For the intention-to-treat analysis, the proportion of patients logging into the portal did not differ by in-person versus take-home arm (P = .8). The mean number of logins to the portal Web site was 1.3 overall (SD = 3.1; range, 0 to 15) and 4.7 among those who logged in at least once (SD = 4.4; range, 1 to 15), and this also did not differ by study arm (P = .54 and P = .53, respectively). Similarly, there was no difference in portal sign-up rates at follow-up by study arm (P = .9). In the per-protocol analysis, those who had watched at least 1 of the videos (vs none) had a slightly higher proportion of logging into the portal (23% vs 15%), but this difference was not statistically significant. There were no significant differences in portal outcomes by clinic site (data not shown).
Usual Care Comparison
When comparing portal use to the general clinic rates over the same time (Figure 2), rates of portal sign-up and log-in were over twice as high: 20% in the trial compared with 8% in usual care for initiating the portal sign-up process (P < .001), and 21% in the trial compared with 9% in usual care for portal logins (P < .001). The mean number of portal logins within the usual care comparison sample was also lower at 0.49 (SD = 4.2).
Figure 2. Portal initiation and use at 3- to 6-month follow-up, comparing trial participants with the usual care comparison group.
Participant Attitudes/Skills and Perceptions of Health Care
We found significant changes in patients' self-reported skills in using the portal Web site from the baseline to follow-up surveys (Table 3), with the proportion reporting that they agreed they had sufficient ability to use the Web site increasing from 63% to 78% (P = .03). Similarly, there was a significant increase in the eHealth literacy scale over time (14.4 to 16.2, P < .001). However, we found a decrease in self-reported interest in using the portal, from 53% to 39% (P = .01).
Table 3. Changes in Participants' Self-Reported Survey Measures from Baseline to Follow-Up at 3 to 6 Months Post-Training
Participant Demographics and Portal Use
In a secondary analysis, we examined portal use during follow-up by relevant participant characteristics (Figure 3). Participants who were 60 years or older were more likely to have logged in to the portal than their younger counterparts (32% vs 13%, P = .03). In addition, portal use was higher among participants with adequate compared with limited health literacy (35% vs 7%, P < .01), and among participants with 1 versus 2+ chronic conditions (35% vs 15%, P = .03).
Figure 3. Rates of portal use at 3- to 6-month follow-up by key patient demographic and health characteristics.
Reasons for Nonuse
Among 38 participants who reported reasons for not using the portal at follow-up, the most common reasons were being too busy or occupied with personal issues (24%), not having the need to use it (18%), being concerned about security (13%), needing more help to sign up (8%), misplacing written information from the training (8%), and having limited access to a computer/Internet (8%).
Discussion
We found that an online, video-based portal training program resulted in moderate use of the portal in subsequent months. Participants' self-reported confidence in using the Web site and eHealth literacy appeared to be the most malleable to improvement post-training. The mode of training (in-person vs take-home) did not affect ultimate portal use, suggesting that any 1-on-1 session highlighting the Web site and providing general information about enrollment may prompt some level of increased patient use. Therefore, the scalability of our training may be promising, especially among systems with a way to deliver a brief in-person educational session. A secondary examination of overall portal use during the same time period suggests that the trial participants had substantially higher use than usual care.
Yet our training was not successful in getting a majority of patients to become portal users. Our portal enrollment process at the San Francisco Health Network protects patients' security by requiring in-person verification to sign up, but this represents a high bar for widespread adoption. Furthermore, previous studies have shown that providers/staff and system implementation directly influence individual-level patient portal use,33–35 and our system was early in developing the clinic workflows needed to fully support patient use. While we were not able to harness specific patient-provider or patient-staff discussions about portal use to time the delivery of training in our study, future work capitalizing on clinic and provider support for portal use will be critical. Finally, the usability of existing portal interfaces has been shown to be subpar,16–19,36 which represents an additional barrier to patient log-in. In fact, patients in our study reported less interest in using the portal at follow-up, perhaps suggesting that our system's implementation and enrollment process and current portal functionality may be less appealing than they had envisioned.
One previous study evaluated a nurse-led portal training for primary care patients enrolled in an ongoing chronic disease care coordination program, reporting prepost improvements in functional status and reduced emergency department visits.37 That study found no improvement in chronic disease self-efficacy, similar to our findings, but reported much higher rates of portal use among participants (perhaps because of higher levels of patient educational attainment).
It is important to note that a large proportion of patients in our Network were not eligible for this study, given the high number of non-English speakers and lower rates of email use in our health care setting. The lack of portal accessibility in multiple languages, as well as the need for a higher level of existing digital literacy skills to sign up, represent substantial barriers to use among diverse populations.38 Even within our sample that was purposefully screened for these issues, we delivered training to patients reporting limited English proficiency and a lack of regular Internet and/or email use—2 groups who likely need more intensive assistance to meaningfully access portal Web sites.
Overall, our findings suggest a need for more computer/digital literacy training and support, especially given reported barriers related to technology at follow-up among eligible participants. More vulnerable patients, particularly those with limited health literacy and the sickest patients, were less likely to be portal users, similar to previous work.9,11,17,39 Patients with limited health literacy had particularly low rates of portal use, underscoring a need for tailored training or support for this subpopulation. Interestingly, those over age 60 years were more likely to be portal users post-training (which is in contrast to other work34,39); this population might have specifically benefited from the brief training to orient them to the Web site.
Our study was limited by a modest sample size. In addition, our work was conducted within clinics in the same urban safety net system, which may limit generalizability, especially to more resourced health care settings with different enrollment strategies and more affluent patients. We also used self-reported health literacy, which could differ from direct assessment of health literacy status such as the Short Test of Functional Health Literacy in Adults (S-TOFHLA). Finally, our prepost changes in the self-reported patient survey outcomes lack an equivalent control group and could be subject to regression to the mean.
Moving forward, we cannot forget the basic training needs of diverse patients to be able to use existing health technologies. In contrast to a “build it and they will come” approach, portals will not be universally effective without sufficient training and support for broader uptake. In addition, health literacy (and likely English proficiency, which we could not study fully given our lack of non-English functionality) continues to present barriers to use. We need better research and implementation strategies focused on patients with communication barriers to address this issue more explicitly.
Notes
This article was externally peer reviewed.
Funding: The authors of this publication have conducted this research with support from the Agency for Healthcare Research and Quality (R00HS022408) and the National Library of Medicine–National Institutes of Health (NIH NLM; G08LM012166). US has grant support from the NIH National Cancer Institute (NCI; K24CA212294-01). NR receives funding from the Agency for Healthcare Research and Quality (1K08HS022561 and P30HS023558). DS receives support from the NIH NLM (R01 M012355–01A1) and the National Institute of Diabetes and Digestive and Kidney Diseases through the HMORN–UCSF Center for Diabetes Translational Research (NIDDK; 2P30 NIDDK092924-06).
Conflict of interest: none declared.
To see this article online, please go to: http://jabfm.org/content/32/2/248.full.
- Received for publication September 5, 2018.
- Revision received November 6, 2018.
- Accepted for publication November 30, 2018.