Who stops telemonitoring disease activity and who adheres: a prospective cohort study of patients with inflammatory arthritis

Abstract

Background

The use of frequent electronic patient-reported outcome measures (ePROs) enables monitoring disease activity at a distance (telemonitoring) in patients with inflammatory arthritis (IA). However, telemonitoring studies report declining long-term adherence to reporting ePROs, which may undermine the benefits of telemonitoring. The objective of this study was therefore to investigate which factors are associated with (non-)adherence to telemonitoring with a weekly ePRO in patients with IA.

Methods

We performed a prospective cohort study in patients with rheumatoid arthritis (RA), psoriatic arthritis (PsA), and ankylosing spondylitis (AS) at Reade Amsterdam, The Netherlands. Patients telemonitored their disease activity weekly for 6 months with a modified Multidimensional Health Assessment Questionnaire completed in a smartphone application. The primary outcome was time to dropout, defined as ≥ 4 consecutive weeks of nonresponse. Based on the literature and expert meetings, a predefined set of 13 baseline factors was selected, and its association with time to dropout was assessed through a multivariable Cox regression analysis.

Results

A total of 220 consecutive patients were included (mean age 54, SD 12; 55% women; 99 RA, 81 PsA, and 40 AS). A total of 141 patients (64%) dropped out, with a median time to dropout of 17 weeks (IQR 9–26). Women had a significantly higher risk of dropout over 6 months than men (HR 1.58, 95% CI 1.06–2.36).

Conclusion

Within the set of investigated factors, women stopped reporting the weekly ePRO sooner than men. Future focus group discussions will be held to investigate the reasons for dropout, and specifically why women dropped out sooner.

Trial registration This trial was prospectively registered at www.trialregister.nl (NL8414).

Background

Inflammatory arthritis demands long-term disease activity monitoring [1], which traditionally requires an outpatient clinic visit to assess disease activity with composite measures such as the Disease Activity Score 28 (DAS28) or the Clinical Disease Activity Index (CDAI) [2, 3]. However, with electronic patient-reported outcomes (ePROs), disease activity can be measured longitudinally at a distance (telemonitoring) in between visits. This reduces the number of clinical visits while maintaining tight disease control, and increases patient satisfaction regarding both shared decision making and the physician’s awareness of disease fluctuations [4,5,6].

Although high adherence to reporting ePROs through mobile applications (apps) was observed in recent telemonitoring trials, adherence often declined over time, reducing the potential benefits [7,8,9]. For example, adherence decreased from 88 to 62% during the 6-month study of Lee et al. [10]. In addition, Seppen et al. reported adherence rates declining from > 90% in week one to less than 50% in week four [11]. Identifying factors related to nonadherence increases our understanding of adherence to reporting ePROs over time and may reveal factors that can be influenced to improve adherence.

Multiple models, such as the Unified Theory of Acceptance and Use of Technology (UTAUT) and the Technology Acceptance Model (TAM), have been constructed to explain the adoption and usage behavior of new technologies [12, 13]. A recent systematic review on adherence to telemonitoring with ePROs in patients with chronic diseases showed that lower eHealth literacy (the ability to seek and use health information from electronic resources) and the presence of comorbidity may act as barriers to adherence [14]. Within rheumatology specifically, Colls et al. retrospectively found in rheumatoid arthritis (RA) patients that higher disease activity was associated with lower adherence to reporting ePROs, and age over 65 years with higher adherence [15]. Other presumably important factors, such as gender and educational level, were not significantly associated with adherence. In a qualitative study, receiving appropriate training, clarity of instructions, and a simple user interface of a mobile application (app) were identified as potential facilitators of higher adherence, whereas experiencing technical issues with the app was a potential barrier [9, 16]. In conclusion, evidence on which factors influence adherence to reporting ePROs is limited, and these factors are often tested retrospectively, univariably, or in a limited combined set. Yet, comparable with medication compliance, adherence is a complex concept in which a multitude of factors correlate with adherence and possibly with each other.

Therefore, the objective of this study was to prospectively explore the association between a combined set of patient- and clinical-related factors and (non-)adherence to telemonitoring with a weekly ePRO through a smartphone app in patients with inflammatory arthritis.

Methods

Study design

We performed a 6-month prospective cohort study at Reade, a center for rehabilitation and rheumatology, in Amsterdam, The Netherlands, from April 2020 to June 2021. The protocol was registered at the ICTRP Search Portal (who.int) (NL8414) on 28-02-2020. Patients continued routine clinical care, with the addition of weekly telemonitoring of their disease activity through an ePRO questionnaire completed in a smartphone application designed specifically for this purpose: the MijnReuma Reade app [11].

The app has been extensively described in previous publications [11, 17]. In short, the MijnReuma Reade app was developed at Reade to enable patients to monitor their symptoms and disease activity weekly at home with a modified version of the Multidimensional Health Assessment Questionnaire (MDHAQ), extended with a single flare question, see Table 1 [17, 18]. The results were displayed as text supported by graphs for disease activity, pain, function, overall wellbeing, fatigue, and morning stiffness. Patients received a badge notification when a new ePRO questionnaire became available, and a reminder was sent after three days if the ePRO questionnaire had not yet been filled in. The app transferred the patients' data in real time to their Electronic Medical Record (EMR) at Reade, making the results directly visible to the health care providers at Reade. The app is secured by two-factor authentication, is CE certified, and complies with Dutch privacy and security laws [19].

Table 1 The weekly ePRO questionnaires included in the MijnReuma Reade app [17]

Study outcome

Adherence and nonadherence to reporting ePROs were measured after 6 months. Following the adherence framework of O’Brien et al., we considered multiple metrics to define (non-)adherence [20]. We deemed time to dropout the most suitable metric for nonadherence and used it as the primary outcome of this study. Dropout was defined as ≥ 4 consecutive weeks of nonresponse to the ePRO, based on the 90th percentile of all average user gaps [21]. Adherence was the secondary outcome, measured as the User Activity Ratio (UAR): the number of reported ePROs divided by the number of potentially reported ePROs (26 in this study) × 100.
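The two outcome metrics can be illustrated with a short sketch over a patient's weekly response log. The function names and the choice to report the first week of the ≥ 4-week gap as the dropout week are our own illustration; the study's actual analysis code is not published:

```python
def uar(responses):
    """User Activity Ratio: completed ePROs / scheduled ePROs * 100."""
    return 100.0 * sum(responses) / len(responses)

def time_to_dropout(responses, gap=4):
    """Return the 1-based week that starts the first run of >= `gap`
    consecutive missed weekly ePROs, or None if the dropout rule
    (>= 4 consecutive weeks of nonresponse) is never met."""
    run_start, run_len = None, 0
    for week, answered in enumerate(responses, start=1):
        if answered:
            run_start, run_len = None, 0
        else:
            if run_len == 0:
                run_start = week
            run_len += 1
            if run_len >= gap:
                return run_start
    return None

log = [True, True] + [False] * 24   # answered weeks 1-2, then silent
print(time_to_dropout(log))         # -> 3
print(round(uar(log), 1))           # -> 7.7
```

A fully adherent 26-week log (`[True] * 26`) yields a UAR of 100 and no dropout week.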

Patient selection and recruitment procedure

To minimize selection bias and to increase the generalizability of our findings, we consecutively approached all patients with rheumatoid arthritis (RA), psoriatic arthritis (PsA), or ankylosing spondylitis (AS) who had a physical or telephone consultation with their rheumatologist at the outpatient clinic from March 2020 until December 2020. The inclusion criteria were: (1) diagnosed with RA, PsA, or AS according to their rheumatologist, (2) having an Android- or iOS-based smartphone, and (3) able to speak, read, and write Dutch. The exclusion criteria were: (1) not having an e-mail address and (2) previous participation in a trial in which the “MijnReuma Reade” app was used. Due to local COVID-19 regulations, the study was performed without any face-to-face contact between patients and researchers. Patients were invited by e-mail, including detailed information about the study. One week later, the researcher phoned the patients to ask whether they were interested in participating and to answer any questions about the study. Contact details (phone number and e-mail address) were provided so participants could reach the researchers if they encountered any (technical) problems with the app. If the patient consented, additional instructions were given on how to use the app. Personal log-in credentials, along with links to the iOS App Store and Google Play Store to download the app, were sent to the patient by e-mail. When patients logged in for the first time, they were asked to complete the electronic informed consent form in the app through a checkbox. Patients were only definitively included in the study when both oral and electronic informed consent had been obtained.
This study was performed in accordance with relevant guidelines and regulations, and specifically the legislation of the medical ethics committee of the Vrije Universiteit medisch centrum (VUmc) in Amsterdam, the Netherlands (case number 2019.641), which issued a waiver for this study on 05-11-2019. All methods were carried out in accordance with the Declaration of Helsinki.

Investigated factors

The predefined set of candidate factors could not be based solely on comparable studies investigating adherence to telemonitoring with ePROs in rheumatology, as these are scarce. Therefore, we examined established models describing how users come to accept and use a technology, such as the TAM and UTAUT, and selected candidate factors based on these models [12, 13]. We complemented the list with possibly relevant clinical and sociodemographic factors identified through consensus meetings between JW, BS, and WB. The final set of investigated factors is shown in Table 2.

Table 2 Set of investigated patient and clinical factors

Patient factors

As shown in Table 2, a total of nine patient factors were studied. The Effective Consumer Scale 17 (EC-17) measures the skills and behaviors people need to effectively manage their healthcare and consists of 17 items scored on a 5-point Likert scale ranging from 0 (never) to 4 (always) [22]. Higher scores represent a more effective self-management attitude and behavior. The Perceived Efficacy in Patient-Physician Interaction 5 (PEPPI-5) measures patients’ efficacy in interacting with their physicians with a 5-item questionnaire [23]. Each item is scored on a 5-point Likert scale ranging from 1 (strongly disagree) to 5 (strongly agree). Higher scores represent higher perceived self-efficacy in patient-physician interaction.

No Dutch eHealth or technology literacy measure exists yet. Therefore, we extracted the 10-item smartphone subscale of the Media and Technology Usage and Attitudes Scale (MTUAS) and translated it into Dutch, following the guidelines for translation of questionnaires [24]. The smartphone subscale is validated as an independent questionnaire; higher scores represent higher smartphone usage. The System Usability Scale (SUS) consists of 10 questions measured on a 5-point Likert scale ranging from 1 (strongly disagree) to 5 (strongly agree) [25]. Higher scores represent higher perceived usability of the app; a score above 68 is considered above average.

Clinical factors

A total of six clinical factors were assessed. Disease activity was measured with the Routine Assessment of Patient Index Data 3 (RAPID3), a composite measure containing the function, pain, and patient global scales derived from the MDHAQ [26]. Comorbidity was measured with the Charlson Comorbidity Index (CCI) [27]. Each comorbidity category has an associated weight, and the sum of all weights results in a total score. Higher scores predict higher mortality and higher resource usage; a score of zero means that no comorbidities were found.

Medication adherence was measured with the Compliance Questionnaire Rheumatology (CQR), a five-question self-report medication adherence measure created specifically for patients with rheumatic diseases that discriminates between low and high medication adherence [28].

Sample size

We calculated the required sample size without estimating an effect size, since there was no comparable study from which we could estimate one. A dropout rate of 60% over time was expected, based on the adherence data of a pilot telemonitoring study and preliminary data of an RCT, both performed at Reade Amsterdam [5, 11]. Following the sample size calculation method of Green et al., a minimum of 131 cases was necessary to study 13 factors [29]. We divided the required cases (131) by the expected dropout rate (0.60) and concluded that a minimum of 219 participants was needed.
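The arithmetic behind the recruitment target can be checked in one line; the inputs (131 required cases, 0.60 expected dropout rate) come from the source, and the rounding up to a whole participant is our reading:

```python
import math

required_cases = 131      # minimum dropout cases needed to study 13 factors
expected_dropout = 0.60   # anticipated 6-month dropout rate

participants = math.ceil(required_cases / expected_dropout)
print(participants)       # -> 219
```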

Statistical analysis

Descriptive values were presented as mean and standard deviation (SD) if normally distributed, otherwise the median and interquartile range (IQR) was presented.

For the primary outcome, the association between the combined set of factors and dropout over time was assessed through a multivariable Cox proportional hazards regression analysis. Beforehand, each independent variable was checked for multicollinearity. If multicollinearity was present (variance inflation factor (VIF) > 5), the most relevant factor was chosen, based on literature and clinical expertise, to remain in the multivariable regression model. Results are presented as hazard ratios (HR) with 95% confidence intervals (95% CI). The SUS score was measured at 3 months and could therefore not be included in the Cox regression analysis. Instead, we performed a univariable linear regression analysis to compare the SUS scores of adherent patients with those of patients who dropped out in month 1, months 2–3, and months 4–6.
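The multicollinearity screen (VIF > 5) can be sketched with plain numpy. This illustrates the rule itself, not the authors' SPSS implementation:

```python
import numpy as np

def vif(X):
    """Variance inflation factors for the columns of a design matrix X
    (n samples x p factors): VIF_j = 1 / (1 - R^2_j), where R^2_j comes
    from regressing column j on all other columns plus an intercept."""
    n, p = X.shape
    out = np.empty(p)
    for j in range(p):
        y = X[:, j]
        others = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(others, y, rcond=None)
        resid = y - others @ beta
        r2 = 1.0 - resid.var() / y.var()
        out[j] = 1.0 / (1.0 - r2)   # blows up as R^2 -> 1 (perfect collinearity)
    return out

# Two independent factors give VIFs near 1; adding a near-copy of the
# first factor pushes its VIF (and the copy's) far above the cutoff of 5.
rng = np.random.default_rng(0)
a, b = rng.normal(size=500), rng.normal(size=500)
print(vif(np.column_stack([a, b])))
print(vif(np.column_stack([a, b, a + 0.01 * rng.normal(size=500)])))
```

With p factors this runs p small least-squares fits, which is more than fast enough for the 13 factors considered here.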

For the secondary outcome, the association between the combined set of factors and a higher UAR (more reported ePROs) was assessed through a multivariable logistic generalized estimating equation (GEE) model. The longitudinal GEE analysis corrects for dependent observations within a person, which was necessary because each patient had 26 observations: for each week, the outcome (ePRO reported yes/no) was determined. All factors were entered into the multivariable logistic GEE model, which was justified by the large number of records relative to the number of factors. Results are presented as odds ratios (OR) with 95% CI for reporting more ePROs. All analyses were run in IBM SPSS Statistics V23.

Role of the funding source

This research is investigator-initiated and was funded by Pfizer, Sanofi, Eli Lilly, and Novartis. The funders had no role in the design of this study, nor in its execution, the analysis and interpretation of the data, or the decision to submit results.

Results

Between April 2020 and December 2020, a total of 825 patients were consecutively assessed for eligibility, of whom 220 downloaded the app and gave digital informed consent, see Fig. 1. Two patients withdrew during the study: one moved abroad at week seven, and one indicated at week 11 not having time to continue. Both were included in the analysis and considered dropouts from the moment they withdrew. Three patients did not fill out the baseline questionnaire; data on their medication adherence, smartphone usage, self-management, and patient-physician interaction were therefore missing, and they were excluded from the Cox regression and GEE analyses. We included 99 RA, 81 PsA, and 40 AS patients. The mean age was 54 years (SD 12), 55% were female, and the median RAPID3 disease activity was moderate (3.7), see Table 3.

Fig. 1

Flow chart. Patient selection and flow through the study

Table 3 Baseline characteristics (n = 220)

Time to dropout

A total of 79 patients (36%) continued telemonitoring during the 6-month period, and 141 (64%) dropped out (Fig. 2). The median (IQR) time to dropout was 17 (9–26) weeks. The VIF was < 5.0 for all factors, so collinearity between factors was negligible. Within the set of investigated factors, women had a higher risk of dropout over the 6-month period than men (median time to dropout 15 vs 19 weeks; HR 1.58, 95% CI 1.06–2.36; Table 4). Low medication adherence (median time to dropout 16.5 weeks, IQR 8.5–26) compared to high (18 weeks, IQR 9–26), biological usage (15 weeks, IQR 8–26) compared to csDMARDs (18 weeks, IQR 9–26), a higher education level (16 weeks, IQR 8–26) compared to a lower level (18 weeks, IQR 7–26), and a PsA diagnosis (16 weeks, IQR 8–26) compared to RA (20 weeks, IQR 10–26) were all associated with a small but not statistically significant increase in the risk of dropout over 6 months. Since gender was significantly associated with time to dropout, we post hoc stratified the analysis for men and women separately to assess whether the set of factors associated with time to dropout differed between them. Only small, nonsignificant differences in hazard ratios were found for men and women, see Table 1 and Figure 1 in Additional file 1.

Fig. 2

Proportion of participants who stopped reporting ePROs over time

Table 4 Associations with time to dropout of reporting ePROs

Patients who dropped out in the 1st month and in the 2nd to 3rd month reported significantly lower mean SUS scores than adherent patients (67.6 for month 1 and 71.5 for months 2–3 vs 81.8 for adherent patients, p < 0.001). The mean SUS score of patients who dropped out in months 4–6 was also lower, but this difference was not statistically significant (78.2, p = 0.18).

User activity ratio

The UAR over 6 months was 49%. In the first week, 81% of patients completed the ePRO, which decreased to 39% in the last week. The decline was steepest in the first weeks of the study and leveled off from week 14 onward, see Fig. 3.

Fig. 3

User activity ratio (% completed ePRO) with 95% confidence interval per week during the study

Women had significantly lower odds of completing ePROs than men (OR 0.66, 95% CI 0.43–1.00), see Table 5. Patients with a higher comorbidity index had higher odds of completing ePROs, but this result was not statistically significant. The lower odds of completing ePROs for biological usage, higher educational level, diagnosis, higher smartphone usage, and higher medication adherence were all not statistically significant. Again, since gender was significantly associated with the UAR, we post hoc stratified by gender and repeated the analysis to assess whether the factors were associated with adherence differently for men and women. Within the combined set of factors, there were only small, nonsignificant differences in odds ratios for reporting ePROs between men and women, see Table 2 in Additional file 1.

Table 5 Association of factors with completing more ePROs

Discussion

This study investigated the association of a combined set of patient- and clinical-related factors with (non-)adherence to telemonitoring of disease activity with a weekly ePRO in patients with inflammatory arthritis. We showed that of the 13 investigated factors, only gender was significantly associated with adherence: women had both a higher risk of dropout over 6 months and lower odds of reporting the weekly ePROs.

The association between gender and adherence has been described in the widely accepted UTAUT model [12]. This model states that genders differ in how they adopt and use new technologies, such as apps for telemonitoring. A difference in usage of a new technology between genders might thus arise if the technology is unintentionally easier to adopt for either men or women. The UTAUT describes that one of these differences is that women tend to rely more on supporting factors than men. Similar studies identified in a recent systematic review actively approached patients when they stopped reporting ePROs to ask whether they needed support, and did not find a significant association between adherence and gender [14]. In our study, the initiative to seek (technical) support for the app was left to the patients themselves, which may have contributed to the discrepancy in adherence between men and women.

Other factors could also account for the conflicting results on gender differences between our study and those found in the systematic review [14]. For example, it is notable that Colls et al. and Jamilloux et al. included predominantly women (81% and 79%) [15, 30], Guzman et al. predominantly men (97%) [31], while Rosen et al. had a small sample of 50 participants consisting predominantly of women (71%) [32]. The studies included in the review may therefore not have had a men/women ratio suitable, given their sample sizes, to establish a significant difference in adherence between men and women, in contrast to our study, which had a more balanced ratio.

Whether gender differences exist in adherence to traditional face-to-face follow-up visits is unknown, as we could not find any literature on this. However, the relationship between adherence and gender is frequently described for medication use, although the results are contradictory. While a systematic review on adherence to biological treatment in patients with inflammatory arthritis (RA, PsA, AS) found that women were in general less adherent than men, a recently performed prospective cohort study in patients with RA showed no significant difference between men and women (OR 0.90, 95% CI 0.44–1.85) [33, 34]. The reasons why gender differences may be present in medication adherence were not investigated.

Due to its quantitative nature, this study was also unable to identify the reasons why women had a higher risk of dropout than men. However, we hypothesize that by increasing the support for a telemonitoring program, we might be able to decrease the observed adherence gap between men and women in our population. Future prospective studies are necessary to corroborate our results and identify why women dropped out sooner. Furthermore, since our results suggest that gender differences may also exist in eHealth in rheumatic care, we advise research groups to investigate potential gender differences in adherence when developing new eHealth interventions such as telemonitoring of disease activity.

Potentially vulnerable patients, such as patients with a lower level of self-management, older age, or a lower education level, did not have an increased risk of dropout over the 6 months in our study. On the contrary, patients with a higher education level had a small (although nonsignificant) increase in the risk of dropout, which we found remarkable. One explanation may be that we could not detect such associations because 50% of our study population consisted of higher-educated patients, and the patients reported a higher baseline self-management (EC-17) score than patients described in the literature [23]. Another explanation may be that education level does not play an important role in filling out ePROs [15], or the finding may result from selection bias, as only 220 of 825 (27%) invited patients participated in the study. Unintended selection bias is frequently observed in eHealth studies in rheumatology. For example, Colls et al. reported a remarkably high education level for their participants: > 80% had attained college or a higher educational level [15]. And Müskens et al. showed that RA patients participating in an eHealth platform tended to be younger and higher educated than patients who did not [35], which seems in line with our study population. Thus, although adherence was not lower for potentially vulnerable patients in our study, their underrepresentation in eHealth studies makes it urgent that future research focuses on how to make eHealth, and telemonitoring specifically, more accessible to all patients.

We found that patients who dropped out in the first 3 months reported a significantly lower perceived usability of the app than adherent patients. A recent study showed that for optimal adoption of apps in rheumatic care, apps should be adjusted to the needs of rheumatic patients and their level of eHealth literacy [36]; a patient-centered design is therefore deemed crucial [37]. Although the app used in this study was designed with patients’ input from the start, the system usability scores indicate that some patients remain unsatisfied with the usability of the app (mean SUS 67.6 for participants who dropped out early vs 81.8 for adherent participants), which may have influenced adherence. Optimization of the app should therefore continue even after implementation, with continuous patient feedback to increase usability, adoption, and adherence. We will perform focus group discussions in our study population to investigate how we can improve the app and overcome perceived barriers to its usage.

There are other limitations to our study that should be noted. Firstly, with our study design we could not determine causality between the investigated factors and (non-)adherence. It is therefore unclear whether an improvement in reported usability scores would lead to higher adherence. Secondly, the generalizability of studies prospectively investigating adherence to telemonitoring, including this study, is limited, since adherence to telemonitoring with ePROs is assumed to be subject to a multitude of interrelated factors that are unlikely to be the same across studies. For example, the tool used to collect ePROs is deemed to influence the number of reported ePROs considerably, yet every research group develops its own tool (app) to connect with its electronic medical records [38]. Our results may therefore differ in other settings, even with optimal internal validity. We countered the influence of inter-factor associations as much as possible by analyzing the factors as a combined set and incorporating as many relevant factors as we could. Still, the limited external validity should be considered when extrapolating the results to other settings.

Conclusion

Over 60% of the patients who telemonitored their disease activity in between visits with a weekly ePRO over a 26-week period dropped out. Women in particular stopped reporting the weekly ePROs sooner than men and had lower odds of reporting them. The reasons why patients become nonadherent, as well as the reasons to adhere to telemonitoring, need to be investigated to improve the adoption of telemonitoring with ePROs in general, and for women specifically.

Availability of data and materials

The datasets used and/or analysed during the current study are available from the corresponding author on reasonable request.

References

  1. Smolen JS, Landewé RBM, Bijlsma JWJ, Burmester GR, Dougados M, Kerschbaumer A, et al. EULAR recommendations for the management of rheumatoid arthritis with synthetic and biological disease-modifying antirheumatic drugs: 2019 update. Ann Rheum Dis. 2020;79(6):685–99.

  2. van Riel PL, Renskers L. The Disease Activity Score (DAS) and the Disease Activity Score using 28 joint counts (DAS28) in the management of rheumatoid arthritis. Clin Exp Rheumatol. 2016;34(5 Suppl 101):S40–4.

  3. Smolen JS, Aletaha D. Scores for all seasons: SDAI and CDAI. Clin Exp Rheumatol. 2014;32(5 Suppl 85):S-75-9.

  4. de Thurah A, Stengaard-Pedersen K, Axelsen M, Fredberg U, Schougaard LMV, Hjollund NHI, et al. Tele-health followup strategy for tight control of disease activity in rheumatoid arthritis: results of a randomized controlled trial. Arthritis Care Res. 2018;70(3):353–60.

  5. Seppen B, Wiegel J, ter Wee M, van Schaardenburg D, Roorda L, Nurmohamed M, Bos W. Smartphone assisted patient initiated care safely reduces outpatient clinic visits in patients with rheumatoid arthritis: results from a randomized controlled trial. Arthritis Rheumatol. 2021;73.

  6. Shaw Y, Courvoisier DS, Scherer A, Ciurea A, Lehmann T, Jaeger VK, et al. Impact of assessing patient-reported outcomes with mobile apps on patient-provider interaction. RMD Open. 2021;7(1):e001566.

  7. Rathbone AL, Clarry L, Prescott J. Assessing the efficacy of mobile health apps using the basic principles of cognitive behavioral therapy: systematic review. J Med Internet Res. 2017;19(11):e399.

  8. Austin L, Sharp CA, van der Veer SN, Machin M, Humphreys J, Mellor P, et al. Providing “the bigger picture”: benefits and feasibility of integrating remote monitoring from smartphones into the electronic health record. Rheumatology. 2020;59(2):367–78.

  9. Bingham CO 3rd, Gaich CL, DeLozier AM, Engstrom KD, Naegeli AN, de Bono S, et al. Use of daily electronic patient-reported outcome (PRO) diaries in randomized controlled trials for rheumatoid arthritis: rationale and implementation. Trials. 2019;20(1):182.

  10. Lee YC, Lu F, Colls J, Luo D, Wang P, Dunlop DD, et al. Outcomes of a mobile app to monitor patient-reported outcomes in rheumatoid arthritis: a randomized controlled trial. Arthritis Rheumatol. 2021;73(8):1421–9.

  11. Seppen BF, Wiegel J, L’Ami MJ, Dos Santos Rico SD, Catarinella FS, Turkstra F, et al. Feasibility of self-monitoring rheumatoid arthritis with a smartphone app: results of two mixed-methods pilot studies. JMIR Formative Res. 2020;4(9):e20165.

  12. Venkatesh V, Morris MG, Davis GB, Davis FD. User acceptance of information technology: toward a unified view. MIS Q. 2003:425–78.

  13. Davis FD. Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Q. 1989;13(3):319–40.

  14. Wiegel J, Seppen B, van der Leeden M, van der Esch M, de Vries R, Bos W. Adherence to telemonitoring by electronic patient-reported outcome measures in patients with chronic diseases: a systematic review. Int J Environ Res Public Health. 2021;18(19):10161.

  15. Colls J, Lee YC, Xu C, Corrigan C, Lu F, Marquez-Grap G, et al. Patient adherence with a smartphone app for patient-reported outcomes in rheumatoid arthritis. Rheumatology. 2021;60(1):108–12.

  16. Renskers L, Rongen-van Dartel SA, Huis AM, van Riel PL. Patients’ experiences regarding self-monitoring of the disease course: an observational pilot study in patients with inflammatory rheumatic diseases at a rheumatology outpatient clinic in The Netherlands. BMJ Open. 2020;10(8):e033321.

  17. Seppen BF, L’Ami MJ, Dos Santos Rico SD, ter Wee MM, Turkstra F, Roorda LD, et al. A smartphone app for self-monitoring of rheumatoid arthritis disease activity to assist patient-initiated care: protocol for a randomized controlled trial. JMIR Res Protoc. 2020;9(2):e15105.

  18. Pincus T. Electronic multidimensional health assessment questionnaire (eMDHAQ): past, present and future of a proposed single data management system for clinical care, research, quality improvement, and monitoring of long-term outcomes. Clin Exp Rheumatol. 2016;34(5 Suppl 101):S17-s33.

    PubMed  Google Scholar 

  19. ISO/IEC 27001:2013. Available from: https://www.iso.org/standard/54534.html.

  20. O’Brien HL, Toms EG. What is user engagement? A conceptual framework for defining user engagement with technology. J Am Soc Inform Sci Technol. 2008;59(6):938–55.

    Article  Google Scholar 

  21. Böhm AK, Jensen ML, Sørensen MR, Stargardt T. Real-world evidence of user engagement with mobile health for diabetes management: longitudinal observational study. JMIR Mhealth Uhealth. 2020;8(11):e22212.

    Article  PubMed  PubMed Central  Google Scholar 

  22. Santesso N, Rader T, Wells GA, O’Connor AM, Brooks PM, Driedger M, et al. Responsiveness of the effective consumer scale (EC-17). J Rheumatol. 2009;36(9):2087–91.

    Article  PubMed  Google Scholar 

  23. ten Klooster PM, Oostveen JC, Zandbelt LC, Taal E, Drossaert CH, Harmsen EJ, et al. Further validation of the 5-item Perceived Efficacy in Patient-Physician Interactions (PEPPI-5) scale in patients with osteoarthritis. Patient Educ Couns. 2012;87(1):125–30.

    Article  PubMed  Google Scholar 

  24. Rosen LD, Whaling K, Carrier LM, Cheever NA, Rokkum J. The Media and Technology Usage and Attitudes Scale: An empirical investigation. Comput Human Behav. 2013;29(6):2501–11.

    Article  CAS  PubMed  PubMed Central  Google Scholar 

  25. Bangor A, Kortum PT, Miller JT. An empirical evaluation of the system usability scale. Int J Hum Comput Interact. 2008;24(6):574–94.

    Article  Google Scholar 

  26. Pincus T, Castrejon I, Riad M, Obreja E, Lewis C, Krogh NS. Reliability, feasibility, and patient acceptance of an electronic version of a multidimensional health assessment questionnaire for routine rheumatology care: validation and patient preference study. JMIR Formative Res. 2020;4(5):e15815.

    Article  Google Scholar 

  27. Charlson ME, Pompei P, Ales KL, MacKenzie CR. A new method of classifying prognostic comorbidity in longitudinal studies: development and validation. J Chronic Dis. 1987;40(5):373–83.

    Article  CAS  PubMed  Google Scholar 

  28. Hughes LD, Done J, Young A. A 5 item version of the Compliance Questionnaire for Rheumatology (CQR5) successfully identifies low adherence to DMARDs. BMC Musculoskelet Disord. 2013;14:286.

    Article  PubMed  PubMed Central  Google Scholar 

  29. Green SB. How many subjects does it take to do a regression analysis. Multivariate Behav Res. 1991;26(3):499–510.

    Article  CAS  PubMed  Google Scholar 

  30. Jamilloux Y, Sarabi M, Kerever S, Boussely N, le Sidaner A, Valgueblasse V, et al. Adherence to online monitoring of patient-reported outcomes by patients with chronic inflammatory diseases: a feasibility study. Lupus. 2015;24(13):1429–36.

    Article  CAS  PubMed  Google Scholar 

  31. Guzman-Clark JR, van Servellen G, Chang B, Mentes J, Hahn TJ. Predictors and outcomes of early adherence to the use of a home telehealth device by older veterans with heart failure. Telemed J E-Health. 2013;19(3):217–23.

    Article  PubMed  Google Scholar 

  32. Rosen D, McCall JD, Primack BA. Telehealth protocol to prevent readmission among high-risk patients with congestive heart failure. Am J Med. 2013;130(11):1326–30.

    Article  Google Scholar 

  33. López-González R, León L, Loza E, Redondo M, Garcia de Yébenes MJ, Carmona L. Adherence to biologic therapies and associated factors in rheumatoid arthritis, spondyloarthritis and psoriatic arthritis: a systematic literature review. Clin Exp Rheumatol. 2015;33(4):559–69.

    PubMed  Google Scholar 

  34. Balsa A, de Yébenes MJG, Carmona L. Multilevel factors predict medication adherence in rheumatoid arthritis: a 6-month cohort study. Ann Rheum Dis. 2022;81(3):327–34.

    Article  PubMed  Google Scholar 

  35. Müskens WD, Rongen-van Dartel SAA, Vogel C, Huis A, Adang EMM, van Riel P. Telemedicine in the management of rheumatoid arthritis: maintaining disease control with less health-care utilization. Rheumatol Adv Pract. 2021;5(1):079.

    Google Scholar 

  36. Knitza J, Simon D, Lambrecht A, Raab C, Tascilar K, Hagen M, et al. Mobile health usage, preferences, barriers, and ehealth literacy in rheumatology: patient survey study. JMIR Mhealth Uhealth. 2020;8(8):e19661.

    Article  PubMed  PubMed Central  Google Scholar 

  37. Solomon DH, Rudin RS. Digital health technologies: opportunities and challenges in rheumatology. Nat Rev Rheumatol. 2020;16(9):525–35.

    Article  PubMed  Google Scholar 

  38. Kelders SM, Kok RN, Ossebaard HC, Van Gemert-Pijnen JE. Persuasive system design does matter: a systematic review of adherence to web-based interventions. J Med Internet Res. 2012;14(6):e152.

    Article  PubMed  PubMed Central  Google Scholar 

Download references

Acknowledgements

Not applicable.

Funding

This work was supported by Pfizer, Sanofi, Eli Lilly, and Novartis. The funders had no role in the design of the study, nor in its execution, analysis, interpretation of the data, or the decision to submit the results.

Author information

Authors and Affiliations

Authors

Contributions

JW, BS, WHB, and MMtW wrote and edited the manuscript. JW was responsible for study procedures, including obtaining informed consent. MMtW assisted JW with the statistical analysis. All authors provided feedback on the manuscript, and all authors read and approved the final manuscript.

Corresponding author

Correspondence to J. Wiegel.

Ethics declarations

Ethics approval and consent to participate

This study was performed in accordance with applicable legislation and was approved by the medical ethics committee of the Vrije Universiteit medisch centrum (VUmc) in Amsterdam, the Netherlands (case number 2019.641). The committee issued a waiver for written informed consent for this study and approved the use of electronic informed consent as described above.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1: Figure 1.

Proportion of participants who stopped reporting ePRO’s over time, split by gender. Table 1. Hazard ratios for dropout, stratified by gender. Table 2. Odds ratios for completing an electronic patient-reported outcome, stratified by gender.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article

Cite this article

Wiegel, J., Seppen, B.F., Nurmohamed, M.T. et al. Who stop telemonitoring disease activity and who adhere: a prospective cohort study of patients with inflammatory arthritis. BMC Rheumatol 6, 73 (2022). https://doi.org/10.1186/s41927-022-00303-w


Keywords

  • Adherence
  • Telemonitoring
  • Telehealth
  • ePRO’s
  • Inflammatory arthritis
  • Patient reported outcomes
  • Dropout