Virtual reality for stroke rehabilitation. Laver, Kate E; Lange, Belinda; George, Stacey ...
Cochrane Database of Systematic Reviews, 11/2017, Volume 11
Journal Article
Peer reviewed
Open access
Virtual reality and interactive video gaming have emerged as recent treatment approaches in stroke rehabilitation, with commercial gaming consoles in particular being rapidly adopted in clinical settings. This is an update of a Cochrane Review published first in 2011 and again in 2015.
Primary objective: to determine the efficacy of virtual reality compared with an alternative intervention or no intervention on upper limb function and activity. Secondary objectives: to determine the efficacy of virtual reality compared with an alternative intervention or no intervention on: gait and balance, global motor function, cognitive function, activity limitation, participation restriction, quality of life, and adverse events.
We searched the Cochrane Stroke Group Trials Register (April 2017), CENTRAL, MEDLINE, Embase, and seven additional databases. We also searched trials registries and reference lists.
Randomised and quasi-randomised trials of virtual reality ("an advanced form of human-computer interface that allows the user to 'interact' with and become 'immersed' in a computer-generated environment in a naturalistic fashion") in adults after stroke. The primary outcome of interest was upper limb function and activity. Secondary outcomes included gait and balance and global motor function.
Two review authors independently selected trials based on pre-defined inclusion criteria, extracted data, and assessed risk of bias. A third review author moderated disagreements when required. The review authors contacted investigators to obtain missing information.
We included 72 trials that involved 2470 participants. This review includes 35 new studies in addition to the studies included in the previous version of this review. Study sample sizes were generally small and interventions varied in terms of both the goals of treatment and the virtual reality devices used. The risk of bias present in many studies was unclear due to poor reporting. Thus, while there are a large number of randomised controlled trials, the evidence remains mostly low quality when rated using the GRADE system. Control groups usually received no intervention or therapy based on a standard-care approach.
Results were not statistically significant for upper limb function (standardised mean difference (SMD) 0.07, 95% confidence interval (CI) -0.05 to 0.20; 22 studies, 1038 participants, low-quality evidence) when comparing virtual reality with conventional therapy. However, when virtual reality was used in addition to usual care (providing a higher dose of therapy for the intervention group), there was a statistically significant difference between groups (SMD 0.49, 95% CI 0.21 to 0.77; 10 studies, 210 participants, low-quality evidence).
When compared with conventional therapy approaches, there were no statistically significant effects for gait speed or balance. Results were statistically significant for the activities of daily living (ADL) outcome (SMD 0.25, 95% CI 0.06 to 0.43; 10 studies, 466 participants, moderate-quality evidence); however, we were unable to pool results for cognitive function, participation restriction, or quality of life. Twenty-three studies reported that they monitored for adverse events; across these studies there were few adverse events, and those reported were relatively mild.
We found evidence that the use of virtual reality and interactive video gaming was not more beneficial than conventional therapy approaches in improving upper limb function. Virtual reality may be beneficial in improving upper limb function and activities of daily living when used as an adjunct to usual care (to increase overall therapy time). There was insufficient evidence to reach conclusions about the effect of virtual reality and interactive video gaming on gait speed, balance, participation, or quality of life. This review found that time since stroke onset, severity of impairment, and type of device (commercial or customised) did not strongly influence outcomes. There was a trend suggesting that a higher dose (more than 15 hours of total intervention) was preferable, as were customised virtual reality programs; however, these findings were not statistically significant.
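The review's pooled effect sizes (an SMD with a 95% CI) can be illustrated with a minimal sketch. This is not the review's actual meta-analysis code; the formulas (Hedges' g with small-sample correction, inverse-variance fixed-effect pooling) are standard, but the function names and any numbers used with them are generic illustrations.

```python
import math

def hedges_g(mean_t, mean_c, sd_t, sd_c, n_t, n_c):
    """Standardised mean difference (Hedges' g) with small-sample correction."""
    # Pooled standard deviation across treatment and control groups
    sp = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2))
    d = (mean_t - mean_c) / sp
    # Small-sample correction factor J = 1 - 3 / (4*df - 1), df = n_t + n_c - 2
    j = 1 - 3 / (4 * (n_t + n_c) - 9)
    return d * j

def pool_fixed_effect(effects, variances):
    """Inverse-variance fixed-effect pooled SMD and its 95% CI."""
    weights = [1 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1 / sum(weights))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)
```

A CI that crosses zero, as in the head-to-head comparison above, corresponds to a non-significant pooled difference; a CI entirely above zero, as in the adjunct-to-usual-care comparison, corresponds to a significant one.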
This study examined how awareness of diagnostic label impacted self-reported quality of life (QOL) in persons with varying degrees of cognitive impairment.
Older adults (n = 259) with normal cognition, Mild Cognitive Impairment (MCI), or mild Alzheimer's disease dementia (AD) completed tests of cognition and self-report questionnaires that assessed diagnosis awareness and multiple domains of QOL: cognitive problems, activities of daily living, physical functioning, mental wellbeing, and perceptions of one's daily life. We compared measures of QOL by cognitive performance, diagnosis awareness, and diagnostic group.
Persons with MCI or AD who were aware of their diagnosis reported lower average satisfaction with daily life (QOL-AD), basic functioning (BADL Scale), and physical wellbeing (SF-12 PCS), and more difficulties in daily life (DEM-QOL) than those who were unaware (all p ≤ .007). Controlling for gender, those expecting their condition to worsen over time reported greater depression (GDS), higher stress (PSS), lower quality of daily life (QOL-AD, DEM-QOL), and more cognitive difficulties (CDS) compared to others (all p < .05).
Persons aware of their diagnostic label (either MCI or AD) and its prognosis report lower QOL than those unaware of these facts about themselves. These relationships are independent of the severity of cognitive impairment.
Introduction/Aims
The CHAMPION MG study demonstrated that ravulizumab significantly improved Myasthenia Gravis‐Activities of Daily Living (MG‐ADL) and Quantitative Myasthenia Gravis (QMG) total scores versus placebo in adults with acetylcholine receptor antibody‐positive generalized myasthenia gravis (AChR+ gMG). This post hoc analysis aimed to assess these outcomes by time from MG diagnosis.
Methods
Changes from baseline to week 26 in MG‐ADL and QMG total scores were analyzed by time from MG diagnosis to study entry (≤2 vs. >2 years). Within each subgroup, least‐squares (LS) mean changes for ravulizumab and placebo were compared using mixed models for repeated measures.
Results
In ravulizumab‐treated patients, differences in LS mean (standard error of the mean) changes from baseline to week 26 were not statistically significant in the ≤2‐years subgroup versus the >2‐years subgroup for MG‐ADL (−4.3 [0.70] vs. −2.9 [0.37]; p = .0511) or QMG (−4.3 [0.94] vs. −2.5 [0.50]; p = .0822) scores. No clear trends were observed in the placebo group. LS mean changes from baseline were significantly greater for ravulizumab versus placebo in both the ≤2 and >2 years from diagnosis subgroups for MG‐ADL and QMG scores (all p < .05). The difference in treatment effect between the ≤2‐years and >2‐years subgroups was not statistically significant. No clinically meaningful between‐subgroup differences in treatment‐emergent adverse events were observed in ravulizumab‐treated patients.
Discussion
Ravulizumab treatment improved clinical outcomes for patients with AChR+ gMG regardless of time from diagnosis. A numerical trend was observed favoring greater treatment effect with earlier versus later treatment after diagnosis. Further studies are required for confirmation.
Background
Impairment in instrumental activities of daily living (IADL) leads to early loss of productivity and adds significant burden to caregivers. Executive dysfunction is thought to be an important contributor to functional impairment. The objective of this study was to investigate the relationship between executive function and IADL in a large cohort of well-characterized normal older controls, subjects with mild cognitive impairment (MCI), and patients with mild Alzheimer's disease, separately as well as across the entire sample, while accounting for demographic, cognitive, and behavioral factors.
Methods
Subjects with baseline clinical datasets (n = 793) from the Alzheimer's Disease Neuroimaging Initiative study (228 normal older controls, 387 MCI, 178 Alzheimer's disease) were included in the analysis. A multiple regression model was used to assess the relationship between executive function and IADL.
Results
A multiple regression model including diagnosis, global cognitive impairment, memory performance, and other covariates demonstrated a significant relationship between executive dysfunction and IADL impairment across all subjects (R² = .60, P < .0001 for model; Digit Symbol, partial β = −.044, P = .005; Trail Making Test B–A, quadratic relation, P = .01). Similarly, an analysis using MCI subjects only yielded a significant relationship (R² = .16, P < .0001 for model; Digit Symbol, partial β = −.08, P = .001).
Conclusions
These results suggest that executive dysfunction is a key contributor to impairment in IADL. This relationship was evident even after accounting for degree of memory deficit across the continuum of cognitive impairment and dementia.
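The key analysis here is a multiple regression of IADL on executive function while controlling for memory and other covariates, so that the executive coefficient is a partial association. A minimal sketch of that idea on synthetic data (this is not the ADNI dataset; all variable names, coefficients, and noise levels are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Synthetic, illustrative scores only: memory and executive are correlated,
# and IADL impairment is driven by BOTH domains plus noise.
memory = rng.normal(size=n)
executive = 0.4 * memory + rng.normal(scale=0.9, size=n)
iadl = 1.0 + 0.3 * memory + 0.5 * executive + rng.normal(scale=0.5, size=n)

# Design matrix with intercept; ordinary least squares via lstsq.
# Because memory is in the model, b_executive is a *partial* coefficient:
# the association of executive function with IADL holding memory fixed.
X = np.column_stack([np.ones(n), memory, executive])
beta, *_ = np.linalg.lstsq(X, iadl, rcond=None)
intercept, b_memory, b_executive = beta
```

With enough data the fitted partial coefficients recover the generating values (about 0.3 for memory and 0.5 for executive), which is the sense in which the study can say executive dysfunction predicts IADL "even after accounting for degree of memory deficit".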
Frailty is one of the greatest challenges facing our aging population, as it can lead to adverse outcomes such as institutionalization, hospitalization, and mortality. However, the factors that are associated with frailty are poorly understood. We performed a systematic review of longitudinal studies in order to identify the sociodemographic, physical, biological, lifestyle-related, and psychological risk or protective factors that are associated with frailty among community-dwelling older adults.
A systematic literature search was conducted in the following databases in order to identify studies that assessed the factors associated with frailty among community-dwelling older adults: Embase, Medline Ovid, Web of Science, Cochrane, PsycINFO Ovid, CINAHL EBSCOhost, and Google Scholar. Studies were selected if they included a longitudinal design, focused on community-dwelling older adults aged 60 years and older, and used a tool to assess frailty. The methodological quality of each study was assessed using the Quality of Reporting of Observational Longitudinal Research checklist.
Twenty-three studies were included. Significant associations were reported between the following types of factors and frailty: sociodemographic factors (7/7 studies), physical factors (5/6 studies), biological factors (5/7 studies), lifestyle factors (11/13 studies), and psychological factors (7/8 studies). Significant sociodemographic factors included older age, ethnic background, neighborhood, and access to private insurance or Medicare; significant physical factors included obesity and activities of daily living (ADL) functional status; significant biological factors included serum uric acid; significant lifestyle factors included a higher Diet Quality Index International (DQI) score, higher fruit/vegetable consumption, and a higher tertile of all measures of habitual dietary resveratrol exposure; and significant psychological factors included depressive symptoms.
A broad range of sociodemographic, physical, biological, lifestyle, and psychological factors show a longitudinal association with frailty. These factors should be considered when developing interventions aimed at preventing and/or reducing the burden associated with frailty among community-dwelling older adults.
Lewinnek's recommendation for orienting the cup in THA is criticized because it involves a static assessment of the safe zone and because it does not consider stem geometry. A revised concept of the safe zone should consider those factors, but to our knowledge, this has not been assessed.
(1) To determine the shape, size, and location of target zones for combined cup and stem orientation for a straight stem/hemispheric cup THA that maximize the impingement-free ROM; and (2) to determine whether and how these implant positions change as stem anteversion, neck-shaft angle, prosthetic head size, and target ranges of movement are varied.
A parametric three-dimensional computer-assisted design model of a straight stem/hemispheric cup hip prosthesis was created, its design parameters were modified systematically, and each prosthesis model was implanted virtually at predefined component orientations. Functional component orientation referenced to body planes was used: cups were abducted from 20° to 70° and anteverted from −10° to 40°. Stems were rotated from −10° to 40° of anteversion, neck-shaft angles varied from 115° to 143°, and head sizes varied from 28 to 40 mm. Hip movements up to the point of prosthetic impingement were tested, including simple flexion/extension, internal/external rotation, ab/adduction, combinations of these, and activities of daily living known to trigger dislocation. For each combination of parameters, the impingement-free combined target zone was determined; maximizing the size of the combined target zone was the optimization criterion.
The combined target zones for impingement-free cup orientation had polygonal boundaries. Their size and position in the diagram changed with stem anteversion, neck-shaft angle, head size, and target ROM. The largest target zones occurred at neck-shaft angles from 125° to 127°, stem anteversions from 10° to 20°, and radiographic cup anteversions between 17° and 25°. Cup anteversion and stem anteversion were inversely and linearly correlated, supporting the combined-anteversion concept. The range of impingement-free cup inclinations depended on head size, stem anteversion, and neck-shaft angle. For a 127° neck-shaft angle, the lowest cup inclinations that fell within the target zone were 42° for the 28-mm head and 35° for the 40-mm head. Cup anteversion and combined version depended on neck-shaft angle: for a 32-mm head, cup anteversion was 6° for a 115° neck-shaft angle and 25° for a 135° neck-shaft angle, and combined version was 15° and 34°, respectively.
The shape, size, and location of the combined target zones were dependent on design and implantation parameters of both components. Changing the prosthesis design or changing implantation parameters also changed the combined target zone. A maximized combined target zone was found. It is mandatory to consider both components to determine the accurate impingement-free prosthetic ROM in THA.
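Because the combined target zones have polygonal boundaries in the cup inclination/anteversion plane, checking whether a planned component orientation falls inside a given zone reduces to a point-in-polygon test. A minimal sketch using the standard ray-casting algorithm; the rectangular zone and its vertex coordinates below are invented for illustration, not taken from the study:

```python
def in_target_zone(point, polygon):
    """Ray-casting test: does (inclination, anteversion) lie inside the
    polygonal combined target zone? polygon is a list of (x, y) vertices."""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count edges crossed by a horizontal ray cast from the point
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Hypothetical zone: inclination 35-55 deg, anteversion 17-25 deg
zone = [(35, 17), (55, 17), (55, 25), (35, 25)]
print(in_target_zone((45, 20), zone))  # inside the zone
print(in_target_zone((30, 20), zone))  # outside the zone
```

The study's actual zones vary with stem anteversion, neck-shaft angle, and head size, so in practice one would look up (or recompute) the polygon for the specific implant configuration before running such a test.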
This study accurately defines the hypothetical impingement-free, design-specific component orientation in THA. Translating it into clinical precision may be a task for navigation and/or robotics, but this is speculative and, as of now, unproven.
: In 2018, the American Physical Therapy Association (APTA) published a clinical guideline for adults with neurological conditions, which included recommendations for the Five-Repetition Sit-to-Stand test (5STSt). According to the APTA, a standard-height chair should be used, but there is no recommendation regarding seat depth. In addition, the APTA recommended the use of one trial of the test, based on expert opinion.
: (1) Compare the 5STSt scores of patients post-stroke and healthy matched controls using two types of chairs (one standardized and one adjusted to the individual's anthropometric characteristics); and (2) verify whether different numbers of trials affect the 5STSt scores.
: Eighteen patients post-stroke and 18 healthy-matched controls performed three trials of the 5STSt for each type of chair. ANOVA was used for analysis (α = 0.05).
: No significant interaction between groups and chairs was found. Patients post-stroke showed worse 5STSt performance with both chairs compared to the healthy controls (p = .001). In both groups, 5STSt scores were lower when using a standardized chair than an adjusted chair (p < .003), and different numbers of trials provided similar 5STSt scores (0.44 ≤ p ≤ 0.98).
: The 5STSt scores were affected by the physical characteristics of the chair, and an adjusted chair should be used. The APTA recommendation for one trial of the 5STSt is supported by the present results.
Objective
Older adults make up the fastest growing segment of the population, and disability rates increase with age. There is much debate whether later born cohorts of 85-year-olds will face the same disability rates as earlier born cohorts. This study aimed to examine ADL and IADL disability in three birth cohorts of Swedish 85-year-olds born three decades apart, examined in 1986–87, 2008–10 and 2015–16, as well as potential factors associated with ADL and IADL disability in these birth cohorts.
Methods
Systematically selected population-based birth cohorts of 85-year-olds (n = 1,551) from the Gothenburg H70 Birth Cohort studies, Sweden, born in 1901–02 (n = 494), 1923–24 (n = 571), and 1930 (n = 486), were examined with identical methods. Disability was defined as a need for assistance in any ADL/IADL activity.
Results
ADL/IADL disability decreased between cohorts in both men and women (from 76.7% in 1986–87, to 58.4% in 2008–10, and 48.4% in 2015–16; P for trend < .001). Factors associated with ADL/IADL disability varied between cohorts, although dementia and depression increased the odds of disability in all three birth cohorts.
Conclusion
Later born cohorts of 85-year-olds face less ADL/IADL disability compared to earlier born cohorts. As disability poses a significant financial burden on healthcare services, our findings might contribute to a more positive view on global ageing and the demographic challenges ahead. However, it might also be that in later born cohorts, ADL/IADL disability affects people at later ages, but due to increased longevity, the total number of years in late-life with a functional disability will remain the same.
Criteria for mild cognitive impairment (MCI) consider impairment in instrumental activities of daily living (IADL) as exclusionary, but cross-sectional studies suggest that some high-level functional deficits are present in MCI. This longitudinal study examines informant-rated IADL in MCI, compared with cognitively normal (CN) older individuals, and explores whether functional abilities, particularly those with high cognitive demand, are predictors of MCI and dementia over a 2-year period in individuals who were CN at baseline.
A sample of 602 non-demented community dwelling individuals (375 CN and 227 with MCI) aged 70-90 years underwent baseline and 24-month assessments that included cognitive and medical assessments and an interview with a knowledgeable informant on functional abilities with the Bayer Activities of Daily Living Scale.
Significantly more deficits in informant-reported IADL with high cognitive demand were present in MCI compared with CN individuals at baseline and 2-year follow-up. Functional ability in CN individuals at baseline, particularly in activities with high cognitive demand, predicted MCI and dementia at follow-up. Difficulties with highly cognitively demanding activities specifically predicted amnestic MCI but not non-amnestic MCI whereas those with low cognitive demand did not predict MCI or dementia. Age, depressive symptoms, cardiovascular risk factors and the sex of the informant did not contribute to the prediction.
IADL are affected in individuals with MCI, and IADL with a high cognitive demand show impairment predating the diagnosis of MCI. Subtle cognitive impairment is therefore likely to be a major hidden burden in society.