Psychological Stress and Intraocular Pressure in Glaucoma: A Randomized Controlled Trial
Ferreira NS, Costa VP, Miranda JF, Cintra LO, Barbosa LS, Barbosa da Silva MG, Abreu NA, Abe RY. Ophthalmol Glaucoma. 2024;7(6):518-530. doi:10.1016/j.ogla.2024.07.004
Question
What is the effect of psychological stress on intraocular pressure in glaucoma patients?
Background/Summary of Findings
Thirty-nine primary open-angle glaucoma patients were randomized to either the stress group (n = 18) or the control group (n = 21). All participants had 3 baseline intraocular pressure measurements performed between 8:00 am and 2:00 pm before the study. The stress group underwent the Trier Social Stress Test, and their response was measured via levels of salivary cortisol, salivary amylase, intraocular pressure, mean arterial pressure, and heart rate before, immediately after, and 40 minutes after the Trier Social Stress Test.
The Trier Social Stress Test incorporates social evaluation and unpredictability by having subjects speak in front of an unresponsive audience and complete a surprise math test. Subjects were told that they needed to prepare a 5-minute speech describing why they would be a good candidate for a job at the hospital, that the speech would be videotaped and reviewed by a panel of judges trained in public speaking, and that they had 10 minutes to prepare. During the speech, if the participant stopped talking, he/she was allowed to remain silent for 20 seconds; if the subject did not resume speaking, he/she was told, “you still have time remaining.” At the end of the 5-minute speech, the participant was asked to sequentially subtract the number 13 from 1022, reporting each answer aloud. If a mistake was made, the subject was told he/she was incorrect and was asked to start over from 1022. The Trier Social Stress Test is widely used in stress laboratories worldwide and is the current gold standard in human experimental stress research. All interviewers wore a lab coat during the Trier Social Stress Test to increase stress for subjects.
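The serial-subtraction component is simple to state but hard to perform under evaluation. As a minimal illustrative sketch (the function name and scoring logic below are hypothetical, not taken from the study protocol), the task's restart rule can be modeled as:

```python
def serial_subtraction_restarts(reported, start=1022, step=13):
    """Count restarts in a Trier-style serial subtraction run.

    `reported` holds the answers spoken aloud. A correct answer is the
    previous expected value minus `step`; any mistake sends the subject
    back to the beginning (start - step). Hypothetical helper, for
    illustration only.
    """
    restarts = 0
    expected = start - step           # first correct answer: 1022 - 13 = 1009
    for answer in reported:
        if answer == expected:
            expected -= step          # correct: keep descending
        else:
            restarts += 1             # mistake: start over from 1022
            expected = start - step
    return restarts

# Example: third answer is wrong (996 - 13 = 983, not 982), forcing one restart
restarts = serial_subtraction_restarts([1009, 996, 982, 1009])
```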
A significant increase in intraocular pressure (3.8 mm Hg in the right eye, 4.1 mm Hg in the left eye) was observed when intraocular pressure during the Trier Social Stress Test was compared with pretrial levels. Overall, 61.1% of patients in the Trier Social Stress Test group showed an intraocular pressure increase greater than 4 mm Hg. During the recovery period, the mean intraocular pressure declined significantly. Salivary cortisol, salivary amylase, mean arterial pressure, and heart rate also increased significantly after the Trier Social Stress Test.
Clinical Value/Implications
This is the first randomized controlled trial to investigate the relationship between psychological stress and intraocular pressure in patients already diagnosed with glaucoma. The poststress intraocular pressure elevation of approximately 4 mm Hg found in the study was both statistically significant and clinically relevant to glaucoma management. A relationship between intraocular pressure and stress has also been seen in past studies evaluating the intraocular pressure–lowering effect of antistress therapies, such as meditation. Although these results may not impact preferred practice patterns for glaucoma until further trials are performed, they illustrate the multifactorial risk factors and considerations providers need to be aware of in the management of their glaucoma patients.
Oral Antioxidant and Lutein/Zeaxanthin Supplements Slow Geographic Atrophy Progression to the Fovea in Age-Related Macular Degeneration
Keenan TDL, Agrón E, Keane PA, Domalpally A, Chew EY. Ophthalmology. 2025;132(1):14-29. doi:10.1016/j.ophtha.2024.07.014
Question
Does oral micronutrient supplementation slow geographic atrophy progression in age-related macular degeneration?
Background/Summary of Findings
A total of 392 eyes of 318 participants with geographic atrophy in the Age-Related Eye Disease Study and 1210 eyes of 891 participants with geographic atrophy in the Age-Related Eye Disease Study 2 were included in the study. Given how devastating geographic atrophy can be when the central vision is impacted, therapeutic approaches that could slow progression to this area would be highly valuable to individuals with macular degeneration.
Participants were randomized to 4 study treatments:
- Lutein plus zeaxanthin
- Docosahexaenoic acid plus eicosapentaenoic acid
- Lutein/zeaxanthin and docosahexaenoic acid/eicosapentaenoic acid
- Placebo
In addition, all participants were offered the original Age-Related Eye Disease Study formulation. Those who agreed to take the Age-Related Eye Disease Study formulation consented to a second randomization to receive 1 of 4 alternative Age-Related Eye Disease Study formulations:
- Original formulation
- Original formulation without beta carotene
- Original formulation with lower zinc (25 mg instead of 80 mg)
- Original formulation without beta carotene and with lower zinc
After a mean follow-up of 3 years, geographic atrophy progression toward the central macula was significantly slower in eyes of participants randomized to antioxidants versus no antioxidants. In Age-Related Eye Disease Study eyes with noncentral geographic atrophy, progression toward the central macula was slower with randomization to antioxidants versus none, with a difference of 36%. In Age-Related Eye Disease Study 2 eyes with noncentral geographic atrophy, among participants assigned to Age-Related Eye Disease Study antioxidants without beta carotene, progression was significantly slower with randomization to lutein/zeaxanthin versus none, with a difference of 35%.
Oral micronutrient supplementation slowed geographic atrophy progression toward the central macula. Specifically, the effects of lutein/zeaxanthin and vitamins C and E appear to be complementary and may enhance the natural phenomenon of foveal sparing from geographic atrophy in age-related macular degeneration.
Clinical Value/Implications
At present, no specific recommendations are provided by the American Academy of Ophthalmology Age-Related Macular Degeneration Preferred Practice Pattern for patients with bilateral geographic atrophy. Previously, neither Age-Related Eye Disease Study nor Age-Related Eye Disease Study 2 trials detected any benefit to vitamin supplementation once participants had developed late-stage age-related macular degeneration.
The study findings support the continued use of Age-Related Eye Disease Study 2 supplements by people with noncentral geographic atrophy. This would have the potential advantage of convenience and simplicity as patients could continue to take the same supplement before and after progression from intermediate age-related macular degeneration to geographic atrophy. Given that there are few options for individuals with late-stage dry age-related macular degeneration to restore their vision, micronutrient supplementation is a simple step that can slow down disease progression, even for those with late disease.
Social Factors Associated With the Risk of Glaucoma Suspect Conversion to Glaucoma: Analysis of the Nationwide All of Us Program
Wu JH, Halfpenny W, Bu J, Brar M, Weinreb RN, Baxter SL. Ophthalmol Glaucoma. 2024;7(6):551-562. doi:10.1016/j.ogla.2024.06.007.
Question
Are social factors associated with the risk of glaucoma suspects converting to open-angle glaucoma?
Background/Summary of Findings
Using the nationwide All of Us database, known for being enriched with underrepresented minorities, participants aged 18 years and older with a diagnosis of “glaucoma suspect,” “open-angle with borderline intraocular pressure,” “open-angle with borderline findings, low risk,” or “open-angle with borderline findings, high risk” were selected. Among these participants, the authors identified patients who converted to a glaucoma diagnosis within 5 years. A diagnosis of “open-angle glaucoma,” which included both primary and secondary forms of glaucoma, was used to determine conversion in the group of glaucoma suspects.
Overall, 5274 glaucoma suspect patients were identified and 786 (14.9%) converted to open-angle glaucoma within a 5-year period.
The factors that were associated with greater hazard of conversion to open-angle glaucoma are as follows:
- Black/African American race (hazard ratio = 1.70)
- Male gender (hazard ratio = 1.30)
- Older age (hazard ratio = 1.17)
- History of intraocular surgery (hazard ratio = 1.60)
- More reasons for delayed health care access (hazard ratio = 2.27)
- No history of recreational drug use (hazard ratio = 1.23)
Being employed was associated with a smaller hazard of conversion to open-angle glaucoma.
The primary reasons for delayed or missed health care access were related to cost of care. There was a greater proportion of conversion cases in individuals with lower annual income or inability to afford health care.
Clinical Value/Implications
Knowledge of significant risk factors for conversion of glaucoma suspects to open-angle glaucoma is essential for providing comprehensive and personalized care to our patients. Individuals identified as being at higher risk for conversion may need potentially more aggressive treatment plans, shorter follow-up intervals, and more frequent auxiliary testing. The study findings suggest that older Black males with delayed health care access and history of intraocular surgery are patients to watch more carefully and closely for conversion to open-angle glaucoma.
Concordance Between Self-Reported Visual Difficulty and Objective Visual Impairment: The National Health and Aging Trends Study
Potter T, Almidani L, Diaz M, Varadaraj V, Mihailovic A, Ramulu PY. Ophthalmology. 2024;131(12):1447-1456. doi:10.1016/j.ophtha.2024.06.009
Question
How well does self-reported visual difficulty predict objective visual impairment in older adults, and when the two are discordant, what factors influence that discordance?
Background/Summary of Findings
As average life expectancy increases, the risk and burden of visual impairment increase. National surveys are undertaken to better understand visual impairment prevalence and its related burden, and subsequently to use those data to better allocate resources to the communities most susceptible to visual impairment. Many of these prevalence surveys employ subjective measures, and past studies have often shown discrepancies between self-reported data and data acquired through traditional objective measurements. Objective measurements are more burdensome, requiring training, equipment costs, time, and often travel. In certain groups, there is a tendency to subjectively underreport the level of visual impairment relative to objective measurements, with factors such as age, gender, cultural beliefs, depression, and cognitive ability all influencing this discrepancy. In this study, the authors aimed to examine how well self-reported visual difficulty predicted objective visual impairment and whether self-reports can actually be used as a surrogate measure of visual disability.
Using a cross-sectional analysis of the 2022 National Health and Aging Trends Study, participants reporting blindness or difficulties with distance or near vision were included and characterized as having visual difficulty. Presenting binocular distance visual acuity, near visual acuity, and contrast sensitivity were assessed. Objective vision impairment was defined as having a distance visual acuity of worse than 20/40 and a near visual acuity of worse than 20/40 or having a contrast sensitivity of worse than 1.55 log contrast sensitivity. Receiver operating characteristic analysis was used to compare the performances of self-reported visual difficulty in predicting visual impairment. To investigate factors that influence discordance, the authors limited their sample to adults with visual impairment and used a multivariable logistic regression model to identify factors associated with not reporting visual disability. Similar analyses were performed to explore factors associated with reporting visual disability in adults who did not actually have visual impairment objectively.
A total of 4999 adults were included in the cohort. Visual difficulty achieved an area under the curve of 56.0 (95% CI, 55.2-56.9) in predicting visual impairment, with a sensitivity of 15.8 (95% CI, 14.2-17.5) and specificity of 96.3 (95% CI, 95.5-96.9). Characteristics associated with not reporting visual difficulty in adults with visual impairment included female gender (odds ratio, 0.64 [95% CI, 0.42-0.99]), Hispanic ethnicity (odds ratio, 0.49 [95% CI, 0.31-0.78]), higher income (≥$75 000; odds ratio, 1.99 [95% CI, 1.14-3.45]), 4 or more comorbidities (odds ratio, 0.46 [95% CI, 0.29-0.72]), and depressive symptoms (odds ratio, 0.49 [95% CI, 0.25-0.93]). Factors associated with self-reporting visual difficulty in the absence of visual impairment included Hispanic ethnicity (odds ratio, 2.11 [95% CI, 1.15-3.86]), higher income (≥$75 000; odds ratio, 0.27 [95% CI, 0.12-0.63]), and anxiety symptoms (odds ratio, 3.05 [95% CI, 1.56-5.97]).
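Because a yes/no self-report has a single operating point, its ROC curve reduces to two line segments, and the area under the curve equals the mean of sensitivity and specificity, consistent with the figures above ((15.8 + 96.3) / 2 ≈ 56.0). A minimal sketch, using illustrative counts rather than the study's raw data:

```python
def binary_test_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, and AUC for a single binary (yes/no) test.

    With one operating point, the ROC curve is the piecewise-linear path
    (0,0) -> (1 - specificity, sensitivity) -> (1,1), whose area is
    (sensitivity + specificity) / 2.
    """
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    auc = (sensitivity + specificity) / 2
    return sensitivity, specificity, auc

# Illustrative counts chosen only to reproduce the reported proportions:
sens, spec, auc = binary_test_metrics(tp=158, fn=842, fp=37, tn=963)
```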
The authors concluded that self-reported visual difficulty is a distinct measure assessing disability and has limited ability in predicting objective visual impairment. Caution is advised when using self-reported visual difficulty as a surrogate measure for objective visual impairment in epidemiological studies, although it may still be an effective way to capture risk of current or future disability.
Clinical Value/Implications
From a clinical standpoint, the results of the survey align with what many of us note in practice, that there is often a disconnect between the patient’s subjective impression of their vision and what we objectively measure and subsequently perceive. Ultimately, in assessing a patient’s visual function, both subjective and objective information must be considered in determining how best to meet individual patient needs.
Assessment of the Predictive Ability of Theranostics for Corneal Cross-Linking in Treating Keratoconus: A Randomized Clinical Trial
Roszkowska AM, Scorcia V, Mencucci R, Giannaccare G, Lombardo G, Alunni Fegatelli D, Vestri A, Bifezzi L, Bernava GM, Serrao S, Lombardo M. Ophthalmology. 2024;131(12):1403-1415. doi:10.1016/j.ophtha.2024.06.012
Question
Does integration of theranostics in an ultraviolet A medical device provide more precise and predictive treatment of keratoconus by corneal cross-linking?
Background/Summary of Findings
Theranostics is an emerging approach of personalized, predictive, and precision medicine; the term refers to the use of simultaneous imaging diagnostics for developing targeted therapies. In corneal cross-linking treatment, ultraviolet A light-triggered theranostics consists of the energy excitation of riboflavin, which reacts sensitively to light irradiation, thereby enabling controlled therapy and imaging of the cornea. In preclinical studies, this therapeutic approach has been demonstrated to accurately predict the biomechanical stiffening effect induced on donor eye bank human corneal tissues treated either by epithelium-off or by epithelium-on corneal cross-linking protocols. This personalized predictive effect is achieved by using 2 imaging biomarkers, the riboflavin score and the theranostic score, generated by a theranostic ultraviolet A device that processes and analyzes the fluorescence emitted by the cornea in real time during cross-linking treatment. The riboflavin score reflects riboflavin stromal concentration, and the theranostic score indicates treatment effect.
This study, the Assessment of Theranostic Guided Riboflavin/UV-A Corneal Cross-linking for Treatment of Keratoconus (ARGO) clinical trial, tested the hypothesis that the riboflavin concentration in the cornea and its effective ultraviolet A light-mediated photoactivation are the main factors influencing corneal cross-linking treatment efficacy in human eyes. The study further explored whether the combined use of the theranostic imaging biomarkers, the riboflavin score and the theranostic score, calculated in real time by the investigational ultraviolet A medical device, could classify eyes and accurately predict the efficacy of corneal cross-linking treatment in halting keratoconus progression (defined as Kmax flattening at 1 year after surgery), regardless of treatment protocol variations and regardless of whether the corneal epithelium was removed.
Using a prospective, randomized, multicenter, masked clinical trial design, 50 patients with progressive keratoconus were stratified to undergo epithelium-off (25 eyes) and epithelium-on (25 eyes) cross-linking protocols using an ultraviolet A medical device with theranostic software. The device controlled ultraviolet A light both for performing corneal cross-linking and assessing the corneal riboflavin concentration (riboflavin score) and treatment effect (theranostic score). A 0.22% riboflavin formulation was applied onto the cornea for 15 minutes and 20 minutes in epithelium-off and epithelium-on protocols, respectively. All eyes underwent 9 minutes of ultraviolet A irradiance at 10 mW/cm2. The primary outcome measure was validation of the combined use of theranostic imaging biomarkers through measurement of their accuracy (proportion of correctly classified eyes) and precision (positive predictive value) to classify eyes correctly and predict a Kmax flattening at 1 year after cross-linking. Other outcome measures included change in Kmax, endothelial cell density, uncorrected and corrected distance visual acuity, manifest spherical equivalent refraction, and central corneal thickness 1 year after corneal cross-linking.
The accuracy and precision of the theranostic imaging biomarkers in predicting eyes that had greater than 0.1 D of Kmax flattening at 1 year were 91% and 95%, respectively. The Kmax value significantly flattened by a median of -1.3 D (IQR, -2.11 to -0.49 D; P < 0.001); both the uncorrected and corrected distance visual acuity improved by a median of -0.1 logMAR (IQR, -0.3 to 0.0 logMAR [P < 0.001] and -0.2 to 0.0 logMAR [P < 0.001], respectively). No significant changes in endothelial cell density (P = 0.33) or central corneal thickness (P = 0.07) were noted 1 year after surgery. The authors concluded that the study demonstrated the efficacy of integrating theranostics in an ultraviolet A medical device for the precise and predictive treatment of keratoconus with epithelium-off and epithelium-on corneal cross-linking protocols. The concentration of riboflavin and its ultraviolet A light-mediated photoactivation in the cornea are the primary factors determining corneal cross-linking efficacy.
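The primary outcome metrics can be made concrete with a confusion matrix: accuracy is the proportion of correctly classified eyes, and precision (positive predictive value) is the proportion of eyes predicted to flatten that actually did. A sketch with hypothetical counts for a 50-eye cohort (illustrative only, not the trial's raw data):

```python
def accuracy_and_ppv(tp, fp, fn, tn):
    """Accuracy = correctly classified / all eyes; PPV = true positives / predicted positives."""
    total = tp + fp + fn + tn
    accuracy = (tp + tn) / total
    ppv = tp / (tp + fp)
    return accuracy, ppv

# Hypothetical 50-eye confusion matrix (illustrative only):
#   predicted flattening, observed:      tp = 38
#   predicted flattening, not observed:  fp = 2
#   predicted no flattening, observed:   fn = 3
#   predicted no flattening, correct:    tn = 7
acc, ppv = accuracy_and_ppv(tp=38, fp=2, fn=3, tn=7)
```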
Clinical Value/Implications
The validation of this ultraviolet A theranostic platform demonstrates the potential that theranostic technologies will have in personalizing medicine in the future. Specifically with cross-linking, the use of theranostic imaging biomarkers that predict treatment efficacy with high accuracy may maximize outcomes in patients with progressive keratoconus.
Visual Loss in Geographic Atrophy: Learnings From the Lampalizumab Trials
Anegondi N, Steffen V, Sadda SR, Schmitz-Valckenberg S, Tufail A, Csaky K, Lad EM, Kaiser PK, Ferrara D, Chakravarthy U. Ophthalmology. 2024:S0161-6420(24)00742-5. doi:10.1016/j.ophtha.2024.11.017. Epub ahead of print. PMID: 39581330.
Question
What is the effect of geographic atrophy lesion growth rate and baseline lesion characteristics, including geographic atrophy area, foveal involvement, and focality, on visual loss as measured by best-corrected visual acuity in patients with age-related macular degeneration?
Background/Summary of Findings
Geographic atrophy is estimated to affect approximately 5 million individuals globally and to account for 26% of legal blindness cases in the United States. Patients with geographic atrophy may experience rapid disease progression. Approximately one-half and one-quarter of geographic atrophy–affected eyes have been shown to lose greater than or equal to 3 lines or greater than or equal to 6 lines of best-corrected visual acuity, respectively, over 2 years using the Early Treatment Diabetic Retinopathy Study charts. Although complement inhibition trials have demonstrated significant reductions in the growth of geographic atrophy lesions when compared with sham treatments, there were no significant differences in functional outcomes. This study's authors sought to determine whether there were correlations between functional and anatomical parameters that might help explain the discordance found in these studies.
Using a retrospective analysis of the lampalizumab phase 3 trials (NCT02247479 and NCT02247531) and a prospective observational study (NCT02479386), patients with bilateral geographic atrophy had best-corrected visual acuity and fundus autofluorescence assessed at baseline and every 6 months for 2 years. Baseline geographic atrophy from autofluorescence images was correlated with best-corrected visual acuity and change in best-corrected visual acuity. Best-corrected visual acuity changes were subgrouped by foveal versus nonfoveal involvement of the geographic atrophy and/or by focality of the geographic atrophy lesions. Time-to-event analyses for best-corrected visual acuity loss of greater than or equal to 5, greater than or equal to 10, and greater than or equal to 15 letters were then performed. Best-corrected visual acuity and geographic atrophy area at baseline did not correlate with best-corrected visual acuity change at any visit. Geographic atrophy growth rate showed a weak correlation with best-corrected visual acuity loss, which increased over time. The 2 highest geographic atrophy growth rate quartiles had accelerated best-corrected visual acuity loss in eyes with subfoveal, unifocal lesions. Approximately 75%, 50%, and 25% of study eyes experienced a loss of greater than or equal to 5, greater than or equal to 10, and greater than or equal to 15 letters by 2 years, respectively.
The authors concluded that best-corrected visual acuity and geographic atrophy area at baseline did not correlate with best-corrected visual acuity loss, but faster geographic atrophy growth rates appeared to be associated with faster best-corrected visual acuity loss. Geographic atrophy foveal involvement and focality correlated with the rate of best-corrected visual acuity loss with subfoveal lesions at high risk of vision loss over time, especially when the geographic atrophy lesion was unifocal.
Clinical Value/Implications
Although best-corrected visual acuity is not the best metric for understanding how patients with macular degeneration lose functionality, this study underscores that the characteristics of each individual geographic atrophy lesion contribute to the rate of best-corrected vision loss. Accordingly, we need to assess each individual's lesion characteristics to determine whether complement inhibition is likely to have a beneficial effect in preserving vision through reducing geographic atrophy lesion growth.