Short-term Detection of Fast Progressors in Glaucoma: The Fast Progression Assessment through Clustered Evaluation (Fast-PACE) Study

Medeiros FA, Malek DA, Tseng H, et al. Ophthalmology. 2024;131(6):645-657. doi:10.1016/j.ophtha.2023.12.031

Question

How accurately can an intensive, clustered testing approach identify eyes with rapid glaucoma progression over 6 months?

Background/Summary of Findings

For many patients with primary open-angle glaucoma, the course of disease is protracted. However, a substantial number of patients progress rapidly and are at risk of functional loss. Identifying rapid progressors as early as possible should provide an enhanced opportunity to prevent blindness. Identifying fast progression is, however, complicated by the test-retest variability of commonly available clinical tools, including standard automated perimetry and optical coherence tomography. To overcome test-retest variability, performing multiple tests (clustering) at the start and finish of a period of observation has been proposed as an effective strategy for assessing progression using trend-based analysis.

The Fast Progression Assessment through Clustered Evaluation (Fast-PACE) study was a prospective cohort study of 125 eyes of 65 participants with primary open-angle glaucoma designed to assess the feasibility and efficacy of detecting fast progressors over a short period through intensive clustering of functional and structural testing.

Participants underwent 2 sets of 5 weekly visits (clusters) separated by an average of 6 months and were then followed with single visits every 6 months, for an overall mean follow-up of 25 months (mean of 17 tests). Each visit consisted of testing with standard automated perimetry 24-2 and 10-2 and spectral-domain optical coherence tomography. Progression was assessed using trend analyses of standard automated perimetry mean deviation and retinal nerve fiber layer thickness. Generalized estimating equations were applied to account for correlations between eyes of the same participant in confidence interval estimation and hypothesis testing. The main outcome measure was the diagnostic accuracy of the 6-month clustering period for identifying progression detected during the overall follow-up.
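For readers who want to see the trend-based criterion concretely, the following is a minimal illustrative sketch rather than the authors' actual analysis: it fits an ordinary least-squares slope to one eye's mean deviation values from two hypothetical visit clusters and flags the eye as progressing when the slope is significantly negative. The example data, the significance level, and the -1 dB/year "fast" cutoff are assumptions for illustration only; the study itself additionally used generalized estimating equations to handle correlation between eyes.

```python
# Minimal sketch of trend-based progression analysis for one eye: fit an
# ordinary least-squares slope to mean deviation (MD) over time and flag the
# eye as progressing if the slope is negative and statistically significant.
# The data, the alpha level, and the -1 dB/year "fast" threshold are
# illustrative assumptions, not the study's actual criteria.
from scipy.stats import linregress

years = [0.00, 0.02, 0.04, 0.06, 0.08, 0.50, 0.52, 0.54, 0.56, 0.58]  # two visit clusters
md_db = [-4.1, -4.3, -4.0, -4.5, -4.2, -4.9, -5.1, -4.8, -5.2, -5.0]  # 24-2 MD in dB

fit = linregress(years, md_db)
progressing = fit.slope < 0 and fit.pvalue < 0.05
fast = progressing and fit.slope <= -1.0  # hypothetical fast-progression cutoff

print(f"MD slope: {fit.slope:.2f} dB/year (p = {fit.pvalue:.3f})")
print(f"progressing: {progressing}, fast progressor: {fast}")
```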

A total of 19 of 125 eyes (15%; confidence interval, 9%-24%) progressed based on standard automated perimetry 24-2 mean deviation over the 6-month clustering period. A total of 14 eyes (11%; confidence interval, 6%-20%) progressed on standard automated perimetry 10-2 mean deviation, and 16 eyes (13%; confidence interval, 8%-21%) progressed by retinal nerve fiber layer thickness, with 30 of 125 eyes (24%; confidence interval, 16%-34%) progressing by function, structure, or both. Of the 35 eyes progressing during the overall follow-up, 25 had progressed during the 6-month clustering period, for a sensitivity of 71% (confidence interval, 53%-85%). Of the 90 eyes that did not progress during the overall follow-up, 85 also did not progress during the 6-month period, for a specificity of 94% (confidence interval, 88%-98%). Of the 14 eyes considered fast progressors by standard automated perimetry 24-2, standard automated perimetry 10-2, or spectral-domain optical coherence tomography during the overall follow-up, 13 were identified as progressing during the 6-month cluster period, for a sensitivity of 93% (confidence interval, 66%-100%) for identifying fast progression with a specificity of 85% (confidence interval, 77%-90%).
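The reported diagnostic accuracy follows directly from the counts given above (25 of 35 progressing eyes were flagged within 6 months; 85 of 90 nonprogressing eyes were correctly not flagged). The short sketch below recomputes those proportions with exact Clopper-Pearson intervals; it only approximates the published figures, because the study's intervals were adjusted for inter-eye correlation with generalized estimating equations, whereas this sketch treats eyes as independent.

```python
# Diagnostic accuracy of the 6-month clustering period, recomputed from the
# counts reported in the article. Exact (Clopper-Pearson) intervals are used,
# treating eyes as independent, so they only approximate the published,
# GEE-adjusted confidence intervals.
from scipy.stats import beta

def clopper_pearson(k, n, alpha=0.05):
    """Exact two-sided binomial confidence interval for k successes out of n."""
    lower = beta.ppf(alpha / 2, k, n - k + 1) if k > 0 else 0.0
    upper = beta.ppf(1 - alpha / 2, k + 1, n - k) if k < n else 1.0
    return lower, upper

sens_k, sens_n = 25, 35  # progressors over follow-up also flagged during the clusters
spec_k, spec_n = 85, 90  # non-progressors also not flagged during the clusters

for label, k, n in [("sensitivity", sens_k, sens_n), ("specificity", spec_k, spec_n)]:
    lo, hi = clopper_pearson(k, n)
    print(f"{label}: {k}/{n} = {k/n:.0%} (95% CI {lo:.0%}-{hi:.0%})")
```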

The authors concluded that clustered testing in the Fast-PACE study detected fast-progressing glaucoma eyes over 6 months and that the methodology used could be applied in clinical trials investigating interventions to slow glaucoma progression and may be of value for short-term assessment of high-risk patients.

Clinical Value/Implications

Detecting fast progressors and intervening accordingly may be the most critical task we perform as glaucoma providers. Determining change under the current paradigm of single structural or functional tests, often separated by a year, is likely insufficient to minimize vision loss in fast progressors. This study reinforces the need to adapt our strategies for those at risk of fast progression, and further research will likely inform how we may employ clustering as that adaptation.

Xue CC, Sim R, Chee ML, et al. Ophthalmology. 2024;131(6):692-699. doi:10.1016/j.ophtha.2023.12.030

Question

Is kidney function associated with age-related macular degeneration in an Asian population?

Background/Summary of Findings

The link between renal insufficiency and macular degeneration has been investigated previously, with varying reports. This review of more than 50 000 patients from several Asian countries, including China, India, Singapore, and Russia, reexamined this potential link by comparing rates of early and late stages of macular degeneration in patients with and without chronic kidney disease (defined as an estimated glomerular filtration rate of less than 60 mL/min per 1.73 m²).

The results showed no significant association with early macular degeneration; however, patients with chronic kidney disease had a 1.5-fold higher risk of late macular degeneration. The association was dose dependent: with each decrement in estimated glomerular filtration rate, the risk of late macular degeneration increased by 12%.

Although the mechanism of the link was not investigated, the authors speculated that shared genetic and pathologic susceptibilities, or increased oxidative stress from chronic kidney disease leading to greater lipid deposition within Bruch’s membrane, may be responsible.

Clinical Value/Implications

This association of late macular degeneration with chronic kidney disease in an Asian population gives us another tool for education and monitoring of patients with these diseases. Further, given the availability, cost, and risks of pegcetacoplan (Syfovre, Apellis Pharmaceuticals), this association may provide retinal specialists another variable to consider in their therapeutic algorithm.

Long-term Follow-up of a Phase 1/2a Clinical Trial of a Stem Cell-Derived Bioengineered Retinal Pigment Epithelium Implant for Geographic Atrophy

Humayun MS, Clegg DO, Dayan MS, et al. Ophthalmology. 2024;131(6):682-691. doi:10.1016/j.ophtha.2023.12.028

Question

Is a stem cell–derived retinal pigment epithelium implant feasible and safe?

Background/Summary of Findings

Given the limited medical treatment options available for improving vision in patients with geographic atrophy, novel therapies are being investigated. This study reviewed the safety and feasibility of implanting a monolayer of human embryonic stem cell–derived retinal pigment epithelium cells on a carrier parylene membrane. Parylene is a bioinert polymer that forms a very thin film (2-20 µm), conforms readily to complex shapes, and is already Food and Drug Administration approved for human implant devices. The implants were placed in 15 eyes of 15 patients over areas of geographic atrophy. The worse-sighted eye was the operative eye and had to be 20/200 or worse before the procedure. At the conclusion of surgery, the vitreous cavity was filled with silicone oil.

Side effects and complications were tracked over time. No unexpected adverse outcomes occurred. There was no evidence of an intraocular immune response to the implant, and no antibodies specific to the implant were detected. Intraoperative complications consisted primarily of bleeding. Although safety and feasibility were the primary concerns of this study, eyes that received the implant were less likely to worsen than fellow-eye controls and were more likely to gain some vision, although the effect was not profound.

Clinical Value/Implications

This study indicates that surgical placement of composite stem cell–derived retinal pigment epithelium implants over areas of geographic atrophy is possible and generally safe. Further studies will be needed to determine the efficacy of this possible tool in treating geographic atrophy.

Effect of Low-Concentration Atropine Eyedrops vs Placebo on Myopia Incidence in Children: The LAMP2 Randomized Clinical Trial

Yam JC, Zhang XJ, Zhang Y, et al. JAMA. 2023;329(6):472-481. doi:10.1001/jama.2022.24162

Question

What are factors associated with the efficacy of low-concentration atropine in delaying myopia onset in the Low-concentration Atropine for Myopia Prevention (LAMP2) study?

Background/Summary of Findings

The Low-concentration Atropine for Myopia Prevention (LAMP2) study has shown that 0.05% atropine eye drops can reduce the incidence of myopia onset by 47% over 2 years. However, not all children are at equal risk of developing myopia. Targeting children at risk for myopia while avoiding unnecessary treatment in those unlikely to develop it would be ideal. In this secondary analysis of LAMP2, the authors evaluated factors associated with the efficacy of low-concentration atropine in delaying myopia onset to better identify which patients will benefit from atropine prophylaxis.

Participants who completed the 2-year follow-up in the LAMP2 study were included. In the original LAMP2 study, participants were nonmyopic children aged 4 to 9 years with a cycloplegic spherical equivalent between +1.00 D and 0.00 D who were randomized to receive 0.05% atropine, 0.01% atropine, or placebo eye drops once nightly in both eyes. A total of 353 participants (74.5%) completed the 2-year follow-up. Over 2 years, low baseline hyperopic reserve and a high level of parental myopia were associated with increased risk of myopia onset, faster spherical equivalent progression, and greater axial length elongation.

Subgroup analysis found that in the 0.05% atropine group, spherical equivalent progression over 2 years was similar across levels of baseline hyperopic reserve. In the 0.01% atropine and placebo groups, however, spherical equivalent progression was affected by baseline hyperopic reserve, with lower baseline hyperopic reserve associated with faster progression. Similar trends were observed for axial length elongation. Among participants with a baseline hyperopic reserve of less than +0.75 D, spherical equivalent progression over 2 years was less in the 0.05% atropine group than in the placebo and 0.01% atropine groups. Among participants with a baseline hyperopic reserve between +0.75 D and +1.00 D, spherical equivalent progression did not differ significantly among treatment groups. Similar relationships were found with axial length.

These results indicate that over 2 years, 0.05% atropine is more effective than 0.01% atropine or placebo for eyes with less hyperopic reserve, and the authors suggest that a hyperopic reserve of +0.75 D may serve as a potential cutoff for offering 0.05% atropine as myopia prophylaxis.

Clinical Value/Implications

Preventively treating those who will likely benefit from 0.05% atropine is preferable to treating all nonmyopic children with low levels of hyperopia. The cutoff of +0.75 D of hyperopic reserve identified in this secondary analysis of the LAMP2 cohort may provide a marker with the potential to inform practitioners and families in deciding whether to commence atropine prophylaxis.

Ophthalmic and Systemic Factors of Acute Nonarteritic Anterior Ischemic Optic Neuropathy in the Quark207 Treatment Trial

Kupersmith MJ, Fraser CL, Morgenstern R, et al. Ophthalmology. 2024;131(7):790-802. doi:10.1016/j.ophtha.2024.01.011

Question

What are the baseline ophthalmic and cardiovascular risk factors across countries, race, and sex for the Quark207 treatment trial for acute nonarteritic anterior ischemic optic neuropathy?

Background/Summary of Findings

Nonarteritic anterior ischemic optic neuropathy is the most common acute optic neuropathy in older adults. Studies most commonly implicate ischemia as the primary driver of optic nerve damage, and cardiovascular risk factors are consistently associated. There is, however, disagreement about whether nonarteritic anterior ischemic optic neuropathy is associated with an increased frequency of all, or just some, cardiovascular risk factors.

Seven hundred twenty-nine participants with acute unilateral nonarteritic anterior ischemic optic neuropathy in 8 countries across 4 continents were enrolled in a multicenter, prospective, randomized controlled trial (the Quark207 trial) designed to examine the safety and efficacy of intravitreal delivery of a silencing RNA directed against caspase 2 in patients with recent-onset nonarteritic anterior ischemic optic neuropathy. Given the study’s design, data on age, race, sex, and associated systemic diseases, particularly cardiovascular disease, could be collected. Additionally, best-corrected visual acuity and visual field total deviation were evaluated, as was the cup-to-disc ratio of the uninvolved fellow eye.

This report on the baseline clinical and laboratory characteristics of enrolled participants was stratified by country, race, and sex. The overall prevalence of cardiovascular risk factors was as follows: hypertension, 30.3%; diabetes mellitus, 15.1%; hyperlipidemia, 15.2%; obesity, 35.6%; tobacco smoking, 10.2%; and sleep apnea, 19.1%; most participants had between 1 and 3 of these risk factors. Although cardiovascular risk varied by country, it was not typically worse in patients with acute nonarteritic anterior ischemic optic neuropathy than in the general population. Ophthalmic features were consistent across countries, race, and sex, with the exception of China, which had worse average best-corrected visual acuity and total deviation. The most prevalent ophthalmic risk factor across all demographics was a small cup-to-disc ratio in the uninvolved fellow eye (average vertical/horizontal ratio, 0.1). The authors concluded that although a small cup-to-disc ratio is a predisposing factor, consistent with past studies, the finding that cardiovascular disease was no more prevalent than in the general population may indicate that cardiovascular disease predisposes to nonarteritic anterior ischemic optic neuropathy but that other, as yet unidentified, features may be the actual trigger or cause of the condition.

Clinical Value/Implications

Although this report from Quark207 suggests that a universal trigger for nonarteritic anterior ischemic optic neuropathy has yet to be identified, the consistently reported association with cardiovascular disease should prompt us to advocate that our patients with these risk factors, especially those with concomitant predisposing ophthalmic features, seek to optimize their systemic health.

Risk of Recurrence in Acute Anterior Uveitis

Brodie JT, Thotathil AZ, Jordan CA, Sims J, Niederer RL. Ophthalmology. Published online June 7, 2024. doi:10.1016/j.ophtha.2024.06.003

Question

What is the frequency of recurrence in patients with acute anterior uveitis in a single tertiary ophthalmic care center, and what are risk factors for that recurrence?

Background/Summary of Findings

The uveitides are a heterogeneous collection of ocular diseases characterized by inflammation of the uveal tissue. The anterior uveitides make up the majority of uveitis cases (92%), and although the risk of visual morbidity is lower than with other anatomic locations, vision loss can still complicate these conditions. The risk of long-term vision loss from recurrent forms of anterior uveitis has not been well studied; therefore, a clearer picture of any risks conferred by recurrence would help in understanding the visual prognosis and disease course of patients with recurrent disease.

Employing a retrospective cohort design at a single tertiary care center (Te Whatu Ora, Auckland, New Zealand), 2763 eyes of 2092 participants with acute anterior uveitis who presented for care between 2008 and 2021 were studied, with a median follow-up of 8.9 years. Recurrence occurred in the ipsilateral eye in 1258 eyes (45.5%) and in the contralateral eye in 522 eyes (27.3%). Rates of ipsilateral recurrence over 10 years were 38.1% for idiopathic disease, 43.2% for human leukocyte antigen B27/inflammatory arthritis, and 44.9% for viral uveitis. On multivariate analysis, the following were associated with increased risk of ipsilateral recurrence: older age (P < .001), Māori ethnicity (P = .006), Asian ethnicity (P < .001), human leukocyte antigen B27/inflammatory arthritis (P < .001), and viral uveitis (P = .018). Contralateral recurrence at 10 years was 15.2% in idiopathic uveitis, 37.6% in human leukocyte antigen B27/inflammatory arthritis, and 2.0% in viral uveitis. Risk factors identified for contralateral eye involvement were Māori ethnicity (P = .003), Pasifika (Pacific Islander) ethnicity (P = .021), and human leukocyte antigen B27/inflammatory arthritis (P < .001). Moderate vision loss (20/50 or worse) was present in 411 eyes (14.9%) at final follow-up and was more common when the time to first recurrence was shorter (P < .001).

The authors noted that approximately half of patients with acute anterior uveitis will develop recurrence in the ipsilateral eye and a quarter will have recurrence in the contralateral eye.

Clinical Value/Implications

Although these results, coming from a single center in one region of the world, may not be fully generalizable, they should compel us to educate our patients with acute anterior uveitis that they face a significant chance of future recurrence and a risk of moderate vision loss. Increased patient awareness and an understanding of the need to seek timely care if recurrence occurs should improve the chance of minimizing complications.

The Efficacy and Safety of Standard versus Soft Topical Steroids after Cataract Surgery: A Systematic Review and Meta-analysis

Noyman DBE, Chan CC, Mimouni M, Safir M. Ophthalmology. 2024;131(5):595-610. doi:10.1016/j.ophtha.2023.11.022

Question

In postoperative cataract care, does steroid choice alter healing time or risks of steroid-related side effects?

Background/Summary of Findings

Cataract surgery is among the most frequently performed surgeries worldwide, with a generally standardized surgical technique. However, postoperative care, especially postoperative anti-inflammatory use, may vary widely from surgeon to surgeon, particularly in regard to which steroidal agent is used. More potent steroids clear inflammation more rapidly but also carry a greater risk of side effects, particularly an increase in intraocular pressure. This meta-analysis reviewed the efficacy and safety of “standard” steroid regimens (prednisolone acetate 1% or dexamethasone 0.1%) versus so-called “soft” steroids (loteprednol etabonate 0.5% or fluorometholone 0.1%). After exclusion criteria were applied, 7 studies describing the management of 593 eyes were analyzed.

The analysis showed little difference between a “standard” (prednisolone or dexamethasone) and a “soft” (fluorometholone or loteprednol) steroid approach in either resolution of inflammation or risks. Eyes were monitored at 1, 7, and 28 days postoperatively. At day 7, the standard steroids achieved slightly greater control of inflammation, as reflected by a lower aqueous flare grading. The authors state that although this effect was statistically significant, it did not appear to be clinically significant, with visual acuity the same between groups. Likewise, the soft steroids achieved a statistically significant reduction in intraocular pressure compared with the standard group only at day 7, with the effect dissipating by the final follow-up.

Clinical Value/Implications

This study suggests that for the average patient having cataract surgery, the specific steroid used postoperatively is probably not a cause of differences in healing or of the complications encountered. The study is limited by the breadth of its focus; the authors acknowledge that selecting different steroid regimens for known steroid responders, or for patients at high risk of postoperative inflammation, remains reasonable. For the general population of patients undergoing cataract surgery, however, a standard steroid approach is safe and effective.