Publications

2024

Lai, Michelle, Simon T Dillon, Xuesong Gu, Tina L Morhardt, Yuyan Xu, Noel Y Chan, Beibei Xiong, et al. 2024. “Serum Protein Risk Stratification Score for Diagnostic Evaluation of Metabolic Dysfunction-Associated Steatohepatitis.” Hepatology Communications 8 (12). https://doi.org/10.1097/HC9.0000000000000586.

BACKGROUND: Reliable, noninvasive tools to diagnose at-risk metabolic dysfunction-associated steatohepatitis (MASH) are urgently needed to improve management. We developed a risk stratification score combining proteomics-derived serum markers with clinical variables to identify high-risk patients with MASH (NAFLD activity score >4 and fibrosis stage ≥2).

METHODS: In this 3-phase proteomic study of biopsy-proven metabolic dysfunction-associated steatotic liver disease (MASLD), we first developed a multi-protein predictor for discriminating NAFLD activity score >4 based on SOMAscan proteomics quantifying 1305 serum proteins from 57 US patients. Four key predictor proteins were verified by ELISA in the expanded US cohort (N = 168) and combined with clinical variables to create the 9-feature MASH Dx score, which predicted MASH as well as high-risk MASH (F2+). The MASH Dx score was validated in 2 independent, external cohorts from Germany (N = 139) and Brazil (N = 177).
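
To make the modeling pattern concrete, here is a minimal Python sketch of the general approach described above (a logistic model combining ELISA markers with clinical variables, evaluated by AUC). This is not the authors' code; all column names and the data layout are assumptions:

    # Illustrative only: feature names and dataframe layout are assumed.
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    FEATURES = ["THBS2", "GDF15", "SELE", "IGFBP7",               # ELISA markers
                "age", "bmi", "alt", "diabetes", "hypertension"]  # clinical variables

    def fit_risk_score(train: pd.DataFrame, test: pd.DataFrame):
        """Fit a 9-feature logistic model and report discrimination (AUC)."""
        model = LogisticRegression(max_iter=1000)
        model.fit(train[FEATURES], train["mash"])          # 1 = biopsy-proven MASH
        prob = model.predict_proba(test[FEATURES])[:, 1]   # predicted MASH risk
        return model, roc_auc_score(test["mash"], prob)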

RESULTS: The discovery phase identified a 6-protein classifier that achieved an AUC of 0.93 for identifying MASH. Significant elevation of 4 proteins (THBS2, GDF15, SELE, and IGFBP7) was verified by ELISA in the expanded discovery cohort and independently in the 2 external cohorts. The MASH Dx score incorporated these proteins with established MASH risk factors (age, body mass index, ALT, diabetes, and hypertension) to achieve good discrimination between MASH and MASLD without MASH (AUC: 0.87 in discovery; 0.83 in the pooled external validation cohorts), with similar performance when evaluating high-risk MASH F2-4 (vs. MASH F0-1 and MASLD without MASH).

CONCLUSIONS: The MASH Dx score offers the first reliable noninvasive approach combining novel, biologically plausible ELISA-based fibrosis markers and clinical parameters to detect high-risk MASH in patient cohorts from the United States, Brazil, and Europe.

Ke, Janny X C, Tim T H Jen, Sihaoyu Gao, Long Ngo, Lang Wu, Alana M Flexman, Stephan K W Schwarz, Carl J Brown, and Matthias Görges. 2024. “Development and Internal Validation of Time-to-Event Risk Prediction Models for Major Medical Complications Within 30 Days After Elective Colectomy.” PloS One 19 (12): e0314526. https://doi.org/10.1371/journal.pone.0314526.

BACKGROUND: Patients undergoing colectomy are at risk of numerous major complications. However, existing binary risk stratification models do not predict when a patient may be at highest risk of each complication. Accurate prediction of the timing of complications facilitates targeted, resource-efficient monitoring. We sought to develop and internally validate Cox proportional hazards models to predict time-to-complication for major complications within 30 days after elective colectomy.

METHODS: We studied a retrospective cohort from the multicenter American College of Surgeons National Surgical Quality Improvement Program procedure-targeted colectomy dataset. Patients aged 18 years or older who underwent elective colectomy between January 1, 2014 and December 31, 2019 were included. A priori candidate predictors were selected based on variable availability, literature review, and multidisciplinary team consensus. Outcomes were mortality, hospital readmission, myocardial infarction, cerebrovascular events, pneumonia, venous thromboembolism, acute renal failure, and sepsis or septic shock within 30 days after surgery.
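
A minimal sketch of one such time-to-event model, using the lifelines library, may help make the setup concrete. This is not the study's code; the predictor list and column names are assumptions:

    # Illustrative only: one Cox proportional hazards model per complication.
    import pandas as pd
    from lifelines import CoxPHFitter

    def fit_complication_model(df: pd.DataFrame, predictors: list) -> CoxPHFitter:
        """df needs 'days_to_event' (postoperative day of complication,
        censored at day 30) and 'event' (1 = complication, 0 = censored)."""
        cph = CoxPHFitter()
        cph.fit(df[predictors + ["days_to_event", "event"]],
                duration_col="days_to_event", event_col="event")
        return cph  # cph.concordance_index_ is the c-statistic reported below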

RESULTS: The cohort consisted of 132,145 patients (mean ± SD age, 61 ± 15 years; 52% female). Complication rates ranged from 0.3% (n = 383) for cardiac arrest and acute renal failure to 5.3% (n = 6,986) for bleeding requiring transfusion, with a readmission rate of 8.6% (n = 11,415). We observed distinct temporal patterns for each complication: the median [quartiles] postoperative day of complication diagnosis ranged from 1 [0, 2] days for bleeding requiring transfusion to 12 [6, 18] days for venous thromboembolism. Models for mortality, myocardial infarction, pneumonia, and renal failure showed good discrimination with a concordance > 0.8, while models for readmission, venous thromboembolism, and sepsis performed poorly with a concordance of 0.6 to 0.7. Models exhibited good calibration, although the observed ranges were limited to low-probability regions.

CONCLUSIONS: We developed and internally validated time-to-event prediction models for complications after elective colectomy. Once further validated, these models can facilitate tailored monitoring of high-risk patients during high-risk periods.

TRIAL REGISTRATION: Clinicaltrials.gov (NCT05150548; Principal Investigator: Janny Xue Chen Ke, M.D., M.Sc., F.R.C.P.C.; initial posting: November 25, 2021).

Weissman, Gary E, Laura Zwaan, and Sigall K Bell. 2024. “Diagnostic Scope: The AI Can’t See What the Mind Doesn’t Know.” Diagnosis. https://doi.org/10.1515/dx-2024-0151.

BACKGROUND: Diagnostic scope is the range of diagnoses found in a clinical setting. Although the diagnostic scope is an essential feature of training and evaluating artificial intelligence (AI) systems to promote diagnostic excellence, its impact on AI systems and the diagnostic process remains under-explored.

CONTENT: We define the concept of diagnostic scope, discuss its nuanced role in building safe and effective AI-based diagnostic decision support systems, review current challenges to measurement and use, and highlight knowledge gaps for future research.

SUMMARY: The diagnostic scope parallels the differential diagnosis, although the latter applies at the level of an encounter and the former at the level of a clinical setting. Diagnostic scope will therefore vary by local characteristics, including geography, population, and resources. The true, observed, and considered scope in each setting may also diverge, posing challenges for clinicians, patients, and AI developers while also highlighting opportunities to improve safety. Further work is needed to systematically define and measure diagnostic scope in terms that are accurate, equitable, and meaningful at the bedside. AI tools tailored to a particular setting, such as a primary care clinic or intensive care unit, will each require specifying and measuring the appropriate diagnostic scope.

OUTLOOK: AI tools will promote diagnostic excellence if they are aligned with patient and clinician needs and trained on an accurately measured diagnostic scope. A careful understanding and rigorous evaluation of the diagnostic scope in each clinical setting will promote optimal care through human-AI collaborations in the diagnostic process.

Glenn, Andrea J, Fenglei Wang, Anne-Julie Tessier, JoAnn E Manson, Eric B Rimm, Kenneth J Mukamal, Qi Sun, et al. 2024. “Dietary Plant-to-Animal Protein Ratio and Risk of Cardiovascular Disease in 3 Prospective Cohorts.” The American Journal of Clinical Nutrition 120 (6): 1373-86. https://doi.org/10.1016/j.ajcnut.2024.09.006.

BACKGROUND: Dietary guidelines recommend substituting animal protein with plant protein; however, the ideal ratio of plant-to-animal protein (P:A) remains unknown.

OBJECTIVES: We aimed to evaluate associations between the P:A ratio and incident cardiovascular disease (CVD), coronary artery disease (CAD), and stroke in 3 cohorts.

METHODS: Multivariable-adjusted Cox proportional hazards models were used to estimate hazard ratios (HRs) for CVD outcomes among 70,918 females in the Nurses' Health Study (NHS) (1984-2016), 89,205 females in the NHSII (1991-2017), and 42,740 males in the Health Professionals Follow-up Study (1986-2016). The P:A ratio was based on percent energy from plant and animal protein and was assessed using food frequency questionnaires every 4 years.
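
As an illustration of how such a ratio-based exposure can be analyzed (not the authors' code; all column names are assumed, and the covariates shown are stand-ins for the full multivariable adjustment):

    # Illustrative only: build the P:A ratio and fit a Cox model for incident CVD.
    import pandas as pd
    from lifelines import CoxPHFitter

    def fit_pa_ratio_model(df: pd.DataFrame) -> CoxPHFitter:
        df = df.assign(pa_ratio=df["pct_energy_plant_protein"]
                       / df["pct_energy_animal_protein"])
        df["pa_decile"] = pd.qcut(df["pa_ratio"], 10, labels=False)  # 0 = lowest decile
        cph = CoxPHFitter()
        cph.fit(df[["pa_decile", "age", "bmi", "smoker", "time_to_cvd", "cvd"]],
                duration_col="time_to_cvd", event_col="cvd")
        return cph  # cph.hazard_ratios_["pa_decile"] ~ HR per one-decile increase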

RESULTS: During 30 years of follow-up, 16,118 incident CVD cases occurred. In the pooled multivariable-adjusted models, comparing the highest with the lowest decile of the P:A ratio (∼0.76 vs. ∼0.24), participants had a lower risk of total CVD [HR: 0.81; 95% confidence interval (CI): 0.76, 0.87; P trend < 0.001] and of CAD (HR: 0.73; 95% CI: 0.67, 0.79; P trend < 0.001), but not of stroke (HR: 0.98; 95% CI: 0.88, 1.09; P trend = 0.71). Dose-response analyses showed evidence of linear and nonlinear relationships for CVD and CAD, with more marked risk reductions early in the dose-response curve. Lower risks of CVD (HR: 0.72; 95% CI: 0.64, 0.82) and CAD (HR: 0.64; 95% CI: 0.55, 0.75) were also observed when higher ratios were combined with higher protein density (20.8% of energy). Substitution analyses indicated that replacing red and processed meat with several plant protein sources showed the greatest cardiovascular benefit.

CONCLUSIONS: In cohort studies of United States adults, a higher plant-to-animal protein ratio is associated with lower risks of CVD and CAD, but not stroke. Furthermore, a higher ratio combined with higher protein density showed the most cardiovascular benefit.

Lin, Pi-I Debby, Andres Cardenas, Lisa B Rokoff, Sheryl L Rifas-Shiman, Mingyu Zhang, Julianne Botelho, Antonia M Calafat, et al. 2024. “Associations of PFAS Concentrations During Pregnancy and Midlife With Bone Health in Midlife: Cross-Sectional and Prospective Findings from Project Viva.” Environment International 194: 109177. https://doi.org/10.1016/j.envint.2024.109177.

BACKGROUND: PFAS may impair bone health, but the effects of PFAS exposure assessed during pregnancy and the perimenopause (life stages marked by rapidly changing bone metabolism) on later-life bone health are unknown.

METHODS: We studied 531 women in the Boston-area Project Viva cohort. We used multivariable linear, generalized additive, and mixture models to examine associations of plasma PFAS concentrations during early pregnancy [median (IQR) maternal age 32.9 (6.2) years] and midlife [age 51.2 (6.3)] with lumbar spine, total hip, and femoral neck areal bone mineral density (aBMD) and bone turnover biomarkers assessed in midlife. We examined effect modification by diet and physical activity measured at the time of PFAS exposure assessment and by menopausal status in midlife.
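
The "per doubling" estimates reported below can be read as coefficients on log2-transformed exposures. A minimal sketch of that idea (not the study's code; the outcome, covariates, and column names are assumed):

    # Illustrative only: change in spine aBMD T-score per doubling of a PFAS.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    def per_doubling_estimate(df: pd.DataFrame, pfas: str = "PFOA") -> float:
        df = df.assign(log2_pfas=np.log2(df[pfas]))
        model = smf.ols("spine_tscore ~ log2_pfas + age + bmi + menopause",
                        data=df).fit()
        return model.params["log2_pfas"]  # T-score difference per doubling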

RESULTS: Participants had higher PFAS concentrations during pregnancy [1999-2000; e.g., PFOA median (IQR) 5.4 (3.8) ng/mL] than in midlife [2017-2021; e.g., PFOA: 1.5 (1.2) ng/mL]. Women with higher PFOA, PFOS, and PFNA during pregnancy had higher midlife aBMD, especially of the spine [e.g., 0.28 (95% CI: 0.07, 0.48) higher spine aBMD T-score per doubling of PFOA], with stronger associations observed among those with higher diet quality. In contrast, higher concentrations of all PFAS measured in midlife were associated with lower concurrent aBMD at all sites [e.g., -0.21 (-0.35, -0.07) lower spine aBMD T-score per doubling of PFOA]; associations were stronger among those who were postmenopausal. The associations of several PFAS with bone resorption (loss) were also stronger among postmenopausal versus premenopausal women.

DISCUSSION: Plasma PFAS measured during pregnancy versus in midlife had different associations with midlife aBMD. We found an adverse association of PFAS measured in midlife with midlife aBMD, particularly among postmenopausal women. Future studies with longer follow-up are needed to elucidate the effect of PFAS on bone health during the peri- and postmenopausal years.

Li, Xinyi, Jinhee Hur, Yin Zhang, Mingyang Song, Stephanie A Smith-Warner, Liming Liang, Kenneth J Mukamal, Eric B Rimm, and Edward L Giovannucci. 2024. “Drinking Pattern and Time Lag of Alcohol Consumption With Colorectal Cancer Risk in US Men and Women.” Journal of the National Cancer Institute. https://doi.org/10.1093/jnci/djae330.

BACKGROUND: The association between light-to-moderate alcohol consumption and colorectal cancer (CRC) incidence remains understudied, especially with regard to drinking pattern, beverage type, and temporal aspects.

METHODS: Hazard ratios (HRs) and 95% confidence intervals (CIs) for time to CRC diagnosis were estimated among 137,710 participants. Estimates based on remote (e.g., >10 years before follow-up) and recent (e.g., the 10 years immediately preceding follow-up) alcohol intake, using different cutoffs (e.g., 8, 10, and 12 years) and mutual adjustment, enabled us to separate independent effects and investigate the time lag of the alcohol-CRC association.
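
The mutual-adjustment idea is simply to enter remote and recent intake in the same model so that each estimate is adjusted for the other. A minimal sketch under assumed column names (not the authors' code):

    # Illustrative only: remote and recent alcohol intake in one Cox model.
    import pandas as pd
    from lifelines import CoxPHFitter

    def fit_time_lag_model(df: pd.DataFrame) -> CoxPHFitter:
        """df: cumulative average alcohol intake (g/day) >10 years before
        follow-up ('alcohol_remote') and in the preceding 10 years
        ('alcohol_recent'), plus follow-up time and CRC status."""
        cph = CoxPHFitter()
        cph.fit(df[["alcohol_remote", "alcohol_recent", "age",
                    "time_to_crc", "crc"]],
                duration_col="time_to_crc", event_col="crc")
        return cph  # a stronger remote-intake HR is consistent with an 8-12 year lag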

RESULTS: In total, 3,599 CRC cases were documented over three decades of follow-up. Light-to-moderate drinking was associated with an increased CRC risk only in men: HRs (95% CIs) for 5-14.9 and 15-29.9 vs 0 g/day of alcohol intake were 1.19 (1.01, 1.41) and 1.38 (1.13, 1.67), respectively. In women, the corresponding HRs for 0.1-4.9 and 5-14.9 vs 0 g/day were 1.07 (0.96, 1.20) and 1.05 (0.91, 1.20). Drinkers with both high drinking frequency and high daily intake had the highest CRC risk, suggesting that total alcohol intake was the critical factor. We estimated the time lag between alcohol consumption and CRC occurrence to be 8 to 12 years. Former drinkers did not experience a significant reduction in CRC risk even after 10 years of quitting or reducing consumption.

CONCLUSIONS: Based on two cohorts of health professionals, our findings suggest that the increased risk of CRC associated with alcohol intake is mainly driven by total quantity and remote intake. Former drinkers did not experience an immediate reduction in CRC risk after quitting or reducing consumption.

Hashimoto, Tadayuki, Rachel K Putman, Anthony F Massaro, Youkie Shiozawa, Katherine McGough, Kerry K McCabe, Judith A Linden, et al. 2024. “Study Protocol for a Randomized Controlled Trial: Integrating the ‘Time-Limited Trial’ in the Emergency Department.” PloS One 19 (12): e0313858. https://doi.org/10.1371/journal.pone.0313858.

INTRODUCTION: A time-limited trial (TLT) is a structured approach in which clinicians and seriously ill patients or their surrogates discuss the patient's values and preferences and the prognosis, and use shared decision-making to try specific therapies for a prespecified period of time in the face of prognostic uncertainty. Some evidence suggests that this approach may lead to more patient-centered care in the intensive care unit; however, it has never been evaluated in the emergency department (ED). This study aims to assess the feasibility and acceptability of TLTs initiated in the ED.

METHODS AND ANALYSIS: We will conduct a parallel-group, clinician-level, pilot randomized clinical trial among 40 ED clinicians. We will measure feasibility (e.g., the time it takes ED clinicians to conduct the TLTs) and clinician- and patient-reported acceptability of the TLT, and will also track patients' clinical outcomes via medical record review.

DISCUSSION: This study will investigate the potential of TLTs initiated in the ED to lead to more patient-centered intensive care utilization. In doing so, it aims to improve palliative care integration for seriously ill older adults in the ED and the intensive care unit.

TRIAL IDENTIFIER AND REGISTRY NAME: ClinicalTrials.gov ID NCT06378151 (https://clinicaltrials.gov/study/NCT06378151); pre-results. A Randomized Controlled Trial: Time-Limited Trials in the Emergency Department.

Dang, My T, Yen N Le, Sarah Naz-McLean, Nhung T T Vo, Phuong T Do, Linh T T Doan, Nhan T Do, et al. 2024. “Anticipated Facilitators and Barriers for Long-Acting Injectable Antiretrovirals As HIV Treatment and Prevention in Vietnam: A Qualitative Study Among Healthcare Workers.” BMC Infectious Diseases 24 (1): 1462. https://doi.org/10.1186/s12879-024-10352-w.

BACKGROUND: Long-acting injectable antiretrovirals (LAI-ARVs) for HIV prevention and treatment have been demonstrated in clinical trials to be non-inferior to daily oral medications, providing an additional option to help users overcome the challenges of daily adherence. Approval and implementation of these regimens in low- and middle-income settings have been limited.

METHODS: This study describes the anticipated barriers and facilitators to implementing LAI-ARVs in Vietnam to inform future roll-out. From July to August 2022, we conducted 27 in-depth interviews with healthcare workers and public health stakeholders involved in HIV programs at the national, provincial, and clinic levels across four provinces in Vietnam. The interviews followed a semi-structured questionnaire and were audio-recorded. Data were analyzed using a rapid thematic analysis approach to identify facilitators of and barriers to the adoption of LAI-ARVs.

RESULTS: In total, 27 participants from 4 provinces were interviewed, including 14 (52%) men and 13 (48%) women. Participants' median age was 48 years, and they had 11.5 years of experience with HIV services and programs. Perceived user-level facilitators included the greater convenience of injectables compared with oral regimens, while barriers included the increased frequency of visits, fear of pain and side effects, and cost. Clinic-level facilitators included existing technical capacity to administer injections and available physical storage in district health centers, while barriers included a lack of space and equipment for administering injections for HIV-related services, concerns about cold-chain maintenance for LAI-ARVs, and the added workload for healthcare workers. Health system-level facilitators included existing mechanisms for medication distribution, while barriers included regulatory approval processes and concerns about supply chain continuity.

CONCLUSION: Overall, participants were optimistic about the potential impact of LAI-ARVs but highlighted important considerations at multiple levels needed to ensure successful implementation in Vietnam.

CLINICAL TRIAL NUMBER: Not applicable.