Columns: id (string, 16-27 chars) | title (string, 18-339 chars) | abstract (string, 95-38.7k chars) | category (string, 7-44 chars)
10.1101/2021.01.03.20248997
Augmented curation of clinical notes of COVID-19 and influenza patients reveals that long-term neuropsychiatric and coagulopathic symptoms are more associated with COVID-19
After one year of the COVID-19 pandemic, over 130 million individuals worldwide have been infected with the novel coronavirus, yet the post-acute sequelae of COVID-19 (PASC), also referred to as the long COVID syndrome, remains mostly uncharacterized. We leveraged machine-augmented curation of the physician notes from electronic health records (EHRs) across the multi-state Mayo Clinic health system to retrospectively contrast the occurrence of symptoms and diseases in COVID-19 patients in the post-COVID period relative to the pre-COVID period (n=6,413). Through comparison of the frequency of 10,039 signs and symptoms before and after diagnosis, we identified an increase in hypertensive chronic kidney disease (OR 47.3, 95% CI 23.9-93.6, p=3.50x10^-9), thromboembolism (OR 3.84, 95% CI 3.22-4.57, p=1.18x10^-4), and hair loss (OR 2.44, 95% CI 2.15-2.76, p=8.46x10^-3) in COVID-19 patients three to six months after diagnosis. The sequelae associated with long COVID were notably different among male vs female patients and patients above vs under 55 years old, with the hair loss enrichment found primarily in females and the thromboembolism enrichment in males. These findings compel targeted investigations into what may be persistent dermatologic, cardiovascular, and coagulopathic phenotypes following SARS-CoV-2 infection.
health informatics
10.1101/2021.01.03.20248997
Hair loss in females and thromboembolism in males are significantly enriched in post-acute sequelae of COVID (PASC) relative to recent medical history
After one year of the COVID-19 pandemic, over 130 million individuals worldwide have been infected with the novel coronavirus, yet the post-acute sequelae of COVID-19 (PASC), also referred to as the long COVID syndrome, remains mostly uncharacterized. We leveraged machine-augmented curation of the physician notes from electronic health records (EHRs) across the multi-state Mayo Clinic health system to retrospectively contrast the occurrence of symptoms and diseases in COVID-19 patients in the post-COVID period relative to the pre-COVID period (n=6,413). Through comparison of the frequency of 10,039 signs and symptoms before and after diagnosis, we identified an increase in hypertensive chronic kidney disease (OR 47.3, 95% CI 23.9-93.6, p=3.50x10^-9), thromboembolism (OR 3.84, 95% CI 3.22-4.57, p=1.18x10^-4), and hair loss (OR 2.44, 95% CI 2.15-2.76, p=8.46x10^-3) in COVID-19 patients three to six months after diagnosis. The sequelae associated with long COVID were notably different among male vs female patients and patients above vs under 55 years old, with the hair loss enrichment found primarily in females and the thromboembolism enrichment in males. These findings compel targeted investigations into what may be persistent dermatologic, cardiovascular, and coagulopathic phenotypes following SARS-CoV-2 infection.
health informatics
10.1101/2021.01.04.21249219
The impact of vitamin D supplementation on mortality rate and clinical outcomes of COVID-19 patients: A systematic review and meta-analysis
Background: Several studies have suggested a positive impact of vitamin D on patients infected with SARS-CoV-2. This systematic review aims to evaluate the effects of vitamin D supplementation on clinical outcomes and mortality rate of COVID-19 patients. Methods: A comprehensive search was conducted through the databases of PubMed, Scopus, Web of Knowledge, Embase, Ovid, and The Cochrane Library, with no limitation in time or language, until December 16, 2020. The results were screened based on their accordance with the subject. Two independent reviewers selected the eligible studies and extracted the outcomes of interest. Using the Joanna Briggs Institute (JBI) Critical Appraisal Tools for Randomized Controlled Trials (RCTs) and Quasi-Experimental Studies, the remaining results were critically appraised. Statistical analysis was performed using the Comprehensive Meta-Analysis (CMA) software version 2.0. Results: Of the 2311 results, 1305 duplicates were removed. After screening the titles, abstracts, and full texts of the remaining records, four studies and 259 patients were enrolled, including 139 patients in vitamin D intervention groups. Three of the studies evaluated patients' survival and mortality rate. The pooled analysis of these studies showed a significantly lower mortality rate in the intervention groups (10.56%) compared with the control groups (23.88%) (OR = 0.264, 95% CI = 0.099-0.708, p-value = 0.008). Two of the studies reported clinical outcomes based on the World Health Organization's Ordinal Scale for Clinical Improvement (OSCI) score for COVID-19, and both showed a significant decrease in OSCI score in the vitamin D intervention groups. Additionally, one study reported a lower rate of intensive care unit (ICU) admission, and one study reported a significant decrease in serum levels of fibrinogen.
Conclusion: Prescribing vitamin D supplementation to patients with COVID-19 infection seems to decrease the mortality rate, the severity of the disease, and serum levels of inflammatory markers. Further studies are needed to determine the ideal type, dosage, and duration of supplementation.
infectious diseases
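The pooled odds ratio reported in the meta-analysis above can be reproduced mechanically with inverse-variance fixed-effect weighting on the log odds-ratio scale. A minimal sketch follows; the per-study 2x2 counts are hypothetical, since the abstract does not report the underlying tables:

```python
import math

def log_or_and_se(a, b, c, d):
    """Log odds ratio and its SE for a 2x2 table:
    a/b = deaths/survivors (treated), c/d = deaths/survivors (control)."""
    lor = math.log((a * d) / (b * c))
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)
    return lor, se

def pooled_or_fixed(tables):
    """Inverse-variance fixed-effect pooled OR with a 95% CI."""
    num = den = 0.0
    for t in tables:
        lor, se = log_or_and_se(*t)
        w = 1.0 / se**2          # inverse-variance weight
        num += w * lor
        den += w
    pooled = num / den
    se_pooled = math.sqrt(1.0 / den)
    lo, hi = pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled
    return math.exp(pooled), (math.exp(lo), math.exp(hi))

# Hypothetical (deaths, survivors) counts for three small trials
studies = [(2, 30, 8, 25), (5, 40, 12, 35), (3, 28, 9, 26)]
or_hat, (ci_lo, ci_hi) = pooled_or_fixed(studies)
print(round(or_hat, 3), round(ci_lo, 3), round(ci_hi, 3))
```

A random-effects model (as is common when heterogeneity is expected across trials) would add a between-study variance term to each weight, but the fixed-effect form above shows the core computation.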
10.1101/2020.12.30.20249051
The social experience of participation in a COVID-19 vaccine trial: Subjects' motivations, others' concerns, and insights for vaccine promotion
Background: Vaccine hesitancy could undermine the effectiveness of COVID-19 vaccination programs. Knowledge about people's lived experiences regarding COVID-19 vaccination can enhance vaccine promotion and increase uptake. Aim: To use COVID-19 vaccine trial participants' experiences to identify key themes in the lived experience of vaccination early in the vaccine approval and distribution process. Methods: We interviewed 31 participants at the Iowa City, Iowa, US site of the Pfizer/BioNTech COVID-19 vaccine phase 3 clinical trial. While trial participation differs from clinical receipt of an approved vaccine in key ways, it offers the first view of people's lived experiences of potentially receiving a COVID-19 vaccine. The trial context is also useful since decision-making about vaccination and medical research participation often involves similar hopes and concerns, and because the public appears to view even approved COVID-19 vaccines as experimental given their novelty. Semi-structured interviews addressed subjects' experiences, including decision-making and telling others about their trial participation. We analyzed verbatim transcripts of these interviews thematically and identified common themes relevant for vaccination decision-making. Results: Participants across demographic groups, including age, sex/gender, race/ethnicity, and political affiliation, described largely similar experiences. Key motivations for participation included ending the pandemic/restoring normalcy, protecting oneself and others, doing one's duty, promoting/modeling vaccination, and expressing aspects of identity like being a helper, career-related motivations, and support of science/vaccines. Participants often felt uniquely qualified to help via trial participation due to personal attributes like health, sex/gender, or race/ethnicity. They reported hearing concerns about side effects and about the speed and politicization of vaccine development.
Participants responded by normalizing and contextualizing side effects, de-politicizing vaccine development, and explaining how the rapid development process was nevertheless safe. Conclusion: These findings regarding participants' reported motivations for trial participation and their interactions with concerned others can be incorporated into COVID-19 vaccine promotion messaging aimed at similar populations.
infectious diseases
10.1101/2021.01.03.20237602
Different selection dynamics of S and RdRp between SARS-CoV-2 genomes with and without the dominant mutations
SARS-CoV-2 is a betacoronavirus responsible for the COVID-19 pandemic that has affected millions of people worldwide, with no dedicated treatment or vaccine currently available. As both pharmaceutical research against SARS-CoV-2 and the most frequently used tests for infection depend on the genomic and peptide sequences of the virus for their efficacy, understanding the mutation rates and content of the virus is critical. Two key proteins for SARS-CoV-2 infection and replication are the S protein, responsible for viral entry into cells, and RdRp, the RNA polymerase responsible for replicating the viral genome. Due to their roles in the viral cycle, these proteins are crucial for the fitness and infectiousness of the virus. Our previous findings showed that the two most frequently observed mutations in the SARS-CoV-2 genome, 14408C>T in the RdRp coding region and 23403A>G in the S gene, are correlated with higher mutation density over time. In this study, we further detail the selection dynamics and mutation rates of SARS-CoV-2 genes, comparing isolates carrying both mutations with isolates carrying neither. We find that the S gene and the RdRp coding region show the highest variance between the genotypes, and that their selection dynamics contrast with each other over time. The S gene displays higher positive selection in mutant isolates early on and undergoes increasing negative selection over time, whereas the RdRp region in the mutant isolates shows strong negative selection throughout the pandemic.
infectious diseases
10.1101/2021.01.03.21249159
Pharmacokinetic modelling to estimate intracellular favipiravir ribofuranosyl-5'-triphosphate exposure to support posology for SARS-CoV-2
Background: The role of favipiravir as a treatment for COVID-19 is unclear, with discrepant activity against SARS-CoV-2 in vitro, concerns about teratogenicity and pill burden, and an unknown optimal dose. In Vero-E6 cells, high concentrations are needed to inhibit SARS-CoV-2 replication. The purpose of this analysis was to use available data to simulate the intracellular pharmacokinetics of favipiravir ribofuranosyl-5'-triphosphate (FAVI-RTP) to better understand its putative applicability as a COVID-19 intervention. Methods: Previously published in vitro data for the intracellular production and elimination of FAVI-RTP in MDCK cells incubated with parent favipiravir were fitted with a mathematical model to describe the time course of intracellular FAVI-RTP concentrations as a function of the incubation concentration of parent favipiravir. Parameter estimates from this model fitting were then combined with a previously published population PK model for the plasma exposure of parent favipiravir in Chinese patients with severe influenza (the modelled free plasma concentration of favipiravir substituting for the in vitro incubation concentration) to predict the human intracellular FAVI-RTP pharmacokinetics. Results: The in vitro FAVI-RTP data were adequately described as a function of in vitro incubation media concentrations of parent favipiravir with an empirical model, noting that the model simplifies and consolidates various processes and is used under various assumptions and within certain limits. Parameter estimates from the fittings to in vitro data predict a flatter dynamic range of peak to trough for intracellular FAVI-RTP when driven by a predicted free plasma concentration profile. Conclusion: This modelling approach has several important limitations that are discussed in the main text of the manuscript.
However, the simulations indicate that despite rapid clearance of the parent drug from plasma, sufficient intracellular FAVI-RTP may be maintained across the dosing interval because of its long intracellular half-life. Population average intracellular FAVI-RTP concentrations are estimated to remain above the Km for the SARS-CoV-2 polymerase for 3 days following 800 mg BID dosing and for 9 days following 1200 mg BID dosing, after a 1600 mg BID loading dose on day 1. Further evaluation of favipiravir as part of antiviral combinations for SARS-CoV-2 is warranted.
infectious diseases
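The mechanism the favipiravir abstract describes — a rapidly cleared parent drug in plasma driving a slowly eliminated intracellular metabolite — can be illustrated with a toy one-compartment model, dC/dt = k_in*Cp(t) - k_out*C, integrated by Euler's method. All rate constants and the plasma profile below are illustrative placeholders, not the fitted values from the study:

```python
import math

# Hypothetical parameters (illustrative only, not the study's estimates)
K_IN = 0.8       # /h, formation of intracellular FAVI-RTP from parent drug
K_OUT = 0.04     # /h, loss of FAVI-RTP (i.e. a long intracellular half-life)
KE_PLASMA = 0.7  # /h, elimination of parent favipiravir from plasma

def plasma_conc(t, dose_interval=12.0, c0=100.0):
    """Free plasma parent concentration: superposition of mono-exponential
    decays from twice-daily dosing (arbitrary units)."""
    c, t_dose = 0.0, 0.0
    while t_dose <= t:
        c += c0 * math.exp(-KE_PLASMA * (t - t_dose))
        t_dose += dose_interval
    return c

def simulate_rtp(t_end=72.0, dt=0.05):
    """Euler integration of dC/dt = K_IN * Cp(t) - K_OUT * C."""
    c, t, trace = 0.0, 0.0, []
    while t < t_end:
        c += dt * (K_IN * plasma_conc(t) - K_OUT * c)
        t += dt
        trace.append((t, c))
    return trace

trace = simulate_rtp()
peak = max(c for _, c in trace)
c_end = trace[-1][1]  # intracellular level at 72 h
print(round(peak, 1), round(c_end, 1))
```

Because K_OUT is small relative to the dosing interval, the intracellular metabolite accumulates and fluctuates far less than the plasma parent — the "flatter dynamic range of peak to trough" the abstract refers to.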
10.1101/2020.12.29.20248991
Prior Bariatric Surgery in COVID-19 Positive Patients May Be Protective
Introduction: Patients infected with the novel COVID-19 virus have a spectrum of illnesses ranging from asymptomatic to death. Data have shown that age, gender, and obesity are strongly correlated with poor outcomes in COVID-19-positive patients. Bariatric surgery is the only treatment that provides significant, sustained weight loss in the severely obese. We examine whether prior bariatric surgery correlates with increased risk of hospitalization and outcome severity after COVID-19 infection. Methods: A cross-sectional retrospective analysis of a COVID-19 database from a single, NYC-based academic institution was conducted. A cohort of COVID-19-positive patients with a history of bariatric surgery (n=124) was matched in a 4:1 ratio to a control cohort of COVID-19-positive patients who were eligible for bariatric surgery (BMI ≥40 kg/m2, or BMI ≥35 kg/m2 with a comorbidity) (n=496). Outcomes, including mechanical ventilation requirement and deceased at discharge, were compared between cohorts using the Chi-square test or Fisher's exact test. Additionally, overall length of stay and duration of time in the ICU were compared using the Wilcoxon rank-sum test. Conditional logistic regression analyses were done to determine both unadjusted (UOR) and adjusted odds ratios (AOR). Results: A total of 620 COVID-19-positive patients were included in this analysis. The bariatric surgeries comprised 36% Roux-en-Y gastric bypass (RYGB, n=45), 35% laparoscopic adjustable gastric banding (LAGB, n=44), and 28% laparoscopic sleeve gastrectomy (LSG, n=35). The body mass index (BMI) for the bariatric group was 36.1 kg/m2 (SD=8.3), significantly lower than that of the control group, 41.4 kg/m2 (SD=6.5) (p<0.0001). There was also less burden of diabetes in the bariatric group (32%) compared to the control group (48%) (p=0.0019).
Patients with a history of bariatric surgery were less likely to be admitted through the emergency room (UOR=0.39, p=0.0001), less likely to require a ventilator during the admission (UOR=0.42, p=0.028), had a shorter length of stay both in the ICU (p=0.033) and overall (UOR=0.44, p=0.0002), and were less likely to be deceased at discharge compared to the control group (UOR=0.42, p=0.028). Conclusion: A history of bariatric surgery significantly decreases the risk of emergency room admission, mechanical ventilation, prolonged ICU stay, and death in patients with COVID-19.
infectious diseases
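Fisher's exact test, used above for the sparser 2x2 comparisons, can be computed from first principles with the hypergeometric distribution: sum the probabilities of every table sharing the observed margins that is no more likely than the observed table. A self-contained sketch, with hypothetical deceased-at-discharge counts (the abstract reports only odds ratios, not raw cells):

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact p-value for the 2x2 table
    [[a, b], [c, d]]: sum hypergeometric probabilities of all tables
    (same margins) no more probable than the observed one."""
    row1, row2 = a + b, c + d
    col1 = a + c
    n = row1 + row2

    def p_table(x):  # probability that cell (1,1) equals x
        return comb(row1, x) * comb(row2, col1 - x) / comb(n, col1)

    p_obs = p_table(a)
    lo = max(0, col1 - row2)   # smallest feasible value of cell (1,1)
    hi = min(row1, col1)       # largest feasible value
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs * (1 + 1e-9))

# Hypothetical counts: deceased / survived, bariatric (124) vs control (496)
p = fisher_exact_two_sided(8, 116, 75, 421)
print(p)
```

The small tolerance in the comparison guards against floating-point ties; library implementations (e.g. `scipy.stats.fisher_exact`) use the same "sum of tables at most as probable" convention for the two-sided p-value.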
10.1101/2021.01.04.21249209
Edoxaban versus warfarin on stroke risk in patients with atrial fibrillation: a territory-wide cohort study
Background: In this territory-wide, observational, propensity score-matched cohort study, we evaluated the development of transient ischaemic attack and ischaemic stroke (TIA/ischaemic stroke) in patients with atrial fibrillation (AF) treated with edoxaban or warfarin. Methods: This was an observational, territory-wide cohort study of patients between January 1st, 2016 and December 31st, 2019 in Hong Kong. The inclusion criteria were i) atrial fibrillation and ii) edoxaban or warfarin prescription. 1:2 propensity score matching was performed between edoxaban and warfarin users. Univariate Cox regression was used to identify significant risk predictors of the primary, secondary, and safety outcomes. Hazard ratios (HRs) with corresponding 95% confidence intervals (CIs) and p values were reported. Results: This cohort included 3464 patients (54.18% male; median baseline age: 72 years, IQR: 63-80, max: 100 years), 664 (19.17%) on edoxaban and 2800 (80.83%) on warfarin. After a median follow-up of 606 days (IQR: 306-1044, max: 1520 days), 91 patients (incidence rate: 2.62%) developed TIA/ischaemic stroke: 1.51% (10/664) in the edoxaban group and 2.89% (81/2800) in the warfarin group. Edoxaban was associated with a lower risk of TIA or ischaemic stroke when compared to warfarin. Conclusions: Edoxaban use was associated with a lower risk of TIA or ischaemic stroke after propensity score matching for demographics, comorbidities, and medication use.
cardiovascular medicine
10.1101/2021.01.04.21249211
Comparative effects of sodium glucose cotransporter 2 (SGLT2) inhibitors and dipeptidyl peptidase-4 (DPP4) inhibitors on new-onset atrial fibrillation and stroke outcomes
Background: SGLT2 inhibitors (SGLT2I) and DPP4 inhibitors (DPP4I) are medications prescribed for type 2 diabetes mellitus patients. However, there are few population-based studies comparing their effects on incident atrial fibrillation (AF) or ischemic stroke. Methods: This was a territory-wide cohort study of type 2 diabetes mellitus patients prescribed SGLT2I or DPP4I between January 1st, 2015 and December 31st, 2019 in Hong Kong. Patients with use of both DPP4I and SGLT2I, and patients with drug discontinuation, were excluded. Patients with prior AF or stroke were excluded from the respective analyses. 1:2 propensity-score matching was conducted for demographics, past comorbidities, and medications using the nearest-neighbour matching method. Cox models were used to identify significant predictors of new onset AF or stroke, cardiovascular mortality, and all-cause mortality. Results: The AF-free cohort included 49108 patients (mean age: 66.48 years [SD: 12.89], 55.32% male) and the stroke-free cohort included 49563 patients (27244 males [54.96%]; mean baseline age: 66.7 years [SD: 12.97, max: 104.6 years]). After propensity score matching, SGLT2I use was associated with a lower risk of new onset AF (HR: 0.43 [0.28, 0.66]), cardiovascular mortality (HR: 0.79 [0.58, 1.09]), and all-cause mortality (HR: 0.69 [0.60, 0.79]) in the AF-free cohort. It was also associated with a lower risk of new onset stroke (HR: 0.46 [0.33, 0.64]), cardiovascular mortality (HR: 0.74 [0.55, 1.00]), and all-cause mortality (HR: 0.64 [0.56, 0.74]) in the stroke-free cohort. Conclusions: The novelty of our work is that it shows, for the first time, that SGLT2 inhibitors are protective against atrial fibrillation and stroke development. These findings should be validated in other cohorts.
cardiovascular medicine
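Several of the cohort studies above rely on 1:2 nearest-neighbour propensity-score matching. The core of that procedure is simple: for each treated subject, greedily take the two unmatched controls with the closest estimated propensity score. A minimal sketch with hypothetical scores (real analyses would first estimate the scores with a logistic model on demographics, comorbidities, and medications, and typically enforce a caliper):

```python
def match_1_to_2(treated, controls):
    """Greedy 1:2 nearest-neighbour matching without replacement.
    treated/controls: lists of (id, propensity_score) pairs."""
    pool = dict(controls)           # remaining unmatched controls
    matches = {}
    for tid, ps in treated:
        # the two controls with the smallest score distance
        nearest = sorted(pool, key=lambda cid: abs(pool[cid] - ps))[:2]
        matches[tid] = nearest
        for cid in nearest:         # matched without replacement
            del pool[cid]
    return matches

# Hypothetical propensity scores
treated = [("t1", 0.31), ("t2", 0.62)]
controls = [("c1", 0.30), ("c2", 0.33), ("c3", 0.58),
            ("c4", 0.66), ("c5", 0.90), ("c6", 0.12)]
print(match_1_to_2(treated, controls))
```

Greedy matching is order-dependent; production implementations (e.g. R's MatchIt) offer optimal matching and calipers to avoid poor matches when the pools are imbalanced.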
10.1101/2021.01.04.21249208
Triglyceride Glucose Index Predicts Risk of Adverse Cardio-metabolic and Mortality Outcomes Among Chinese Adults: A Territory-Wide Longitudinal Study
Background: The triglyceride glucose (TyG) index has been proposed as a surrogate of insulin resistance. In this study, we aimed to examine the relationship between the TyG index and the risk of new onset diabetes mellitus (DM); secondary outcomes included atrial fibrillation (AF), heart failure (HF), acute myocardial infarction (AMI), ventricular tachycardia/fibrillation (VTF), coronary artery disease (CAD), and all-cause mortality. Methods: This retrospective observational study analyzed patients recruited from 1st January 2000 to 31st December 2003 and followed up until 31st December 2019. Demographics, past comorbidities, medications, and laboratory tests were extracted. At baseline and follow-up, DM was defined as 1) HbA1c ≥14 g/dL on any occasion, 2) fasting glucose ≥7 mmol/L on two occasions, or 3) a DM diagnosis. We excluded 1) patients with prior DM or use of antidiabetic medications, and 2) patients with prior AMI/HF/AF or use of diuretics/beta blockers for HF. Univariate analysis and multivariate Cox analysis with adjustment for demographics, past comorbidities, and medications were conducted to identify significant risk predictors of the primary and secondary outcomes. Optimal cutoffs of the TyG index for the primary and secondary outcomes were found with the maximally selected rank statistics approach. Results: A larger TyG index was significantly associated with new onset DM (HR: 1.51, 95% CI: [1.47, 1.55], p<0.0001), new onset HF (HR: 1.27, 95% CI: [1.2, 1.34], p<0.0001), new onset AF (HR: 2.36, 95% CI: [2.26, 2.46], p<0.0001), new onset AMI (HR: 1.51, 95% CI: [1.42, 1.6], p<0.0001), new onset VTF (HR: 1.22, 95% CI: [1.13, 1.31], p<0.0001), new onset CAD (HR: 1.56, 95% CI: [1.45, 1.69], p<0.0001), and all-cause mortality (HR: 1.21, 95% CI: [1.18, 1.25], p<0.0001).
The TyG index and its 3rd tertile remained significant after adjustment for significant demographics, past comorbidities, and medications in multivariate Cox models (HR>1, p<0.05). Optimal cut-off values of the baseline TyG index and adjusted multivariate restricted cubic spline models further uncovered detailed associations of a larger baseline TyG index with the primary and secondary outcomes. Conclusion: A higher TyG index remained significantly associated with elevated risk of new onset DM, AF, HF, AMI, VTF, CAD, and all-cause mortality after adjustment for demographics, past comorbidities, and medications.
cardiovascular medicine
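The TyG index referenced above has a standard closed form: the natural log of fasting triglycerides (mg/dL) times fasting glucose (mg/dL), divided by 2. A one-liner suffices (the example values are for a hypothetical patient, not drawn from the study):

```python
import math

def tyg_index(triglycerides_mg_dl, fasting_glucose_mg_dl):
    """Triglyceride-glucose index: ln(fasting TG [mg/dL] x fasting
    glucose [mg/dL] / 2), the usual surrogate marker of insulin
    resistance. Inputs must be in mg/dL, not mmol/L."""
    return math.log(triglycerides_mg_dl * fasting_glucose_mg_dl / 2)

# Hypothetical patient: TG 150 mg/dL, fasting glucose 100 mg/dL
print(round(tyg_index(150, 100), 2))  # -> 8.92
```

Note the unit dependence: labs reporting mmol/L must be converted (glucose x 18.02, triglycerides x 88.57) before applying the formula, or the index is not comparable across cohorts.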
10.1101/2021.01.04.21249214
Paediatric/young versus adult patients with long QT syndrome or catecholaminergic polymorphic ventricular tachycardia
Introduction: Catecholaminergic polymorphic ventricular tachycardia (CPVT) is a rare cardiac ion channelopathy. The aim of this study is to examine the genetic basis of, and identify predictive factors for, arrhythmic outcomes in CPVT patients from Hong Kong. Methods: This was a territory-wide retrospective cohort study of consecutive patients diagnosed with CPVT at public hospitals or clinics in Hong Kong. The primary outcome was spontaneous ventricular tachycardia/ventricular fibrillation (VT/VF). Results: A total of 16 patients (mean presentation age = 11±4 years) were included. All patients presented at or before 19 years of age. Fifteen patients (93.8%) were initially symptomatic. Ten patients had both premature ventricular complexes (PVCs) and VT/VF, whereas one patient had PVCs without VT/VF. Genetic tests were performed in 14 patients (87.5%). Eight (57.1%) tested positive for a RyR2 variant. Seven variants have been described elsewhere (c.14848G>A, c.12475C>A, c.7420A>G, c.11836G>A, c.14159T>C, c.10046C>T and c.7202G>A); c.14861C>G is a novel RyR2 variant that has not been reported outside this cohort. All patients were treated with beta-blockers; three patients received amiodarone and two received verapamil. Sympathectomy (n=8), ablation (n=1), and implantable cardioverter-defibrillator implantation (n=3) were performed. Over a median follow-up of 127 months (IQR: 97-143), six patients suffered incident VT/VF. No significant predictors were identified on Cox regression. Nevertheless, a random survival forest model identified initial VT/VF/sudden cardiac death, palpitations, QTc, being initially symptomatic, and heart rate as important variables for estimating the probability of developing incident VT/VF. Conclusion: All CPVT patients from Hong Kong presented at or before 19 years of age. Clinical and electrocardiographic findings can be used to predict arrhythmic outcomes.
A nonparametric machine learning survival analysis achieved high accuracy for predicting the probability of incident VT/VF.
cardiovascular medicine
10.1101/2021.01.04.21249214
Arrhythmic Outcomes in Catecholaminergic Polymorphic Ventricular Tachycardia
Introduction: Catecholaminergic polymorphic ventricular tachycardia (CPVT) is a rare cardiac ion channelopathy. The aim of this study is to examine the genetic basis of, and identify predictive factors for, arrhythmic outcomes in CPVT patients from Hong Kong. Methods: This was a territory-wide retrospective cohort study of consecutive patients diagnosed with CPVT at public hospitals or clinics in Hong Kong. The primary outcome was spontaneous ventricular tachycardia/ventricular fibrillation (VT/VF). Results: A total of 16 patients (mean presentation age = 11±4 years) were included. All patients presented at or before 19 years of age. Fifteen patients (93.8%) were initially symptomatic. Ten patients had both premature ventricular complexes (PVCs) and VT/VF, whereas one patient had PVCs without VT/VF. Genetic tests were performed in 14 patients (87.5%). Eight (57.1%) tested positive for a RyR2 variant. Seven variants have been described elsewhere (c.14848G>A, c.12475C>A, c.7420A>G, c.11836G>A, c.14159T>C, c.10046C>T and c.7202G>A); c.14861C>G is a novel RyR2 variant that has not been reported outside this cohort. All patients were treated with beta-blockers; three patients received amiodarone and two received verapamil. Sympathectomy (n=8), ablation (n=1), and implantable cardioverter-defibrillator implantation (n=3) were performed. Over a median follow-up of 127 months (IQR: 97-143), six patients suffered incident VT/VF. No significant predictors were identified on Cox regression. Nevertheless, a random survival forest model identified initial VT/VF/sudden cardiac death, palpitations, QTc, being initially symptomatic, and heart rate as important variables for estimating the probability of developing incident VT/VF. Conclusion: All CPVT patients from Hong Kong presented at or before 19 years of age. Clinical and electrocardiographic findings can be used to predict arrhythmic outcomes.
A nonparametric machine learning survival analysis achieved high accuracy for predicting the probability of incident VT/VF.
cardiovascular medicine
10.1101/2021.01.04.21249215
Predicting Stroke and Mortality in Mitral Regurgitation: A Gradient Boosting Approach
Introduction: We hypothesized that an interpretable gradient boosting machine (GBM) model considering comorbidities, P-wave, and echocardiographic measurements can better predict mortality and cerebrovascular events in mitral regurgitation (MR). Methods: Patients from a tertiary center were analyzed. The GBM model was used as an interpretable statistical approach to identify the leading indicators of high-risk patients for either outcome of cerebrovascular accidents (CVAs) and all-cause mortality. Results: A total of 706 patients were included. GBM analysis showed that age, systolic blood pressure, diastolic blood pressure, plasma albumin levels, mean P-wave duration (PWD), MR regurgitant volume, left ventricular ejection fraction (LVEF), left atrial dimension at end-systole (LADs), velocity-time integral (VTI), and effective regurgitant orifice were significant predictors of TIA/stroke. Age, sodium, urea, and albumin levels, platelet count, mean PWD, LVEF, LADs, left ventricular dimension at end-systole (LVDs), and VTI were significant predictors of all-cause mortality. The GBM demonstrated the best predictive performance in terms of precision, sensitivity, c-statistic, and F1-score compared to logistic regression, decision tree, random forest, support vector machine, and artificial neural network models. Conclusion: A gradient boosting model incorporating clinical data from different investigative modalities significantly improves risk prediction performance and identifies key indicators for outcome prediction in MR.
cardiovascular medicine
10.1101/2021.01.03.21249182
COVID-HEART: Development and Validation of a Multi-Variable Model for Real-Time Prediction of Cardiovascular Complications in Hospitalized Patients with COVID-19
Cardiovascular (CV) manifestations of COVID-19 infection carry significant morbidity and mortality. Current risk prediction for CV complications in COVID-19 is limited, and existing approaches fail to account for the dynamic course of the disease. Here, we develop and validate the COVID-HEART predictor, a novel continuously updating risk prediction technology to forecast CV complications in hospitalized patients with COVID-19. The risk predictor is trained and tested with retrospective registry data from 2178 patients to predict two outcomes: cardiac arrest and imaging-confirmed thromboembolic events. By repeating model validation many times, we show that it predicts cardiac arrest with an average median early warning time of 18 hours (IQR: 13-20 hours) and an AUROC of 0.92 (95% CI: 0.91-0.92), and thromboembolic events with a median early warning time of 72 hours (IQR: 12-204 hours) and an AUROC of 0.70 (95% CI: 0.67-0.73). The COVID-HEART predictor is anticipated to provide tangible clinical decision support in triaging patients and optimizing resource utilization, with its clinical utility potentially extending well beyond COVID-19.
cardiovascular medicine
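The AUROC values quoted for COVID-HEART (and the c-statistics in the GBM study above) have a direct probabilistic reading: the chance that a randomly chosen positive case receives a higher risk score than a randomly chosen negative case. That equivalence to the Mann-Whitney U statistic gives a tiny reference implementation; the scores below are hypothetical:

```python
def auroc(scores_pos, scores_neg):
    """AUROC as the Mann-Whitney probability that a random positive
    case scores higher than a random negative one (ties count 1/2)."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical risk scores for patients with / without the outcome
pos = [0.9, 0.8, 0.75, 0.4]
neg = [0.7, 0.5, 0.3, 0.2, 0.1]
print(auroc(pos, neg))  # -> 0.9
```

The O(n*m) pairwise loop is fine for illustration; production code ranks the pooled scores instead (O(n log n)), which is what `sklearn.metrics.roc_auc_score` does.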
10.1101/2020.12.31.20249099
Optimizing vaccine allocation for COVID-19 vaccines: critical role of single-dose vaccination.
Most COVID-19 vaccines require two doses; however, with limited vaccine supply, policymakers are considering single-dose vaccination as an alternative strategy. Using a mathematical model combined with optimization algorithms, we determined optimal allocation strategies with one and two doses of vaccine under various degrees of viral transmission. Under low transmission, we show that the optimal allocation of vaccine vitally depends on the single-dose efficacy (SDE). With high SDE, single-dose vaccination is optimal, preventing up to 22% more deaths than a strategy prioritizing two-dose vaccination for older adults. With low or moderate SDE, mixed vaccination campaigns with complete coverage of older adults are optimal. However, with modest or high transmission, vaccinating older adults first with two doses is best, preventing up to 41% more deaths than single-dose vaccination given across all adult populations. Our work suggests that it is imperative to determine the efficacy and durability of single-dose vaccines, as mixed or single-dose vaccination campaigns may have the potential to contain the pandemic much more quickly.
epidemiology
10.1101/2020.12.31.20249099
Optimizing vaccine allocation for COVID-19 vaccines: potential role of single-dose vaccination.
Most COVID-19 vaccines require two doses; however, with limited vaccine supply, policymakers are considering single-dose vaccination as an alternative strategy. Using a mathematical model combined with optimization algorithms, we determined optimal allocation strategies with one and two doses of vaccine under various degrees of viral transmission. Under low transmission, we show that the optimal allocation of vaccine vitally depends on the single-dose efficacy (SDE). With high SDE, single-dose vaccination is optimal, preventing up to 22% more deaths than a strategy prioritizing two-dose vaccination for older adults. With low or moderate SDE, mixed vaccination campaigns with complete coverage of older adults are optimal. However, with modest or high transmission, vaccinating older adults first with two doses is best, preventing up to 41% more deaths than single-dose vaccination given across all adult populations. Our work suggests that it is imperative to determine the efficacy and durability of single-dose vaccines, as mixed or single-dose vaccination campaigns may have the potential to contain the pandemic much more quickly.
epidemiology
10.1101/2021.01.02.21249125
Similarities and differences of the COVID-19 second surge in Europe and the Northeast United States
Assessing a potential resurgence of an epidemic outbreak with certainty is as important as it is challenging. The low number of infectious individuals after a long regression, and the randomness associated with it, make it difficult to ascertain whether the infectious population is growing or just fluctuating. We have developed an approach to compute confidence intervals for the switching time from decay to growth and to compute the corresponding supra-location aggregated quantities to increase the precision of the determination. We estimated the aggregate prevalence over time for Europe and the Northeast United States to characterize the COVID-19 second surge in these regions during 2020. We find a starting date as early as July 3 (95% Confidence Interval (CI): July 1 to July 6) for Europe and August 19 (95% CI: August 16 to August 23) for the Northeast; subsequent infectious populations that, as of December 31, have always increased or remained stagnant; and resurgences that are the collective effect of each overall region, with no single location dominating the regional dynamics by itself.
epidemiology
10.1101/2021.01.02.21249125
Assessment of the similarities and differences of the COVID-19 second surges in Europe and the Northeast United States
Assessing a potential resurgence of an epidemic outbreak with certainty is as important as it is challenging. The low number of infectious individuals after a long regression, and the randomness associated with it, make it difficult to ascertain whether the infectious population is growing or just fluctuating. We have developed an approach to compute confidence intervals for the switching time from decay to growth and to compute the corresponding supra-location aggregated quantities to increase the precision of the determination. We estimated the aggregate prevalence over time for Europe and the Northeast United States to characterize the COVID-19 second surge in these regions during 2020. We find a starting date as early as July 3 (95% Confidence Interval (CI): July 1 to July 6) for Europe and August 19 (95% CI: August 16 to August 23) for the Northeast; subsequent infectious populations that, as of December 31, have always increased or remained stagnant; and resurgences that are the collective effect of each overall region, with no single location dominating the regional dynamics by itself.
epidemiology
10.1101/2021.01.02.21249125
Ascertaining the initiation of epidemic resurgences: an application to the COVID-19 second surges in Europe and the Northeast United States
Assessing a potential resurgence of an epidemic outbreak with certainty is as important as it is challenging. The low number of infectious individuals after a long regression, and the randomness associated with it, make it difficult to ascertain whether the infectious population is growing or just fluctuating. We have developed an approach to compute confidence intervals for the switching time from decay to growth and to compute the corresponding supra-location aggregated quantities to increase the precision of the determination. We estimated the aggregate prevalence over time for Europe and the Northeast United States to characterize the COVID-19 second surge in these regions during 2020. We find a starting date as early as July 3 (95% Confidence Interval (CI): July 1 to July 6) for Europe and August 19 (95% CI: August 16 to August 23) for the Northeast; subsequent infectious populations that, as of December 31, have always increased or remained stagnant; and resurgences that are the collective effect of each overall region, with no single location dominating the regional dynamics by itself.
epidemiology
10.1101/2021.01.02.20248596
COVID-19 immunization threshold(s): an analysis
As COVID-19 vaccine research efforts seem to be yielding the first tangible results, the proportion of individuals needed to reap the benefits of herd immunity is a key element from a Public Health programs perspective. This magnitude, termed the critical immunization threshold (q), can be obtained from the classical SIR model equilibrium equation, equaling (1 - 1/R0)/ε, where R0 is the basic reproduction number and ε is the vaccine efficacy. When a significant proportion of the population is already immune, this becomes (n - 1/R0)/ε, where n is the proportion of non-immune individuals. A similar equation can be obtained for short-term immunization thresholds (qt), which are dependent on Rt. Values of q for most countries are between 60-75% of the population. Current qt values for most countries are between 20-40%. Therefore, the combination of gradual vaccination and other non-pharmaceutical interventions will mark the transition to herd immunity, provided that the latter turns out to be a feasible objective. Nevertheless, immunization through vaccination is a complex issue and many challenges might appear.
epidemiology
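The threshold formulas quoted in the abstract above are simple enough to evaluate directly. As an illustrative sketch only (this is not code from the paper; the function names and example parameter values are assumptions for demonstration):

```python
# Illustrative sketch of the immunization-threshold formulas from the
# abstract above (not the authors' code). q = (1 - 1/R0) / epsilon, and,
# when part of the population is already immune, q = (n - 1/R0) / epsilon,
# where n is the proportion of non-immune individuals.

def critical_threshold(r0: float, efficacy: float) -> float:
    """Classical critical immunization threshold from the SIR equilibrium."""
    return (1 - 1 / r0) / efficacy

def threshold_with_immunity(r0: float, efficacy: float, non_immune: float) -> float:
    """Threshold when a proportion of the population is already immune."""
    return (non_immune - 1 / r0) / efficacy

# Hypothetical example values: R0 = 2.5, vaccine efficacy 95%.
print(round(critical_threshold(2.5, 0.95), 3))  # about 0.632 of the population
```

With R0 = 2.5 and 95% efficacy this gives roughly 63% of the population, consistent with the 60-75% range the abstract reports for most countries.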
10.1101/2020.12.31.20249105
Prediction of the COVID-19 Pandemic for the Top Ten Countries in the World by Establishing a Hybrid AARNN LTM Model
The novel COVID-19 global pandemic has become a public health emergency of international concern, affecting 215 countries and territories around the globe. As of 28 November 2020, it has caused a pandemic outbreak with a total of more than 61,715,119 confirmed infections and more than 1,444,235 confirmed deaths reported worldwide. The main focus of this paper is to generate LTM real-time out-of-sample forecasts of future COVID-19 confirmed and death cases for the top ten profoundly affected countries and for the world. To solve this problem, we introduce a novel hybrid AARNN model, based on the ARIMA and ARNN forecasting models, that can generate LTM (fifty days ahead) out-of-sample forecasts of the number of daily confirmed and death COVID-19 cases for ten countries, namely the USA, India, Brazil, Russia, France, Spain, the UK, Italy, Argentina, and Colombia, as well as for the world. The predictions of the future outbreak for different countries will be useful for the effective allocation of health care resources and will act as an early-warning system for health warriors, corporate leaders, economists, government/public-policy makers, and scientific experts.
epidemiology
10.1101/2021.01.03.21249183
Value of radiomics features from adrenal gland and periadrenal fat CT images in predicting COVID-19 progression
Background: The value of radiomics features from adrenal gland and periadrenal fat CT images for predicting disease progression in patients with COVID-19 has not been studied. Methods: A total of 1,245 patients (685 moderate and 560 severe patients) were enrolled in a retrospective study. We proposed a 3D V-Net to segment adrenal glands in onset CT images automatically, and periadrenal fat was obtained using an inflation operation around the adrenal gland. Next, after radiomics features were extracted, we built a clinical model (CM), three radiomics models (adrenal gland model [AM], periadrenal fat model [PM], and fusion of adrenal gland and periadrenal fat model [FM]), and a radiomics nomogram (RN) to predict disease progression in patients with COVID-19. Results: The auto-segmentation framework yielded a Dice value of 0.79 in the training set. CM, AM, PM, FM, and RN obtained AUCs of 0.712, 0.692, 0.763, 0.791, and 0.806, respectively, in the training set. FM and RN had better predictive efficacy than CM (P < 0.0001) in the training set. RN showed no significant difference between predicted and actual results in the validation set (mean absolute error [MAE] = 0.04) or the test set (MAE = 0.075). Decision curve analysis showed that if the threshold probability was more than 0.3 in the validation set, or between 0.4 and 0.8 in the test set, more net benefit could be gained using RN than FM or CM. Conclusion: Radiomics features extracted from adrenal gland and periadrenal fat CT images may predict progression in patients with COVID-19. Funding: This study was funded by the Science and Technology Foundation of Guizhou Province (QKHZC [2020]4Y002, QKHPTRC [2019]5803), the Guiyang Science and Technology Project (ZKXM [2020]4), the Guizhou Science and Technology Department Key Lab. Project (QKF [2017]25), the Beijing Medical and Health Foundation (YWJKJJHKYJJ-B20261CS), and the special fund for basic research operating expenses of public welfare research institutes at the central level from the Chinese Academy of Medical Sciences (2019PT320003).
radiology and imaging
10.1101/2021.01.04.20248473
Early prediction of renal graft function: Analysis of a multi-centre, multi-level data set
Introduction: Long-term graft survival rates after renal transplantation are still moderate. We aimed to build an early predictor of an established long-term outcome marker, the estimated glomerular filtration rate (eGFR) one year post-transplant (eGFR-1y). Materials and Methods: A large cohort of 376 patients was characterized for a multi-level biomarker panel including gene expression, cytokines, metabolomics, and antibody reactivity profiles. Almost one thousand samples from the pre-transplant and early post-transplant period were analysed. Machine learning-based predictors were built employing stacked generalization. Results: Pre-transplant data led to a prediction achieving a Pearson's correlation coefficient of r=0.39 between measured and predicted eGFR-1y. Two weeks post-transplant, the correlation improved to r=0.63, and at the third month, to r=0.76. eGFR values were remarkably stable throughout the first year post-transplant and were the best estimators of eGFR-1y already two weeks post-transplant. Several markers were associated with eGFR: the cytokine stem cell factor demonstrated a strong negative correlation, and a subset of 19 NMR bins of the urine metabolome data was shown to have potential applications in non-invasive eGFR monitoring. Importantly, we identified the expression of the genes TMEM176B and HMMR as potential prognostic markers for changes in the eGFR. Discussion: Our multi-centre, multi-level data set represents a milestone in the efforts to predict transplant outcome. While an acceptable predictive capacity was achieved, we are still very far from predicting changes in the eGFR precisely. Further studies employing additional marker panels are needed in order to establish predictors of eGFR-1y for clinical application; herein, gene expression markers seem to hold the most promise.
transplantation
10.1101/2021.01.04.21249195
The incidence, characteristics and outcomes of pregnant women hospitalized with symptomatic and asymptomatic SARS-CoV-2 infection in the UK from March to September 2020: a national cohort study using the UK Obstetric Surveillance System (UKOSS)
Background: Evidence on risk factors, incidence and impact of SARS-CoV-2 infection in pregnant mothers and their babies has rapidly expanded, but there is a lack of population-level data to inform accurate incidence rates and unbiased descriptions of characteristics and outcomes. The primary aim of this study was to describe the incidence, characteristics and outcomes of hospitalized pregnant women with symptomatic and asymptomatic SARS-CoV-2 in the UK compared to pregnant women without SARS-CoV-2, in order to inform future clinical guidance and management. Methods and Findings: We conducted a national, prospective cohort study of all hospitalized pregnant women with confirmed SARS-CoV-2 from 1st March 2020 to 31st August 2020 using the UK Obstetric Surveillance System (UKOSS) across all 194 hospitals in the UK with a consultant-led maternity unit. Incidence was estimated using the latest national maternity data. Overall, 1148 hospitalized women had confirmed SARS-CoV-2 in pregnancy, 63% of whom were symptomatic. Therefore, the estimated incidence of hospitalization with symptomatic SARS-CoV-2 was 2.0 per 1000 maternities (95% CI 1.9-2.2) and with asymptomatic SARS-CoV-2 was 1.2 per 1000 maternities (95% CI 1.1-1.4). Compared to pregnant women without SARS-CoV-2, women hospitalized with symptomatic SARS-CoV-2 were more likely to be overweight or obese (adjusted OR 1.86, 95% CI 1.39-2.48 and aOR 2.07, 95% CI 1.53-2.29 respectively), to be of Black, Asian or Other minority ethnic group (aOR 6.24, 95% CI 3.93-9.90, aOR 4.36, 95% CI 3.19-5.95 and aOR 12.95, 95% CI 4.93-34.01 respectively), and to have a relevant medical comorbidity (aOR 1.83, 95% CI 1.32-2.54). Compared to pregnant women without SARS-CoV-2, hospitalized pregnant women with symptomatic SARS-CoV-2 were more likely to be admitted to intensive care (aOR 57.67, 95% CI 7.80-426.70), but the absolute risk of poor outcomes was low. Cesarean births and neonatal unit admission were increased regardless of symptom status (symptomatic aOR 2.60, 95% CI 1.97-3.42 and aOR 3.08, 95% CI 1.99-4.77 respectively; asymptomatic aOR 2.02, 95% CI 1.52-2.70 and aOR 1.84, 95% CI 1.12-3.03 respectively). Iatrogenic preterm births were more common in women with symptomatic SARS-CoV-2 (aOR 11.43, 95% CI 5.07-25.75). The risks of stillbirth or neonatal death were not significantly increased regardless of symptom status, but numbers were small. The limitations of this study include the restriction to women hospitalized with SARS-CoV-2, who may by nature of their admission have been at greater risk of adverse outcome. Conclusions: We have identified factors that increase the risk of symptomatic and asymptomatic SARS-CoV-2 in pregnancy. The increased risks of cesarean and iatrogenic preterm birth provide clear evidence of the indirect impact of SARS-CoV-2 on mothers and maternity care in high-income settings. Clinicians can be reassured that the majority of women do not experience severe complications of SARS-CoV-2 in pregnancy and that women with mild disease can be discharged to continue their pregnancy safely.
obstetrics and gynecology
10.1101/2021.01.04.21249206
Socioeconomic and gender disparities in tobacco smoking among Jamaican adults from a national health survey.
Objectives: Little is known of socioeconomic and gender disparities in tobacco use in the Caribbean. We evaluated education and occupation disparities in tobacco smoking prevalence in Jamaica. Methods: Data on tobacco smoking, educational attainment and usual occupation in adults 25-74 years from a national survey collected between 2007 and 2008 were analyzed. Using post-stratification survey weights, Poisson regression models estimated sex-specific, age-adjusted prevalence estimates, prevalence differences and prevalence ratios. Results: Analyses included 2299 participants (696 men, 1603 women), mean age 43 years. Current smoking prevalence was 26% in men and 8% in women (p<0.001). Among men, the age-adjusted prevalence of current smoking was highest in the primary education group (36.5%) and lowest in the post-secondary education group (10.2%) (p=0.003). Among women, prevalence was highest in the junior secondary education group (10.2%) and lowest in the primary education group (4.7%) (p=0.014). Among men, for education, age-adjusted prevalence ratios for current smoking ranged from 2.6 to 3.6 using post-secondary education as the reference category (p<0.05). For occupation, age-adjusted prevalence ratios ranged from 1.7 to 4.1 using professionals and managers as the reference category. Among women, using the same reference categories, age-adjusted prevalence ratios ranged from 1.4 to 2.2 for education and from 0.6 to 2.2 for occupation; neither was statistically significant. Conclusion: In Jamaica, there are socioeconomic disparities in current tobacco smoking among men, where it is inversely associated with educational attainment and occupation, but the pattern among women is less clear. These findings suggest interventions to reduce smoking should consider these disparities.
public and global health
10.1101/2021.01.04.21249226
Family physicians supporting patients with palliative care needs within the Patient Medical Home in the community: An Appreciative Inquiry qualitative study
Objectives: Canadians want to live and die in their home communities. Unfortunately, Canada has the highest proportion of deaths in acute care facilities compared to other developed nations. This study aims to identify the essential components required to best support patients and families with palliative care needs in their communities, to inform system changes and empower family physicians (FPs) in providing community-based palliative care for patients. Design: Appreciative Inquiry (AI) methodology with individual interviews. Interview transcripts were analyzed iteratively for emerging themes and used to develop "possibility statements" to frame discussion in subsequent focus groups. A conceptual framework emerged to describe the "destiny" state as per AI methods. Setting: Family physicians, palliative home care providers, patients and bereaved caregivers were recruited in the urban and surrounding rural health authority zones of Calgary, Canada. Participants: 9 female and 9 male FPs (range of practice years 2-42) in interviews; 8 bereaved caregivers, 1 patient, and 26 palliative home care team members in focus groups. Interviews and focus groups were recorded digitally and transcribed with consent. Results: The identified themes that transcended all three groups created the foundation for the conceptual framework. Enhanced communication and the fostering of team relationships between all care providers, with the focus on the patient and caregivers, was the cornerstone concept. The family physician/patient relationship must be protected and encouraged by all care providers, while more system flexibility is needed to respond more effectively to patients. These concepts must exist in a context where patients and caregivers receive more education regarding the benefits of palliative care, while public discourse about mortality increases. Conclusions: Key areas were identified for how the patient's team can work together effectively to improve the patient and caregiver palliative care journey in the community, with the cornerstone element of building on the trusting, longitudinal FP-patient relationship. Strengths and limitations of this study: This study uses Appreciative Inquiry qualitative methods to identify the essential components required to best support patients and families with palliative care needs in their communities. Multiple perspectives were recruited, including community-based family physicians, palliative home care clinicians, patients, and bereaved caregivers. Analysis was focused on the family physicians' interviews to derive "possibility statements" used to frame the focus groups with the other groups of participants. Generalisability may be limited due to the lack of diversity of the patients and bereaved caregivers recruited in terms of ethnicity, age, and gender. Patients and caregivers may have been reluctant to volunteer for this study as it involved discussing palliative care.
palliative medicine
10.1101/2021.01.05.21249252
Postoperative acute kidney injury in abdominal surgeries: a retrospective analysis from a single center in western India.
Aim: The aim of our study was to evaluate the incidence of and causative factors for acute kidney injury in abdominal surgeries. Material and Methods: All abdominal surgeries performed between April 2018 and December 2020 at our institution were analyzed for acute kidney injury, defined according to the Acute Kidney Injury Network classification. Categorical variables were evaluated by the chi-square test or Fisher's exact test, wherever appropriate, and continuous variables by the Mann-Whitney U test for nonparametric data and Student's t-test for parametric data, after skewness and kurtosis analysis. Statistical analysis was done using SPSS version 23. P<0.05 was considered statistically significant. Results: We performed 402 gastrointestinal and hepatobiliary surgeries from April 2018 to December 2020. After exclusions, 372 patients were included in the study population. Twenty patients (5.37%) were defined as having acute kidney injury according to the Acute Kidney Injury Network classification. On univariate analysis, acute kidney injury was associated with open surgery (p=0.003), intraoperative hypotension (p<0.001), colorectal surgeries (p<0.0001), emergency surgery (p=0.028), CDC grade of surgery (p<0.001), increased use of blood products (p=0.001), higher ASA grade (p<0.0001), and increased operative time (p<0.0001). On multivariate logistic regression analysis, higher ASA grade (p<0.0001) and increased operative time (p=0.049) independently predicted acute kidney injury. Acute kidney injury was also significantly associated with 90-day mortality (p<0.0001). Conclusion: Postoperative acute kidney injury was associated with significant mortality in abdominal surgery. Higher ASA grade and increased operative time predicted acute kidney injury.
surgery
10.1101/2021.01.04.21249202
Brain activity is contingent on neuropsychological function in an fMRI study of Verbal Working Memory in Amyotrophic Lateral Sclerosis
Amyotrophic lateral sclerosis (ALS) is a devastating neurodegenerative disease that causes progressive degeneration of neurons in motor and non-motor regions, affecting multiple cognitive domains. To contribute to the growing research field that employs structural and functional neuroimaging to investigate the effect of ALS on different working memory components, we conducted a functional magnetic resonance imaging (fMRI) study exploring the localization and intensity of alterations in neural activity. In the first study to specifically address verbal working memory via fMRI in the context of ALS, we employed the verbal n-back task with 0-back and 2-back conditions. Although ALS patients showed unimpaired accuracies (p = 0.724) and reaction times (p = 0.0785), there was significantly increased brain activity in frontotemporal and parietal regions in the 2-back minus 0-back contrast in patients compared to controls, using nonparametric statistics with 5000 permutations and a T-threshold of 2.5. This increased brain activity during working memory performance was largely associated with better neuropsychological function within the ALS group, suggesting a compensatory effect. This study therefore adds to the current knowledge on neural correlates of working memory in ALS and contributes to a more nuanced understanding of hyperactivity during cognitive processes in fMRI studies of ALS.
neurology
10.1101/2021.01.04.21249246
Effect of Female Age on Infertility Diagnostic Factors and In-vitro Fertilization Treatment Outcomes: A Single-center Retrospective Cohort Study
Background: Infertility has become an important issue in the modern world because of the increasing number of infertile couples. Advanced maternal age is considered a main factor in infertility. With the second-child policy introduced in China and more women in China intending to seek help for infertility, this study provides information on fertility diagnostic factors and IVF treatment outcomes of female IVF patients in different age groups, which can guide infertility diagnosis and treatment. Methods: A retrospective cohort study was conducted on IVF patient data collected by Jinjiang District Maternal and Child Health Hospital, Chengdu, China. Clinical and laboratory data of 45,743 fresh, autologous IVF cycles from January 2008 to September 2017 were included in the analysis. The diagnostic factors and treatment outcomes were analyzed for different age groups (age <35, n=30,708; age 35-41, n=11,921; and age ≥42, n=3,114) as well as further stratified advanced age groups (age 42, n=933; age 43, n=744; age 44, n=556; age ≥45, n=881). The characteristics, including number of previous cycles, duration of infertility, BMI, basal FSH, basal AFC, AMH, retrieved oocyte number, fertilized oocyte number, transferred embryo number, baby number and economic cost, were stratified according to patient age. Results: The obesity rate, basal FSH and cancellation rate of IVF cycles in the ≥42 years old group were significantly higher than in the other groups (p<0.01). Basal AFC, AMH, retrieved oocyte number, fertilized oocyte number and transferred embryo number in the ≥42 years old group were significantly lower than in the other groups (p<0.01). Diagnostic characteristics and IVF-ET outcomes declined significantly as maternal age increased (p<0.05). Meanwhile, a preliminary analysis of cost per cycle showed that the cost per child increased with patient age. Conclusion: Despite the increasing number of IVF cycles at advanced maternal age, the ≥42 years age group tended to have unsatisfactory outcomes and a higher cost per child. More guidance and consideration should be directed toward encouraging treatment of infertility at an earlier age. Plain English summary: As more and more women worldwide delay pregnancy because of changes in society and the economy, age has become an unavoidable factor in infertility diagnosis and treatment. Advanced-age women may experience more infertility problems and negative IVF outcomes. A better understanding of the effect of maternal age on infertility would help both the diagnosis and treatment of IVF patients. This study conducted a single-center retrospective cohort analysis of female patients in different age groups and found that women of more advanced age (≥42) were more likely to experience unsatisfactory IVF outcomes and a higher economic cost to obtain one child. After the introduction of the second-child policy in China in 2013, the number of advanced-age patients increased, and the need for special guidance for advanced maternal age (AMA) patients may need to be taken into consideration.
obstetrics and gynecology
10.1101/2021.01.04.21249213
Implementing genomic screening in diverse populations
BackgroundPopulation-based genomic screening has the predicted ability to reduce morbidity and mortality associated with medically actionable conditions. However, much research is needed to develop standards for genomic screening, and to understand the perspectives of people offered this new testing modality. This is particularly true for non-European ancestry populations who are vastly underrepresented in genomic medicine research. Therefore, we implemented a pilot genomic screening program in the BioMe Biobank in New York City, where the majority of participants are of non-European ancestry. MethodsWe initiated genomic screening for well-established genes associated with hereditary breast and ovarian cancer syndrome (HBOC), Lynch syndrome (LS), and familial hypercholesterolemia (FH). We evaluated and included an additional gene (TTR) associated with hereditary transthyretin amyloidosis (hATTR), which has a common founder variant in African ancestry populations. We evaluated the characteristics of 74 participants who received results associated with these conditions. We also assessed the preferences of 7,461 newly enrolled BioMe participants to receive genomic results. ResultsIn the pilot genomic screening program, 74 consented participants received results related to HBOC (N=26), LS (N=6), FH (N=8), and hATTR (N=34). Thirty-three of 34 (97.1%) participants who received a result related to hATTR were self-reported African/African American (AA) or Hispanic/Latinx (HL), compared to 14 of 40 (35.0%) participants who received a result related to HBOC, LS, or FH. Among 7,461 participants enrolled after the BioMe protocol modification to allow the return of genomic results, 93.4% indicated that they would want to receive results. Younger participants, women, and HL participants were more likely to opt to receive results. 
Conclusions: The addition of TTR to a pilot genomic screening program meant that we returned results to a higher proportion of AA and HL participants, in comparison with genes traditionally included in genomic screening programs in the U.S. We found that the majority of participants in a multi-ethnic biobank are interested in receiving genomic results for medically actionable conditions. These findings increase knowledge about the perspectives of diverse research participants on receiving genomic results, and inform the broader implementation of genomic medicine in underrepresented patient populations.
genetic and genomic medicine
10.1101/2021.01.04.21249241
The Effect of Frames on COVID-19 Vaccine Hesitancy
In order to control the spread of infectious diseases such as COVID-19, it will be important to develop a communication strategy to counteract "vaccine hesitancy". This paper reports the results of a survey experiment testing the impacts of several types of message content: the safety and efficacy of the vaccine itself, the likelihood that others will take the vaccine, and the possible role of politics in promoting the vaccine. In an original survey of 1,123 American M-Turk respondents, we provided six different information conditions, suggesting: the safety and efficacy of the vaccine; the lack of safety/efficacy of the vaccine; that most others would take the vaccine; that most others would not take the vaccine; that the vaccine is being promoted to gain greater control over individual freedom; and that it is being rushed for political motivations. We compared the responses of those in the treatment groups with a control group who received no additional information. In comparison to the control group, those who received information about the safety/efficacy of the vaccine were more likely to report that they would take it; those who received information that others were reluctant to take the vaccine were more likely to report that they themselves would not take it, that other Americans would not take it, and that it was not important to get the vaccine; and those who received information about political influences on vaccine development expressed hesitancy to take it. Communication of effective messages about the vaccine will be essential for public health agencies that seek to promote vaccine take-up.
health policy
10.1101/2021.01.05.20248202
Coronavirus-19 and coagulopathy: A Systematic Review
Background: Understanding the association between Coronavirus Disease 2019 (COVID-19) and coagulopathy may assist clinical prognostication, and influence treatment and outcomes. We aimed to systematically describe the relationship between hemostatic laboratory parameters and important clinical outcomes among adults with COVID-19. Methods: A systematic review of randomized clinical trials, observational studies and case series published in PubMed (Medline), EMBASE, and CENTRAL from December 1, 2019 to March 25, 2020. Studies of adult patients with COVID-19 that reported at least one hemostatic laboratory parameter were included. Results: Data were extracted from 57 studies (N=12,050 patients) that met inclusion criteria. The average age of patients was 52 years and 45% were women. Of the included studies, 92.7% (N=38/41 studies) reported an average platelet count ≥150 x 10^9/L, 68.2% (N=15/22 studies) reported an average prothrombin time (PT) between 11-14 s, 55% (N=11/20 studies) reported an average activated partial thromboplastin time (aPTT) between 25-35 s, and 34.4% (N=11/32 studies) reported a D-dimer concentration above the upper limit of normal (ULN). Eight studies (7 cohorts and 1 case series) reported hemostatic lab values for survivors versus non-survivors. Among non-survivors, D-dimer concentrations were reported in 4 studies and all reported an average above the ULN. Interpretation: Most patients had a normal platelet count, an elevated D-dimer, and PT and aPTT values in the upper reference interval; D-dimer elevation appeared to correlate with poor outcomes. Further studies are needed to better correlate these hemostatic parameters with the risk of adverse outcomes such as thrombosis and bleeding.
hematology
10.1101/2021.01.05.21249131
Ivermectin shows clinical benefits in mild to moderate Covid19 disease: A randomised controlled double blind dose response study in Lagos.
Introduction: In vitro studies have shown the efficacy of ivermectin (IV) in inhibiting SARS-CoV-2 viral replication, but questions remained as to in vivo applications. We set out to explore the efficacy and safety of ivermectin in persons infected with COVID-19. Methods: We conducted a translational proof-of-concept (PoC) randomized, double-blind, placebo-controlled, dose-response, parallel-group study of IV efficacy in RT-PCR-proven COVID-19-positive patients. 62 patients were randomized to 3 treatment groups: (A) IV 6 mg regime, (B) IV 12 mg regime (given Q84hrs for 2 weeks), and (C, control) lopinavir/ritonavir, all groups plus standard of care. Results: Days to COVID negativity (DTN) was significantly and dose-dependently reduced by IV (p = 0.0066). The DTN was 9.1 +/- 5.2 for control, 6.0 +/- 2.9 for A, and 4.6 +/- 3.2 for B. Two-way repeated-measures ANOVA of ranked COVID-19 +/- scores at 0, 84, 168, and 232 hours showed a significant IV treatment effect (p=0.035) and time effect (p<0.0001). IV also tended to increase SpO2% compared to controls (p = 0.073, 95% CI -0.39 to 2.59) and increased platelet count compared to C (p = 0.037, 95% CI 5.55-162.55 x 10^3/ml). The platelet count increase was inversely correlated with DTN (r = -0.52, p = 0.005). No SAE was reported. Conclusions: The 12 mg IV regime may have superior efficacy. IV should be considered for use in the clinical management of SARS-CoV-2, and may find applications in community prophylaxis in high-risk areas.
infectious diseases
10.1101/2021.01.05.20232553
Molecular epidemiology of Escherichia coli and Klebsiella species bloodstream infections in Oxfordshire (UK) 2008-2018
The incidence of Gram-negative bloodstream infections (BSIs), predominantly caused by Escherichia coli and Klebsiella species, continues to increase; however, the causes of this are unclear, and effective interventions are therefore hard to design. In this study, we sequenced 3,468 sequential, unselected isolates over a decade in Oxfordshire, UK. We demonstrate that the observed increases in E. coli incidence were not driven by clonal expansion; instead, four major sequence types (STs) continue to dominate a stable population structure, with no evidence of adaptation to hospital/community settings. Conversely, in Klebsiella spp. most infections are caused by sporadic STs, with the exception of a local drug-resistant outbreak strain (ST490). Virulence elements are highly structured by ST in E. coli but not in Klebsiella spp., where they occur in a diverse spectrum of STs and equally across healthcare and community settings. Most clinically hypervirulent (i.e. community-onset) Klebsiella BSIs have no known acquired virulence loci. Finally, we demonstrate a diverse but largely genus-restricted mobilome with close associations between antimicrobial resistance (AMR) genes and insertion sequences, but not typically with specific plasmid replicon types, consistent with the dissemination of AMR genes being highly contingent on smaller mobile genetic elements (MGEs). Our large genomic study highlights distinct differences in the molecular epidemiology of E. coli and Klebsiella BSIs, and suggests that no single pathogen genetic factor is likely to be contributing to the increasing incidence of BSI overall, that association with AMR genes in E. coli is a contributor to the increasing number of E. coli BSIs, and that more attention should be given to AMR gene associations with non-plasmid MGEs to try to understand horizontal gene transfer networks.
infectious diseases
10.1101/2021.01.05.21249154
Use of the FebriDx point-of-care assay as part of a triage algorithm for medical admissions with possible COVID-19
Background: Patients admitted to hospital with COVID-19 need rapid identification and isolation to prevent nosocomial transmission. However, isolation facilities are often limited, and SARS-CoV-2 RT-PCR results are often not available by the time patients are discharged from the emergency department. We evaluated a triage algorithm to isolate patients with suspected COVID-19 using simple clinical criteria and the FebriDx assay. Design: Retrospective observational cohort. Setting: Large acute care hospital in London, UK. Participants: All medical admissions from the ED between 10th August 2020 and 4th November 2020 with a valid SARS-CoV-2 RT-PCR result. Interventions: Medical admissions were triaged as likely, possible or unlikely COVID-19 based on clinical criteria. Patients triaged as possible COVID-19 underwent the FebriDx lateral flow assay on capillary blood, and those positive for MxA were managed as likely COVID-19. Primary outcome measures: Diagnostic accuracy (sensitivity, specificity and predictive values) of the algorithm and the FebriDx assay compared to SARS-CoV-2 RT-PCR from nasopharyngeal swabs as the reference standard. Results: 4.0% (136/3,443) of medical admissions had RT-PCR-confirmed COVID-19. Prevalence of COVID-19 was 45.7% (80/175) in those triaged as likely, 4.1% (50/1,225) in possible and 0.3% (6/2,033) in unlikely COVID-19. Compared to SARS-CoV-2 RT-PCR, clinical triage had sensitivity of 95.6% (95%CI: 90.5% - 98.0%) and specificity of 61.5% (95%CI: 59.8% - 63.1%), whilst the triage algorithm including FebriDx had sensitivity of 92.6% (95%CI: 86.8% - 96.0%) and specificity of 86.4% (95%CI: 85.2% - 87.5%). The triage algorithm reduced the need for 2,859 patients to be admitted to isolation rooms. Ten patients missed by the algorithm had mild or asymptomatic COVID-19. Conclusions: A triage algorithm including the FebriDx assay had good sensitivity and was useful to rule out COVID-19 among medical admissions to hospital.
Strengths and limitations of this study: Pragmatic study including a large cohort of consecutive medical admissions receiving routine clinical care. A single SARS-CoV-2 RT-PCR is an imperfect reference standard for COVID-19. Multiple RT-PCR platforms were used, with different PCR targets and performance. A higher prevalence of COVID-19 or other respiratory pathogens might alter performance. Criteria for the likely and possible COVID-19 groups changed subtly during the study period.
infectious diseases
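As an aside on the diagnostic accuracy figures above: the reported sensitivities and specificities follow from standard 2x2 confusion-table arithmetic. A minimal sketch in Python, using only the counts implied by the abstract (136 RT-PCR positives, 10 missed by the FebriDx algorithm); the function names are illustrative and the true-negative breakdown is not reported, so specificity here is shown only as a formula:

```python
# Minimal sketch of 2x2 diagnostic-accuracy arithmetic (illustrative only;
# not the authors' analysis code). TP/FN counts are implied by the abstract:
# 136 RT-PCR positives, of whom 10 were missed by the triage algorithm.

def sensitivity(tp, fn):
    """Proportion of reference-standard positives correctly detected."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """Proportion of reference-standard negatives correctly ruled out."""
    return tn / (tn + fp)

tp, fn = 126, 10  # 136 positives, 10 missed
print(round(100 * sensitivity(tp, fn), 1))  # 92.6, matching the reported 92.6%
```

The same two formulas, applied to the clinical-triage arm (130 detected of 136), reproduce the reported 95.6% sensitivity.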
10.1101/2021.01.04.20248979
Reverse Transcriptase Loop Mediated Isothermal Amplification (RT-LAMP) for COVID-19 Diagnosis: A Systematic Review and Meta-Analysis
Background: Coronavirus Disease 2019 (COVID-19) has caused a severe outbreak and become a global public health priority. A rapid increment in infection numbers, along with significant deaths, has placed the virus as a serious threat to human health. Rapid, reliable, and simple diagnostic methods are critically essential for disease control. While Reverse Transcriptase Polymerase Chain Reaction (RT-PCR) is the current diagnostic gold standard, Reverse Transcriptase Loop-Mediated Isothermal Amplification (RT-LAMP) appears to be a compelling alternative diagnostic test due to its greater simplicity, shorter time to result, and lower cost. This study examined the application of RT-LAMP for rapid identification of SARS-CoV-2 infection compared to the RT-PCR assay. Methods: A systematic review and meta-analysis (2020) was conducted in 6 scientific databases following the PRISMA Guideline. Original published studies on human clinical samples in English were included. Articles that evaluated the sensitivity and specificity of RT-LAMP relative to RT-PCR were considered eligible. Quality assessment of bias and applicability was examined based on QUADAS-2. Results: A total of 351 studies were found based on the keywords and search queries. 14 eligible case-control studies fitted the respective criteria. Quality assessment using QUADAS-2 indicated low risk of bias in all included studies. All case studies, comprising 2,112 samples, had a cumulative sensitivity of 95.5% (97.5% CI = 90.8-97.9%) and a cumulative specificity of 99.5% (97.5% CI = 97.7-99.9%). Conclusion: The RT-LAMP assay could be suggested as a reliable alternative COVID-19 diagnostic method with reduced cost and time compared to RT-PCR. RT-LAMP could potentially be utilized during high-throughput, high-demand critical situations.
infectious diseases
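The cumulative sensitivity and specificity above are pooled across 14 studies. The abstract does not state the authors' pooling method, so the sketch below shows one common approach for illustration only: a sample-size-weighted average on the logit scale. The per-study figures are invented:

```python
import math

# Hedged sketch of pooling diagnostic sensitivities across studies
# (one common approach, NOT necessarily the authors' method).
# Per-study sensitivities and sample sizes below are invented.

def pool_logit(props, weights):
    """Weighted mean of proportions on the logit scale, mapped back to [0, 1]."""
    logits = [math.log(p / (1 - p)) for p in props]
    mean_logit = sum(w * l for w, l in zip(weights, logits)) / sum(weights)
    return 1 / (1 + math.exp(-mean_logit))

sens = [0.97, 0.94, 0.95]   # hypothetical per-study sensitivities
n = [200, 150, 120]         # hypothetical per-study sample sizes
pooled = pool_logit(sens, n)
print(round(pooled, 3))     # a value between the per-study extremes
```

Pooling on the logit scale keeps the pooled estimate inside (0, 1) and down-weights proportions near the boundaries less crudely than a plain arithmetic mean.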
10.1101/2021.01.04.21249231
Modeling COVID-19 outbreaks in United States with distinct testing, lockdown speed and fatigue rates
Each state in the United States exhibited a unique response to the COVID-19 outbreak, along with variable levels of testing, leading to different actual case burdens in the country. In this study, via per-capita-testing-dependent ascertainment rates, along with case and death data, we fit a minimal epidemic model for each state. We estimate infection-level-responsive lockdown entry and exit rates (representing government and behavioral reaction), along with the true number of cases as of May 31, 2020. Ultimately we provide error-corrected estimates of commonly used metrics such as the infection fatality ratio and overall case ascertainment for all 55 states and territories considered, along with the United States in aggregate, in order to correlate outbreak severity with first-wave intervention attributes and suggest potential management strategies for future outbreaks. We observe a theoretically predicted inverse proportionality relation between outbreak size and lockdown rate, with scale dependent on the underlying reproduction number, and simulations suggest a critical population quarantine "half-life" of 30 days independent of other model parameters.
infectious diseases
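The quarantine "half-life" mentioned above has a simple reading if lockdown exit is treated as an exponential process: a rate r corresponds to a half-life of ln(2)/r. A minimal sketch under that assumption (the exponential form is an illustrative interpretation, not necessarily the authors' exact model structure):

```python
import math

# Illustrative conversion between an exponential exit rate and a
# quarantine "half-life" (assumption: exponential exit process).
# The 30-day figure comes from the abstract.

def half_life_to_rate(half_life_days):
    """Exit rate r such that half the quarantined population leaves in half_life_days."""
    return math.log(2) / half_life_days

def remaining_fraction(rate, t_days):
    """Fraction still under quarantine after t_days of exponential exit."""
    return math.exp(-rate * t_days)

r = half_life_to_rate(30)
print(round(remaining_fraction(r, 30), 2))  # 0.5 by construction
```

After two half-lives (60 days) the remaining fraction is a quarter, which is the usual sanity check for the exponential form.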
10.1101/2021.01.04.21249218
State-level COVID-19 Trend Forecasting Using Mobility and Policy Data
The importance of pandemic forecasting cannot be overemphasized. We propose an interpretable machine learning approach for forecasting pandemic transmission rates by utilizing local mobility statistics and government policies. A calibration step is introduced to deal with time-varying relationships between transmission rates and predictors. Experimental results demonstrate that our approach is able to make accurate two-week-ahead predictions of state-level COVID-19 infection trends in the US. Moreover, the models trained by our approach offer insights into the spread of COVID-19, such as the association between the baseline transmission rate and state-level demographics, the effectiveness of local policies in reducing COVID-19 infections, and so on. This work provides a good understanding of COVID-19 evolution with respect to state-level characteristics and can potentially inform local policymakers in devising customized response strategies.
infectious diseases
10.1101/2020.12.30.20249062
Outbreak or pseudo-outbreak? Integrating SARS-CoV-2 sequencing to validate infection control practices in an end stage renal disease facility
Background: The COVID-19 pandemic of 2020 poses a particularly high risk for End Stage Renal Disease (ESRD) patients and led to a need for facility-wide control plans to prevent introduction and spread of infection within ESRD facilities. Rapid identification of clusters of contemporaneous cases is essential, as these may be indicative of within-facility spread. Nevertheless, in a setting of high community COVID-19 prevalence, a series of ESRD patients may test positive at around the same time without their shared ESRD facility being the nexus for disease spread. Here we describe a series of five cases occurring within an eleven-day period in November 2020 in a hospital-based 32-station ESRD facility in southwest Wisconsin, the subsequent facility-wide testing, and the use of genetic sequence analysis of positive specimens to evaluate whether these cases were linked. Methods: Four patient cases and one staff case were identified in symptomatic individuals by RT-PCR. Facility-wide screening was initiated at the request of local public health authorities and conducted using Abbott BinaxNOW antigen tests. SARS-CoV-2 genome sequences were obtained from residual diagnostic test specimens using an amplicon-based approach on an Ion Torrent S5 sequencer. Results: Residual specimens from 4 of 5 cases were available for sequence analysis. Each sequence was very clearly genetically distinct from the others, indicating that these contemporaneous cases were not linked. Facility-wide screening of 47 staff and 107 patients did not identify any additional cases.
Conclusions: These data indicate that, despite the outward appearance of a case cluster, the facility did not experience within-facility spread nor serve as the epicenter of a new outbreak, suggesting that the enacted rigorous infection control procedures (screening, masking, distancing), practiced stringently by patients and staff, were sufficient to permit dialysis to proceed safely in a very high-risk population under pressure from increasing community spread. These data also demonstrate the utility of rapid-turnaround SARS-CoV-2 sequencing in outbreak investigations in settings like ESRD facilities.
infectious diseases
10.1101/2021.01.03.21249175
Modelling COVID -19 Transmission in a Hemodialysis Centre Using Simulation Generated Contacts Matrices
The COVID-19 pandemic has been particularly threatening to patients with end-stage kidney disease (ESKD) on intermittent hemodialysis and to their care providers. Hemodialysis patients, who receive life-sustaining medical therapy in healthcare settings, face unique challenges: they need to be at a dialysis unit three or more times a week, where they are confined to specific settings and tended to by dialysis nurses and staff through physical interaction in close proximity. Despite the importance and critical situation of dialysis units, modelling studies of SARS-CoV-2 spread in these settings are very limited. In this paper, we use a combination of discrete-event and agent-based simulation models to study the operations of a typical large dialysis unit and generate contact matrices to examine outbreak scenarios. We present the details of the contact matrix generation process and demonstrate how the simulation calculates a micro-scale contact matrix comprising the number and duration of contacts at a micro-scale time step. We then use the contact matrix in an agent-based model to predict disease transmission under different scenarios. The results show that micro-simulation can be used to estimate contact matrices, which can be used effectively for disease modelling in dialysis and similar settings.
infectious diseases
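The micro-scale contact matrix described above, recording the number and duration of contacts per pair, can be illustrated with a simple accumulator over simulated contact events. A hedged sketch with invented event data; the paper's discrete-event simulation itself is not reproduced here:

```python
from collections import defaultdict

# Illustrative accumulator for a pairwise contact matrix: for each pair of
# agents, track how many contacts occurred and their total duration.
# The event stream below is invented for demonstration.

def build_contact_matrix(events):
    """events: iterable of (agent_a, agent_b, duration_minutes).
    Returns {(a, b): [n_contacts, total_minutes]} with keys sorted so a < b."""
    matrix = defaultdict(lambda: [0, 0.0])
    for a, b, duration in events:
        key = tuple(sorted((a, b)))   # contacts are undirected
        matrix[key][0] += 1
        matrix[key][1] += duration
    return dict(matrix)

events = [("nurse1", "pt3", 12), ("pt3", "nurse1", 5), ("pt3", "pt7", 2)]
m = build_contact_matrix(events)
print(m[("nurse1", "pt3")])  # [2, 17.0]
```

Sorting each pair before insertion makes the matrix symmetric by construction, so a contact recorded in either direction lands in the same cell.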
10.1101/2021.01.04.21249227
Antithrombotic Therapy in COVID-19: Systematic Summary of Ongoing or Completed Randomized Trials
Endothelial injury and microvascular/macrovascular thrombosis are common pathophysiologic features of coronavirus disease 2019 (COVID-19). However, the optimal thromboprophylactic regimens remain unknown across the spectrum of illness severity of COVID-19. A variety of antithrombotic agents, doses and durations of therapy are being assessed in ongoing randomized controlled trials (RCTs) that focus on outpatients, hospitalized patients in medical wards, and critically ill patients with COVID-19. This manuscript provides a perspective on the ongoing or completed RCTs related to antithrombotic strategies used in COVID-19, the opportunities and challenges for the clinical trial enterprise, and areas of existing knowledge, as well as data gaps that may motivate the design of future RCTs.
cardiovascular medicine
10.1101/2021.01.04.21249217
Angiotensin converting enzyme inhibitors and risk of lung cancer: a population-based cohort study
Objectives: To determine whether the use of angiotensin converting enzyme inhibitors (ACEIs) was associated with a higher risk of lung cancer when compared to use of angiotensin receptor blockers (ARBs). Study design: Population-based cohort study. Setting: Public hospitals under the Hospital Authority in Hong Kong, P.R. China. Methods: Patients admitted to public hospitals and first prescribed an ACEI and/or ARB between 1 January 2001 and 31 December 2018 were analyzed. The last follow-up date was 31 August 2020, or death, whichever was earlier. Outcomes: The primary outcome was the incidence of lung cancer. Logistic regression was used to calculate odds ratios (ORs) with 95% confidence intervals associated with the use of ACEIs compared to ARBs. Incidence and odds ratios were estimated for temporal analysis of incident cancer risk associated with time since the first prescription of an ACEI or ARB. Results: In the unmatched cohort, 56,697 patients and 357,011 patients were included in the ARB and ACEI cohorts, with lung cancer incidences of 2.16% and 1.29%, respectively. Using 1:3 matching of ARB to ACEI users, the incidences were 2.32% and 1.29%. ACEI use was associated with increased risks of lung cancer both before (hazard ratio: 1.30 [1.21-1.40], P<0.0001) and after (1.40 [1.29-1.51], P<0.0001) matching. There was a dose-dependent relationship between ACEI exposure and lung cancer risk. Conclusions: ACEI use was associated with increased risk of lung cancer compared with ARB use at all time points. Additionally, incidence risk increases with the duration of exposure.
cardiovascular medicine
10.1101/2021.01.05.21249294
Dopamine transporter in obesity: a meta-analysis
The brain plays a major role in controlling the desire to eat. This meta-analysis aimed to assess the association between dopamine transporter (DAT) availability and obesity. We performed a systematic search of MEDLINE (from inception to November 2020) and EMBASE (from inception to November 2020) for articles published in English using the keywords "dopamine transporter," "obesity," and "neuroimaging". Data were plotted for each radiopharmaceutical; linear regression was used to describe the relationship between the DAT ratio and body mass index (BMI), and spline curves were adopted to fit the data between the DAT ratio and BMI. Five studies including 421 subjects were eligible for inclusion in this study. Two studies with 123I-FP-CIT, one with 99mTc-TRODAT, one with 123I-PE21, and one with 18F-FP-CIT were included. DAT availabilities from the ENC-DAT project were higher than those from the PPMI database for both the caudate nucleus and the putamen. As there might be inter-study variability, we calculated the DAT ratio by dividing the DAT availabilities of subjects with overweight/obese BMI by the mean DAT availabilities of subjects with normal BMI. In conclusion, we have shown that the DAT availability of subjects with overweight/obesity was not different from that of subjects with normal BMI.
endocrinology
10.1101/2021.01.05.20249090
Analysis of 200,000 exome-sequenced UK Biobank subjects illustrates the contribution of rare genetic variants to hyperlipidaemia
A few genes have previously been identified in which very rare variants can have major effects on lipid levels. Weighted burden analysis of rare variants was applied to exome-sequenced UK Biobank subjects with hyperlipidaemia as the phenotype, of whom 44,050 were designated cases and 156,578 controls, with the strength of association characterised by the signed log10 p value (SLP). With principal components included as covariates there was a tendency for genes on the X chromosome to produce strongly negative SLPs, and this was found to be because rare X chromosome variants were identified less frequently in males than in females. The test performed well when both principal components and sex were included as covariates and strongly implicated LDLR (SLP = 50.08) and PCSK9 (SLP = -10.42) while also highlighting other genes previously found to be associated with lipid levels. Variants classified by SIFT as deleterious have on average a two-fold effect, and their cumulative frequency is such that they are present in approximately 1.5% of the population. These analyses shed further light on the way that genetic variation contributes to the risk of hyperlipidaemia, and in particular show that there are very many protein-altering variants which have on average moderate effects and whose effects can be detected when large samples of exome-sequenced subjects are available. This research has been conducted using the UK Biobank Resource.
endocrinology
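The signed log10 p value (SLP) used above is straightforward to compute once a direction of effect is available. A minimal sketch; the sign convention (positive when cases carry the greater variant burden) is inferred from the reported LDLR and PCSK9 examples rather than taken from the paper's methods:

```python
import math

# Illustrative computation of a signed log10 p value (SLP).
# Sign convention is an assumption inferred from the abstract's examples:
# positive when cases carry more rare variants, negative otherwise.

def slp(p_value, risk_direction):
    """risk_direction: +1 if the burden is higher in cases, -1 if in controls."""
    return risk_direction * -math.log10(p_value)

print(slp(1e-50, +1))   # close to 50.0, the scale of the reported LDLR SLP
print(slp(1e-10, -1))   # close to -10.0, the scale of the reported PCSK9 SLP
```

Working on the log10 scale keeps extremely small p values (common in biobank-scale burden tests) on a readable, comparable scale.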
10.1101/2021.01.05.20249090
Analysis of 200,000 exome-sequenced UK Biobank subjects illustrates the contribution of rare genetic variants to hyperlipidaemia
A few genes have previously been identified in which very rare variants can have major effects on lipid levels. Weighted burden analysis of rare variants was applied to exome sequenced UK Biobank subjects with hyperlipidaemia as the phenotype, of whom 44,050 were designated cases and 156,578 controls, with the strength of association characterised by the signed log 10 p value (SLP). With principal components included as covariates there was a tendency for genes on the X chromosome to produce strongly negative SLPs, and this was found to be due to the fact that rare X chromosome variants were identified less frequently in males than females. The test performed well when both principal components and sex were included as covariates and strongly implicated LDLR (SLP = 50.08) and PCSK9 (SLP = -10.42) while also highlighting other genes previously found to be associated with lipid levels. Variants classified by SIFT as deleterious have on average a two-fold effect and their cumulative frequency is such that they are present in approximately 1.5% of the population. These analyses shed further light on the way that genetic variation contributes to risk of hyperlipidaemia and in particular that there are very many protein-altering variants which have on average moderate effects and whose effects can be detected when large samples of exome-sequenced subjects are available. This research has been conducted using the UK Biobank Resource.
endocrinology
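The signed log10 p-value (SLP) used in the abstract above combines the direction of effect with the strength of evidence in a single number; a minimal sketch (the function name and inputs are illustrative, not from the study):

```python
import math

def signed_log10_p(p_value, effect_direction):
    """Signed log10 p-value: magnitude is -log10(p), sign follows the
    direction of effect. A strongly positive SLP (e.g. LDLR, SLP = 50.08)
    marks a risk-increasing gene; a strongly negative SLP (e.g. PCSK9,
    SLP = -10.42) marks a protective one."""
    magnitude = -math.log10(p_value)
    return magnitude if effect_direction >= 0 else -magnitude

# A risk-increasing gene with p = 1e-50 and a protective gene with p = 1e-10
print(signed_log10_p(1e-50, +1))
print(signed_log10_p(1e-10, -1))
```

Ranking genes by SLP rather than raw p-value lets one list surface both risk and protective associations, which is why the abstract can quote LDLR and PCSK9 on the same scale.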
10.1101/2021.01.04.21249233
A novel computational approach to reconstruct SARS-CoV-2 infection dynamics through the inference of unsampled sources of infection
Infectious disease outbreaks such as the COVID-19 pandemic have cemented the importance of disease tracking. The role of asymptomatic, undiagnosed individuals in driving infection has become evident, and their unaccountability results in ineffective prevention. We developed a pipeline that uses genomic data to accurately predict a population's transmission network, complete with the inference of unsampled sources. The system utilises Bayesian phylogenetics to capture the evolutionary and infection dynamics of SARS-CoV-2. It identified the effectiveness of preventive measures in Canada's Atlantic bubble and in mobile populations such as New York State. Its robustness extends to the prediction of cross-species disease transmission, as we inferred SARS-CoV-2 transmission from humans to lions and tigers in New York City's Bronx Zoo. The proposed method's ability to generate such complete transmission networks provides more detailed insight into the transmission dynamics within a population. This potential frontline tool will be of direct help in "the battle to bend the curve".
epidemiology
10.1101/2021.01.06.21249318
Stochastic model for COVID-19 in slums: interaction between biology and public policies
We present a mathematical model for the simulation of the development of a COVID-19 outbreak in a slum area under different interventions. Instead of representing interventions as modulations of the parameters of a free-running epidemic, we introduce a model structure that accounts for the actions but does not assume the results. The disease is modelled in terms of the progression of viremia reported in scientific works. The emergence of symptoms in the model reflects the statistics of a nationwide, highly detailed database consisting of more than 62,000 cases (about half of those confirmed by RT-PCR tests) with recorded symptoms in Argentina. The stochastic model displays several of the characteristics of COVID-19, such as high variability in the evolution of outbreaks, including long periods in which they run undetected, spontaneous extinction followed by a late outbreak, and unimodal as well as bimodal progressions of daily counts of cases (second waves without ad-hoc hypotheses). We show how the relation between undetected cases (including "asymptomatic" cases) and detected cases changes as a function of the public policies, the efficiency of their implementation, and the timing with respect to the development of the outbreak. We also show that the relation between detected cases and total cases strongly depends on the implemented policies, and that detected cases cannot be regarded as a measure of the outbreak, since the dependency between total cases and detected cases is in general not monotonic as a function of the efficiency of the intervention method. According to the model, it is possible to control an outbreak with interventions based on the detection of symptoms only when the presence of just one symptom prompts isolation and the detection efficiency reaches about 80% of the cases. Requiring two symptoms to trigger intervention can be enough to fail in this goal.
epidemiology
10.1101/2021.01.05.21249264
Representative Estimates of COVID-19 Infection Fatality Rates from Three Locations in India
There are very few estimates of the age-specific infection fatality rate (IFR) of SARS-CoV-2 in low- and middle-income countries. India reports the second highest number of SARS-CoV-2 infections in the world. We estimate age-specific IFR using data from seroprevalence surveys in Mumbai (population 12 million) and Karnataka (population 61 million), and a random sample of economically distressed migrants in Bihar with mortality follow-up. Among men aged 50-89, IFR is 0.12% in Karnataka (95% CI 0.09%-0.15%), 0.53% in Mumbai (0.52%-0.54%), and 5.64% among migrants in Bihar (0-11.16%). IFR in India is approximately twice as high for men as for women, is heterogeneous across contexts, and rises much less at older ages than in comparable studies from high-income countries.
epidemiology
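The IFR estimates above rest on simple arithmetic: deaths divided by infections, with the infection count inferred from seroprevalence. A hedged sketch with invented illustrative numbers (not the study's data):

```python
def infection_fatality_rate(deaths, population, seroprevalence):
    """IFR = deaths / estimated infections, where the infection count is
    inferred from a serosurvey: infections ~= population * seroprevalence."""
    infections = population * seroprevalence
    return deaths / infections

# Illustrative only: 600 deaths, population 1,000,000, 25% seropositive
ifr = infection_fatality_rate(600, 1_000_000, 0.25)
print(f"{ifr:.2%}")  # prints "0.24%"
```

Because the denominator comes from a survey, uncertainty in seroprevalence propagates directly into the IFR, which is one reason the paper's migrant estimate carries such a wide interval.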
10.1101/2021.01.05.21249239
Potential Blood Transfusion Adverse Events Can be Found in Unstructured Text in Electronic Health Records using the 'Shakespeare Method'
Background: Text in electronic health records (EHRs) and big data tools offer the opportunity for surveillance of adverse events (AEs; patient harm associated with medical care) in unstructured notes. Writers may explicitly state an apparent association between treatment and adverse outcome ("attributed") or state the treatment and outcome without an association ("unattributed"). We chose the case of transfusion adverse events (TAEs) and potential TAEs (PTAEs) because real dates were obscured in the study data, and new TAE types were becoming recognized during the study data period. Objective: Develop a new method to identify attributed and unattributed potential adverse events using the unstructured text of EHRs. Methods: We used EHRs for adult critical care admissions at a major teaching hospital, 2001-2012. We formed a transfusion (T) group (21,443 admissions treated with packed red blood cells, platelets, or plasma), excluded 2,373 ambiguous admissions, and formed a comparison (C) group of 25,468 admissions. We concatenated the text notes for each admission, sorted by date, into one document, and deleted replicate sentences and lists. We identified statistically significant words in T vs. C. T documents were filtered to those words, and topic modeling was applied to the filtered T documents to produce 45 topics. For each topic, the three documents with the maximum topic scores were manually reviewed to identify events that occurred shortly after the first transfusion; documents with clear alternative explanations for heart, lung, and volume overload problems (e.g., advanced cancer, lung infection) were excluded. We also reviewed documents with the most topics, as well as 20 randomly selected T documents without alternate explanations. Results: Topics centered around medical conditions. The average number of significant topics was 6.1. Most PTAEs were not attributed to transfusion in the notes.
Admissions with a top-scoring cardiovascular topic (heart valve repair, tapped pericardial effusion, coronary artery bypass graft, heart attack, or vascular repair) were more likely than random T admissions to have at least one heart PTAE (heart rhythm changes or hypotension; proportion difference = 0.47, p = 0.022). Admissions with a top-scoring pulmonary topic (mechanical ventilation, acute respiratory distress syndrome, inhaled nitric oxide) were more likely than random T admissions (proportion difference = 0.37, p = 0.049) to have at least one lung PTAE (hypoxia, mechanical ventilation, bilateral pulmonary effusion, or pulmonary edema). Conclusions: The "Shakespeare Method" could be a useful supplement to AE reporting and surveillance of structured EHR data. Future improvements should include automation of the manual review process.
epidemiology
10.1101/2021.01.05.20248952
Prevalence of RT-PCR-detected SARS-CoV-2 infection at schools: First results from the Austrian School-SARS-CoV-2 Study
Background: The role of schools in the SARS-CoV-2 pandemic is much debated. We aimed to quantify reliably the prevalence of SARS-CoV-2 infections at schools detected with reverse-transcription quantitative polymerase chain reaction (RT-qPCR). Methods: This nationwide prospective cohort study monitors a representative sample of pupils (grade 1-8) and teachers at Austrian schools throughout the school year 2020/2021. We repeatedly test participants for SARS-CoV-2 infection using a gargling solution and RT-qPCR. We herein report on the first two rounds of examinations. We used mixed-effect logistic regression to estimate odds ratios and robust 95% confidence intervals (95% CI). Findings: We analysed data on 10,734 participants from 245 schools (9,465 pupils, 1,269 teachers). Prevalence of SARS-CoV-2 infection increased from 0.39% at round 1 (95% CI 0.28-0.55%, 29 September-22 October 2020) to 1.39% at round 2 (95% CI 1.04-1.85%, 10-16 November). Odds ratios for SARS-CoV-2 infection were 2.26 (95% CI 1.25-4.12, P=0.007) in regions with >500 vs. ≤500 inhabitants/km2, 1.67 (95% CI 1.42-1.97, P<0.001) per two-fold higher regional 7-day incidence, and 2.78 (95% CI 1.73-4.48, P<0.001) in pupils at schools with high/very high vs. low/moderate social deprivation. Associations of community incidence and social deprivation persisted in a multivariable adjusted model. Prevalence did not differ by average number of pupils per class, nor between age groups, sexes, pupils vs. teachers, or primary (grade 1-4) vs. secondary schools (grade 5-8). Interpretation: This monitoring study in Austrian schools revealed SARS-CoV-2 infection in 0.39%-1.39% of participants and identified associations of regional community incidence and social deprivation with higher prevalence. Funding: BMBWF Austria.
epidemiology
10.1101/2021.01.05.20248952
Prevalence of RT-qPCR-detected SARS-CoV-2 infection at schools: First results from the Austrian School-SARS-CoV-2 prospective cohort study
Background: The role of schools in the SARS-CoV-2 pandemic is much debated. We aimed to quantify reliably the prevalence of SARS-CoV-2 infections at schools detected with reverse-transcription quantitative polymerase chain reaction (RT-qPCR). Methods: This nationwide prospective cohort study monitors a representative sample of pupils (grade 1-8) and teachers at Austrian schools throughout the school year 2020/2021. We repeatedly test participants for SARS-CoV-2 infection using a gargling solution and RT-qPCR. We herein report on the first two rounds of examinations. We used mixed-effect logistic regression to estimate odds ratios and robust 95% confidence intervals (95% CI). Findings: We analysed data on 10,734 participants from 245 schools (9,465 pupils, 1,269 teachers). Prevalence of SARS-CoV-2 infection increased from 0.39% at round 1 (95% CI 0.28-0.55%, 29 September-22 October 2020) to 1.39% at round 2 (95% CI 1.04-1.85%, 10-16 November). Odds ratios for SARS-CoV-2 infection were 2.26 (95% CI 1.25-4.12, P=0.007) in regions with >500 vs. ≤500 inhabitants/km2, 1.67 (95% CI 1.42-1.97, P<0.001) per two-fold higher regional 7-day incidence, and 2.78 (95% CI 1.73-4.48, P<0.001) in pupils at schools with high/very high vs. low/moderate social deprivation. Associations of community incidence and social deprivation persisted in a multivariable adjusted model. Prevalence did not differ by average number of pupils per class, nor between age groups, sexes, pupils vs. teachers, or primary (grade 1-4) vs. secondary schools (grade 5-8). Interpretation: This monitoring study in Austrian schools revealed SARS-CoV-2 infection in 0.39%-1.39% of participants and identified associations of regional community incidence and social deprivation with higher prevalence. Funding: BMBWF Austria.
epidemiology
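The school study above reports odds ratios from mixed-effect logistic regression; as a simplified illustration of the underlying quantity, here is the crude odds ratio for a 2x2 table with a plain Wald 95% interval (not the study's model, and the counts are invented):

```python
import math

def odds_ratio_wald(a, b, c, d, z=1.96):
    """Crude odds ratio for a 2x2 table with a Wald 95% CI.
    a: exposed positives, b: exposed negatives,
    c: unexposed positives, d: unexposed negatives."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Invented counts: 30/1000 positive in dense regions vs. 15/1000 elsewhere
or_, lo, hi = odds_ratio_wald(30, 970, 15, 985)
print(f"OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

A mixed-effect model additionally accounts for clustering of pupils within schools, which is why the paper's intervals are described as robust rather than the plain Wald form sketched here.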
10.1101/2021.01.05.21249255
Viral Variants and Vaccinations: If We Can Change the COVID-19 Vaccine, Should We?
As we close in on one year since the COVID-19 pandemic began, hope has been placed on bringing the virus under control through mass administration of recently developed vaccines. Unfortunately, newly emerged, fast-spreading strains of COVID-19 threaten to undermine progress by interfering with vaccine efficacy. While a long-term solution to this challenge would be to develop vaccines that simultaneously target multiple different COVID-19 variants, this approach faces both developmental and regulatory hurdles. A simpler option would be to switch the target of the current vaccine to better match the newest viral variant. I use a stochastic simulation to determine when it is better to target a newly emerged viral variant and when it is better to target the dominant but potentially less transmissible strain. My simulation results suggest that it is almost always better to target the faster spreading strain, even when the initial prevalence of this variant is much lower. In scenarios where targeting the slower spreading variant is best, all vaccination strategies perform relatively well, meaning that the choice of vaccination strategy has a small effect on public health outcomes. In scenarios where targeting the faster spreading variant is best, use of vaccines against the faster spreading viral variant can save many lives. My results provide rule of thumb guidance for those making critical decisions about vaccine formulation over the coming months.
epidemiology
10.1101/2021.01.05.20248973
Associations between Stress and Child Verbal Abuse and Corporal Punishment during the COVID-19 Pandemic and Potential Effect Modification by Lockdown Measures
Background: Child abuse appears to be on the increase during the COVID-19 pandemic, but the extent to which lockdown measures modified the association between stress and abuse has not been systematically assessed. Objectives: To assess: 1) the association between caregivers' stress and self-reported verbal abuse and corporal punishment of a child in the household; and 2) modification of the stated association by experienced COVID-19 lockdown measures. Participants and settings: Caregivers residing in villages on lockdown in the Deep South of Thailand (n=466 participants). Methods: We randomly sampled 12 villages in the study area, and 40 households per village. Trained enumerators who were residents of the sampled villages collected the data using phone-based interviews. We measured stress level using the standard ST-5 questionnaire. We developed and pilot-tested questions for measurement of child abuse and lockdown experiences specifically for this study. Results: Caregivers with moderate or higher levels of stress were more likely than caregivers with a low level of stress to report verbal abuse (48% vs. 23%, respectively; Adj. OR = 3.12, 95% CI = 1.89, 5.15) and corporal punishment (28% vs. 8%, respectively; Adj. OR = 2.76, 95% CI = 1.41, 5.42). We found that COVID-19 lockdown experiences modified the associations between stress and verbal abuse and corporal punishment. Conclusion: There were associations between stress and abuse, which were modified by lockdown experiences. However, social desirability, lack of detail in the answers, and potential confounding by mental illness co-morbidities were notable limitations of the study. Caution is advised in the interpretation of the study findings.
epidemiology
10.1101/2021.01.05.21249247
SARS-CoV-2 seroprevalence in the urban population of Qatar: An analysis of antibody testing on a sample of 112,941 individuals
Background: Qatar has experienced a large SARS-CoV-2 epidemic. Our first objective was to assess the proportion of the urban population that has been infected with SARS-CoV-2, by measuring the prevalence of detectable antibodies. Our second objective was to identify predictors for infection and for having higher antibody titers. Methods: Residual blood specimens from individuals receiving routine and other clinical care between May 12 and September 9, 2020 were tested for anti-SARS-CoV-2 antibodies. Associations with seropositivity and higher antibody titers were identified through regression analyses. Probability weights were applied in deriving the epidemiological measures. Results: We tested 112,941 individuals (~10% of Qatar's urban population), of whom 51.6% were men and 66.0% were 20-49 years of age. Seropositivity was 13.3% (95% CI: 13.1-13.6%) and was significantly associated with sex, age, nationality, clinical-care type, and testing date. The proportion with higher antibody titers varied by age, nationality, clinical-care type, and testing date. There was a strong correlation between higher antibody titers and seroprevalence in each nationality, with a Pearson correlation coefficient of 0.85 (95% CI: 0.47-0.96), suggesting that higher antibody titers may indicate repeated exposure to the virus. The percentage of antibody-positive persons with a prior PCR-confirmed diagnosis was 47.1% (95% CI: 46.1-48.2%), the severity rate was 3.9% (95% CI: 3.7-4.2%), the criticality rate was 1.3% (95% CI: 1.1-1.4%), and the fatality rate was 0.3% (95% CI: 0.2-0.3%). Conclusions: Fewer than two in every 10 individuals in Qatar's urban population had detectable antibodies against SARS-CoV-2 between May 12 and September 9, 2020, suggesting that this population is still far from the herd immunity threshold and at risk from a subsequent epidemic wave.
epidemiology
10.1101/2021.01.04.21249235
Machine Learning Forecast of Growth in COVID-19 Confirmed Infection Cases with Non-Pharmaceutical Interventions and Cultural Dimensions: Algorithm Development and Validation
Background: National governments have implemented non-pharmaceutical interventions to control and mitigate the COVID-19 pandemic. A deep understanding of these interventions is required. Objective: We investigate the prediction of future daily national Confirmed Infection Growth - the percentage change in total cumulative cases across 14 days - using metrics representative of non-pharmaceutical interventions and cultural dimensions of each country. Methods: We combine the OxCGRT dataset, Hofstede's cultural dimensions, and COVID-19 daily reported infection case numbers to train and evaluate five non-time-series machine learning models in predicting Confirmed Infection Growth. We use three validation methods - in-distribution, out-of-distribution, and country-based cross-validation - for evaluation, each applicable to a different use case of the models. Results: Our results demonstrate high R2 values between the labels and predictions for the in-distribution, out-of-distribution, and country-based cross-validation methods (0.959, 0.513, and 0.574 respectively) using random forest and AdaBoost regression. While these models may be used to predict Confirmed Infection Growth, the differing accuracies obtained from the three tasks suggest a strong influence of the use case. Conclusions: This work provides new considerations for using machine learning techniques with non-pharmaceutical intervention and cultural dimension data to predict the national growth of confirmed COVID-19 infections.
epidemiology
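The prediction target above, Confirmed Infection Growth (the percentage change in total cumulative cases across 14 days), is straightforward to derive from a cumulative case series; a small sketch (the function name is ours, not the paper's):

```python
def confirmed_infection_growth(cumulative_cases, day, window=14):
    """Percentage change in total cumulative cases across `window` days:
    100 * (cases[day + window] - cases[day]) / cases[day]."""
    start = cumulative_cases[day]
    return 100.0 * (cumulative_cases[day + window] - start) / start

# Illustrative series: cumulative cases rise linearly from 1000 to 1500
series = [1000 + 500 * t // 14 for t in range(15)]
print(confirmed_infection_growth(series, 0))  # prints 50.0
```

Because the label is a ratio of cumulative counts, it is dimensionless across countries of different sizes, which is what lets a single model be trained on many nations at once.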
10.1101/2021.01.05.20248250
Common Factors in Serious Case Reviews of Child Maltreatment where there is a Medical Cause of Death: Qualitative Thematic Analysis
Aim: To identify common factors in Serious Case Reviews (SCRs) where a child has died of a medical cause. Design: Qualitative thematic analysis. Background: SCRs take place when neglect or abuse results in children dying or being seriously harmed. Known key factors within SCRs include parental substance misuse, mental health problems, and domestic abuse. To date there has been no investigation of children who die of a medical cause where there are concerns about child maltreatment. Data sources: A list of SCRs relating to deaths through medical causes was provided from previous coded studies and accessed from the NSPCC National Case Review Repository. Twenty-three SCRs with a medical cause of death from 1 April 2009 to 31 March 2017 were sourced. Results: Twenty children died of an acute condition and 12 of a chronic condition; 20 of the deaths were unexpected, and maltreatment contributed to the deaths of 18 children. Most children were either under one or over 16 at the time of death. Many parents were caring for a child with additional vulnerabilities, including behavioural issues (6/23), learning difficulties (6/23), mental health issues (5/23) or a chronic medical condition (12/23). Common parental experiences included: domestic violence/abuse (13/23), drug/alcohol misuse (10/23), mental ill health or struggling to cope (7/23), criminal history (11/23), and caring for another vulnerable individual (8/23). Most children lived in a chaotic household characterised by missed medical appointments (18/23), poor school attendance (11/23), a poor physical home environment (7/23) and disguised compliance (12/23). All 23 SCRs reported elements of abusive or neglectful parenting. In most there was evidence of cumulative harm, where multiple factors contributed to premature death. At the time of death 11 children were receiving social care support.
Conclusion: While the underlying medical cause of the child's death was often incurable, the maltreatment that often exacerbated the medical issue could have been prevented. Article summary, strengths and weaknesses of the study: (1) No other study has analysed SCRs in which children have died of medical causes. (2) The most complete dataset possible was used to conduct a robust analysis: SCRs were sourced from the complete list from the Department for Education used for previous national analyses of SCRs. (3) Randomly selected SCRs were re-coded by two further researchers to check for any discrepancies in coding, increasing the reliability of results. (4) Not all child deaths lead to an SCR, even when there are concerns about maltreatment; local areas may differ in their threshold of suspicion, and content within SCRs is often variable and inconsistent, so there may be deaths relevant to this study which were not included. (5) We only investigated cases in which a child died, focusing on the worst cases and perhaps missing incidents in which a child had a medical condition and experienced maltreatment but did not die.
pediatrics
10.1101/2021.01.05.21249234
Assessment of Pharmacokinetic Interaction Potential Between Caffeine and Methylliberine
Methylliberine and theacrine are methylurates found in the leaves of various Coffea species and Camellia assamica var. kucha, respectively. We previously demonstrated that the methylxanthine caffeine increased theacrine's oral bioavailability in humans. Consequently, we conducted a double-blind, placebo-controlled pharmacokinetic study in humans administered methylliberine, theacrine, and caffeine to determine methylliberine's pharmacokinetic interaction potential with either caffeine or theacrine. Subjects (n = 12) received an oral dose of either methylliberine (25 or 100 mg), caffeine (150 mg), methylliberine (100 mg) plus caffeine (150 mg), or methylliberine (100 mg) plus theacrine (50 mg) using a randomized, double-blind, crossover design. Blood samples were collected over 24 hours and analyzed for methylliberine, theacrine, and caffeine using UPLC-MS/MS. Methylliberine exhibited linear pharmacokinetics that were unaffected by co-administration of either caffeine or theacrine. However, methylliberine co-administration resulted in decreased oral clearance (41.9 ± 19.5 vs. 17.1 ± 7.80 L/hr) and increased half-life (7.2 ± 5.6 vs. 15 ± 5.8 hrs) of caffeine. Methylliberine had no impact on caffeine's maximum concentration (440 ± 140 vs. 458 ± 93.5 ng/mL) or oral volume of distribution (351 ± 148 vs. 316 ± 76.4 L). We previously demonstrated that theacrine bioavailability was enhanced by caffeine; however, caffeine pharmacokinetics were unaffected by theacrine. Herein, we found that methylliberine altered caffeine pharmacokinetics without a reciprocal interaction, which suggests caffeine may interact uniquely with different methylurates. Understanding the mechanism(s) of interaction between methylxanthines and methylurates is of critical importance in light of the recent advent of dietary supplements containing both purine alkaloid classes.
pharmacology and therapeutics
10.1101/2021.01.05.21249234
Caffeine and Methylliberine: A Human Pharmacokinetic Interaction Study
Methylliberine and theacrine are methylurates found in the leaves of various Coffea species and Camellia assamica var. kucha, respectively. We previously demonstrated that the methylxanthine caffeine increased theacrine's oral bioavailability in humans. Consequently, we conducted a double-blind, placebo-controlled pharmacokinetic study in humans administered methylliberine, theacrine, and caffeine to determine methylliberine's pharmacokinetic interaction potential with either caffeine or theacrine. Subjects (n = 12) received an oral dose of either methylliberine (25 or 100 mg), caffeine (150 mg), methylliberine (100 mg) plus caffeine (150 mg), or methylliberine (100 mg) plus theacrine (50 mg) using a randomized, double-blind, crossover design. Blood samples were collected over 24 hours and analyzed for methylliberine, theacrine, and caffeine using UPLC-MS/MS. Methylliberine exhibited linear pharmacokinetics that were unaffected by co-administration of either caffeine or theacrine. However, methylliberine co-administration resulted in decreased oral clearance (41.9 ± 19.5 vs. 17.1 ± 7.80 L/hr) and increased half-life (7.2 ± 5.6 vs. 15 ± 5.8 hrs) of caffeine. Methylliberine had no impact on caffeine's maximum concentration (440 ± 140 vs. 458 ± 93.5 ng/mL) or oral volume of distribution (351 ± 148 vs. 316 ± 76.4 L). We previously demonstrated that theacrine bioavailability was enhanced by caffeine; however, caffeine pharmacokinetics were unaffected by theacrine. Herein, we found that methylliberine altered caffeine pharmacokinetics without a reciprocal interaction, which suggests caffeine may interact uniquely with different methylurates. Understanding the mechanism(s) of interaction between methylxanthines and methylurates is of critical importance in light of the recent advent of dietary supplements containing both purine alkaloid classes.
pharmacology and therapeutics
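The reported clearance and half-life changes in the study above are mutually consistent under the standard one-compartment relation t1/2 = ln(2) · V / CL; a quick check using the abstract's point estimates (treating the reported oral values as apparent V/F and CL/F is our assumption):

```python
import math

def half_life_hours(volume_l, clearance_l_per_hr):
    """One-compartment elimination half-life: t1/2 = ln(2) * V / CL."""
    return math.log(2) * volume_l / clearance_l_per_hr

# Caffeine with methylliberine: CL/F ~ 17.1 L/hr, V/F ~ 316 L
print(round(half_life_hours(316, 17.1), 1))  # ~12.8 hr vs. reported 15 +/- 5.8 hr
# Caffeine alone: CL/F ~ 41.9 L/hr, V/F ~ 351 L
print(round(half_life_hours(351, 41.9), 1))  # ~5.8 hr vs. reported 7.2 +/- 5.6 hr
```

Both computed values fall within the reported standard deviations, supporting the abstract's interpretation that the half-life increase is driven by reduced clearance rather than a change in volume of distribution.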
10.1101/2021.01.04.20232520
Genomic epidemiology of the SARS-CoV-2 epidemic in Zimbabwe: Role of international travel and regional migration in spread
Zimbabwe reported its first case of SARS-CoV-2 infection in March 2020, and case numbers had increased to more than 8,099 by 16 October 2020. An understanding of the SARS-CoV-2 outbreak in Zimbabwe will assist in the implementation of effective public health interventions to control transmission. Nasopharyngeal samples from 92,299 suspected and confirmed COVID-19 cases reported in Zimbabwe between 20 March and 16 October 2020 were obtained. Available demographic data associated with the cases identified as positive (8,099) were analysed to describe the national breakdown of positive cases over time in more detail (geographical location, sex, age and travel history). The whole genome sequence (WGS) of one hundred SARS-CoV-2-positive samples from the first 120 days of the epidemic in Zimbabwe was determined to identify their relationship to one another and to WGS from global samples. Overall, a greater proportion of infections were in males (55.5%) than females (44.85%), although in older age groups more females were affected than males. Most COVID-19 cases (57%) were in the 20-40 age group. Eight lineages, from at least 25 separate introductions into the region, were found using comparative genomics. Of these, 95% had the D614G mutation on the spike protein, which was associated with higher transmissibility than the ancestral strain. Early introductions and spread of SARS-CoV-2 were predominantly associated with genomes common in Europe and the United States of America (USA), and few common in Asia at this time. As the pandemic evolved, travel-associated cases from South Africa and other neighbouring countries were also recorded. Transmission within quarantine centres occurred among travelling nationals returning to Zimbabwe. International and regional migration followed by local transmission were identified as accounting for the development of the SARS-CoV-2 epidemic in Zimbabwe.
Based on this, rapid implementation of public health interventions is critical to reduce local transmission of SARS-CoV-2. The impact of the predominant G614 strain on the severity of symptoms in COVID-19 cases needs further investigation.
public and global health
10.1101/2021.01.05.21249292
A novel age-informed approach for genetic association analysis in Alzheimer's disease
Introduction: Many Alzheimer's disease (AD) genetic association studies disregard age or incorrectly account for it, hampering variant discovery. Methods: Using simulated data, we compared the statistical power of several models: logistic regression on AD diagnosis, adjusted and not adjusted for age; linear regression on a score integrating case-control status and age; and multivariate Cox regression on age-at-onset. We applied these models to real exome-wide data from 11,127 sequenced individuals (54% cases) and replicated suggestive associations in 21,631 genotype-imputed individuals (51% cases). Results: Modelling variable AD risk across age results in a 10-20% statistical power gain compared to logistic regression without age adjustment, while incorrect age adjustment leads to critical power loss. Applying our novel AD-age score and/or Cox regression, we discovered and replicated novel variants associated with AD in the KIF21B, USH2A, RAB10, RIN3 and TAOK2 genes. Discussion: Our AD-age score provides a simple means of statistical power gain and is recommended for future AD studies.
neurology
10.1101/2021.01.05.21249232
In Vivo Assessment of the Safety of Standard Fractionation Temporally Feathered Radiation Therapy (TFRT) for Head and Neck Squamous Cell Carcinoma: An R-IDEAL Stage 1/2a First-in-Humans/Feasibility Demonstration of New Technology Implementation
IntroductionPrior in silico simulations of studies of Temporally Feathered Radiation Therapy (TFRT) have demonstrated potential reduction in normal tissue toxicity. This R-IDEAL Stage 1/2A study seeks to demonstrate the first-in-human implementation of TFRT in treating patients with head and neck squamous cell carcinoma (HNSCC). Materials and MethodsPatients with HNSCC treated with definitive radiation therapy were eligible (70 Gy in 35 fractions) were eligible. The primary endpoint was feasibility of TFRT planning as defined by radiation start within 15 days of CT simulation. Secondary endpoints included estimates of acute grade 3-5 toxicity. ResultsThe study met its accrual goal of 5 patients. TFRT plans were generated in four of the five patients within 15 business days of CT simulation, therefore meeting the primary endpoint. One patient was not treated with TFRT at physicians discretion, though the TFRT plan had been generated within sufficient time from the CT simulation. For patients who received TFRT, the median time from CT simulation to radiation start was 10 business days (range 8-15). The average time required for radiation planning was 6 days. In all patients receiving TFRT, each subplan and every daily fraction was delivered in the correct sequence without error. The OARs feathered included: oral cavity, each submandibular gland, each parotid gland, supraglottis, and posterior pharyngeal wall (OAR pharynx). Prescription dose PTV coverage (>95%) was ensured in each TFRT subplan and the composite TFRT plan. One of five patients developed an acute grade 3 toxicity. ConclusionsThis study demonstrates the first-in-human implementation of TFRT (R-IDEAL Stage 1), proving its feasibility in the modern clinical workflow. Additionally, assessments of acute toxicities and dosimetric comparisons to a standard radiotherapy plan were described (R-IDEAL Stage 2a). 
Highlights: - This prospective study is the first-in-human application of Temporally Feathered Radiation Therapy (TFRT). - In theory, TFRT may reduce radiation-induced toxicities by optimizing the time through which radiation is delivered and consequently improve normal tissue recovery. - In this study, patients with head and neck squamous cell carcinoma were treated with TFRT. - The primary endpoint of technical feasibility was met when patients were successfully treated with TFRT techniques without introducing delays in radiation commencement.
oncology
10.1101/2021.01.05.21249284
Twitter Discourse on Nicotine as Potential Prophylactic or Therapeutic for COVID-19
Background: An unproven "nicotine hypothesis" that indicates nicotine's therapeutic potential for COVID-19 has been proposed in recent literature. This study examines Twitter posts that misinterpret this hypothesis to make baseless claims about benefits of smoking and vaping in the context of COVID-19. We quantify the presence of such misinformation and characterize the tweeters who post such messages. Methods: The Twitter premium API was used to download tweets (n = 17,533) that match terms indicating (a) nicotine or vaping themes, (b) a prophylactic or therapeutic effect, and (c) COVID-19 (January-July 2020) as a conjunctive query. A constraint on the length of the span of text containing the terms in the tweets allowed us to focus on those that convey therapeutic intent. We hand-annotated these filtered tweets and built a classifier that identifies tweets that extrapolate the nicotine hypothesis to smoking/vaping with a positive predictive value of 85%. We analyzed the frequently used terms in author bios, top Web links, and hashtags of such tweets. Results: 21% of our filtered COVID-19 tweets indicate a vaping or smoking-based prevention/treatment narrative. Qualitative analyses show a variety of ways therapeutic claims are being made, and tweeter bios reveal pre-existing positive stances toward vaping. Conclusion: The social media landscape is a double-edged sword in tobacco communication. Although it increases information reach, consumers can also be subject to confirmation bias when exposed to inadvertent or deliberate framing of scientific discourse that may border on misinformation. This calls for circumspection and additional planning in countering such narratives as the COVID-19 pandemic continues to ravage our world. Our results also serve as a cautionary tale in how social media can be leveraged to spread misleading information about tobacco products in the wake of pandemics.
health informatics
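The conjunctive query with a span-length constraint described in this record can be sketched as a simple token-window filter. The term lists and window size below are illustrative assumptions, not the study's actual lexicon or threshold:

```python
import re

# Illustrative term groups; the study's actual lexicon is not reproduced here.
NICOTINE_TERMS = {"nicotine", "vaping", "vape", "smoking"}
THERAPY_TERMS = {"cure", "cures", "treat", "treats", "prevent", "prevents", "protective"}
COVID_TERMS = {"covid", "covid19", "coronavirus", "sars-cov-2"}

MAX_SPAN = 15  # assumed maximum token distance between matched terms

def matches_conjunctive_query(tweet: str, max_span: int = MAX_SPAN) -> bool:
    """Return True if one term from each of the three groups co-occurs
    within a short token span, approximating a therapeutic-intent filter."""
    tokens = re.findall(r"[\w-]+", tweet.lower())
    hits = []  # (token index, group name) for every matched token, in order
    for i, tok in enumerate(tokens):
        if tok in NICOTINE_TERMS:
            hits.append((i, "nicotine"))
        elif tok in THERAPY_TERMS:
            hits.append((i, "therapy"))
        elif tok in COVID_TERMS:
            hits.append((i, "covid"))
    # Slide a window over the matches: all three groups must fall within
    # max_span tokens of the window's first match.
    for j, (start, _) in enumerate(hits):
        groups = set()
        for idx, grp in hits[j:]:
            if idx - start > max_span:
                break
            groups.add(grp)
        if groups == {"nicotine", "therapy", "covid"}:
            return True
    return False
```

The span constraint is what separates tweets that merely mention all three topics from those that assert a therapeutic link between them.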
10.1101/2021.01.05.21249216
Prevalence and Factors associated with Mental health impact of COVID-19 Pandemic in Bangladesh: A survey-based cross-sectional study
Background: Feelings of isolation, insecurity, and instability triggered by COVID-19 could have a long-term impact on the mental health status of individuals. This study examined the prevalence and factors associated with the mental health symptoms of anxiety, depression, and stress during the COVID-19 pandemic in Bangladesh. Methods: From 1st - 30th April 2020, we used a validated self-administered questionnaire to conduct a cross-sectional study of 10,609 participants through an online survey platform. We assessed mental health status using the Depression, Anxiety, and Stress Scale (DASS-21). The total depression, anxiety, and stress subscale scores were divided into normal, mild, moderate, and severe categories, and multinomial logistic regression was used to examine associated factors. Results: The prevalence of depressive symptoms was 15%, 34%, and 15% for mild, moderate, and severe depressive symptoms, respectively. The prevalence of anxiety symptoms was 59% for severe, 14% for moderate, and 14% for mild anxiety symptoms, while the prevalence of stress was 16% for severe, 22% for moderate, and 13% for mild stress levels. Multivariate analyses revealed that the most consistent factors associated with mild, moderate, and severe symptoms on the three mental health subscales (depression, anxiety, and stress) were living in Dhaka and Rangpur division, female sex, self-quarantine in the 7 days before the survey, and experiencing chills, breathing difficulty, dizziness, and sore throat. Conclusion: Our results showed that about 64%, 87%, and 61% of respondents experienced depressive symptoms, anxiety symptoms, and stress, respectively. In Bangladesh, there is a need for better mental health support for females, especially those who live in Dhaka and Rangpur division and experience chills, breathing difficulty, dizziness, and sore throat, during COVID-19 and other future pandemics.
health systems and quality improvement
10.1101/2021.01.04.21249236
Saliva viral load is a dynamic unifying correlate of COVID-19 severity and mortality
While several clinical and immunological parameters correlate with disease severity and mortality in SARS-CoV-2 infection, work remains in identifying unifying correlates of coronavirus disease 2019 (COVID-19) that can be used to guide clinical practice. Here, we examine saliva and nasopharyngeal (NP) viral load over time and correlate them with patient demographics and with cellular and immune profiling. We found that saliva viral load was significantly higher in those with COVID-19 risk factors; that it correlated with increasing levels of disease severity; and that it showed a superior ability over nasopharyngeal viral load as a predictor of mortality over time (AUC=0.90). A comprehensive analysis of immune factors and cell subsets revealed strong predictors of high and low saliva viral load, which were associated with increased disease severity or better overall outcomes, respectively. Saliva viral load was positively associated with many known COVID-19 inflammatory markers such as IL-6, IL-18, IL-10, and CXCL10, as well as type 1 immune response cytokines. Higher saliva viral loads strongly correlated with the progressive depletion of platelets, lymphocytes, and effector T cell subsets including circulating follicular CD4 T cells (cTfh). Anti-spike (S) and anti-receptor binding domain (RBD) IgG levels were negatively correlated with saliva viral load, showing a strong temporal association that could help distinguish severity and mortality in COVID-19. Finally, patients with fatal COVID-19 exhibited higher viral loads, which correlated with the depletion of cTfh cells and lower production of anti-RBD and anti-S IgG. Together these results demonstrate that viral load, as measured in saliva but not the nasopharynx, is a dynamic unifying correlate of disease presentation, severity, and mortality over time.
infectious diseases
10.1101/2021.01.06.21249312
Long term impact on lung function of patients with moderate and severe COVID-19. A prospective cohort study
Introduction: A significant number of patients continue to recover from COVID-19; however, little is known about the lung function capacity among survivors. We aim to determine the long-term impact on lung function capacity in patients who have survived moderate or severe COVID-19 disease in a resource-poor setting. Methods and analysis: This prospective cohort study will include patients aged 15 years and above who have a positive reverse transcriptase-polymerase chain reaction (RT-PCR) test for COVID-19 (nasopharyngeal or oropharyngeal). Patients with a pre-existing diagnosis of obstructive or interstitial lung disease, lung fibrosis and cancers, connective tissue disorders, autoimmune conditions affecting the lungs, underlying heart disease, or a history of syncope, and those who refuse to participate, will be excluded. Pulmonary function will be assessed using spirometry and diffusion lung capacity for carbon monoxide (DLCO) at three- and six-month intervals. A chest X-ray will be performed at the three- and six-month follow-ups, and a chest CT if clinically indicated after consultation with the study pulmonologist or Infectious Disease (ID) physician. An echocardiogram (ECHO) will be performed to look for pulmonary hypertension at the three-month visit and repeated at six months if any abnormality is identified initially. Data analysis will be performed using standard statistical software. Ethics and dissemination: The proposal was reviewed and approved by the ethics review committee (ERC) of the institution (ERC reference number 2020-4735-11311). Informed consent will be obtained from each study participant. The results will be disseminated among study participants and at the institutional, provincial and national levels through seminars and presentations. Moreover, the scientific findings will be published in high-impact peer-reviewed medical journals.
Strengths and limitations of this study: - The study has the potential to develop context-specific evidence on the long-term impact on lung function among COVID-19 survivors. - Findings will play a key role in understanding the impact of the disease on vital functions and help devise rehabilitative strategies to best overcome the effects of the disease. - This is a single-center study recruiting only a limited number of COVID-19 survivors. - Study participants may be lost to follow-up due to uncertain conditions and disease reemergence.
infectious diseases
10.1101/2021.01.05.20249061
THE INTESTINAL AND ORAL MICROBIOMES ARE ROBUST PREDICTORS OF COVID-19 SEVERITY, THE MAIN PREDICTOR OF COVID-19-RELATED FATALITY
The reason for the striking differences in clinical outcomes of SARS-CoV-2 infected patients is still poorly understood. While most recover, a subset of people become critically ill and succumb to the disease. Thus, identification of biomarkers that can predict the clinical outcomes of COVID-19 disease is key to help prioritize patients needing urgent treatment. Given that an unbalanced gut microbiome is a reflection of poor health, we aim to identify indicator species that could predict COVID-19 disease clinical outcomes. Here, for the first time and with the largest COVID-19 patient cohort reported for microbiome studies, we demonstrated that the intestinal and oral microbiome make-up predicts, with 92% and 84% accuracy respectively (Area Under the Curve, AUC), severe COVID-19 respiratory symptoms that lead to death. The accuracy of the microbiome prediction of COVID-19 severity was found to be far superior to that of similar models trained on information from comorbidities often used to triage patients in the clinic (77% AUC). Additionally, by combining symptoms, comorbidities, and the intestinal microbiota, the model reached the highest AUC, at 96%. Remarkably, the model trained on the stool microbiome found enrichment of Enterococcus faecalis, a known pathobiont, to be the top predictor of COVID-19 disease severity. Enterococcus faecalis is already easily cultivable in clinical laboratories; as such, we urge the medical community to include this bacterium as a robust predictor of COVID-19 severity when assessing risk stratification of patients in the clinic.
infectious diseases
10.1101/2021.01.05.21249240
Fever, Diarrhea, and Severe Disease Correlate with High Persistent Antibody Levels against SARS-CoV-2
Lasting immunity will be critical for overcoming the coronavirus disease 2019 (COVID-19) pandemic caused by severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2). However, factors that drive the development of high titers of anti-SARS-CoV-2 antibodies and how long those antibodies persist remain unclear. Our objective was to comprehensively evaluate anti-SARS-CoV-2 antibodies in a clinically diverse COVID-19 convalescent cohort at defined time points to determine if anti-SARS-CoV-2 antibodies persist and to identify clinical and demographic factors that correlate with high titers. Using a novel multiplex assay to quantify IgG against four SARS-CoV-2 antigens, a receptor binding domain-angiotensin converting enzyme 2 inhibition assay, and a SARS-CoV-2 neutralization assay, we found that 98% of COVID-19 convalescent subjects had anti-SARS-CoV-2 antibodies five weeks after symptom resolution (n=113). Further, antibody levels did not decline three months after symptom resolution (n=79). As expected, greater disease severity, older age, male sex, obesity, and higher Charlson Comorbidity Index score correlated with increased anti-SARS-CoV-2 antibody levels. We demonstrated for the first time that COVID-19 symptoms, namely fever, abdominal pain, diarrhea and low appetite, correlated consistently with higher anti-SARS-CoV-2 antibody levels. Our results provide new insights into the development and persistence of anti-SARS-CoV-2 antibodies.
infectious diseases
10.1101/2021.01.06.21249314
Impaired performance of SARS-CoV-2 antigen-detecting rapid tests at elevated temperatures
Rapid antigen-detecting tests (Ag-RDTs) can complement molecular diagnostics for COVID-19. The recommended temperature for storage of SARS-CoV-2 Ag-RDTs ranges between 5-30°C. In many countries that would benefit from SARS-CoV-2 Ag-RDTs, mean temperatures exceed 30°C. We assessed analytical sensitivity and specificity of eleven commercially available SARS-CoV-2 Ag-RDTs using different storage and operational temperatures, including (i) long-term storage and testing at recommended conditions, (ii) recommended storage conditions followed by 10 minutes exposure to 37°C and testing at 37°C and (iii) 3 weeks storage followed by testing at 37°C. The limits of detection of SARS-CoV-2 Ag-RDTs under recommended conditions ranged from 8.2x105-7.9x107 genome copies/ml of infectious SARS-CoV-2 cell culture supernatant. Despite long-term storage at recommended conditions, 10 minutes pre-incubation of Ag-RDTs and testing at 37°C resulted in about ten-fold reduced sensitivity for 46% of SARS-CoV-2 Ag-RDTs, including both Ag-RDTs currently listed for emergency use by the World Health Organization. After 3 weeks of storage at 37°C, 73% of SARS-CoV-2 Ag-RDTs exhibited about ten-fold reduced sensitivity. Specificity of SARS-CoV-2 Ag-RDTs using cell culture-derived human coronaviruses HCoV-229E and HCoV-OC43 was not affected by storage and testing at 37°C. In summary, short- and long-term exposure to elevated temperatures likely impairs sensitivity of several SARS-CoV-2 Ag-RDTs that may translate to false-negative test results at clinically relevant virus concentrations compatible with inter-individual transmission. Ensuring appropriate transport and storage conditions, and development of tests that are more robust across temperature fluctuations will be important for accurate use of SARS-CoV-2 Ag-RDTs in tropical settings.
infectious diseases
10.1101/2021.01.05.21249293
Modelling Decay of Population Immunity With Proposed Second Dose Deferral Strategy
A deferred second dose strategy has been proposed to increase initial population immunity as an alternative to the default two-dose vaccine regimen with spacing of 21 or 28 days between vaccine doses for the mRNA vaccines from Pfizer and Moderna. This increased initial population immunity is only of value if one-dose immunity does not decay so fast as to nullify the benefit. Because the decay rates of one-dose and two-dose efficacy are currently unknown, a model to project population immunity under the two strategies was created. By varying the decay rate of one-dose efficacy, the decay rate of two-dose efficacy, and the time until the second dose is given, the model shows that even if one-dose efficacy decays faster than two-dose efficacy, this is highly unlikely to nullify the benefit of the increased population immunity seen in a second dose deferral strategy. Rather, all reasonable scenarios strongly favour a second dose deferral strategy, with much higher projected population immunity in comparison to the default regimen.
allergy and immunology
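A toy version of the projection model in this record can make the comparison concrete. With a fixed daily dose supply, deferring second doses puts first doses into more arms early; summing exponentially decaying per-person efficacy over the population compares the two strategies. All parameter values below (efficacies, decay rates, supply) are illustrative assumptions, not the study's fitted rates:

```python
import math

def simulate(gap, horizon, doses_per_day=100,
             eff1=0.80, eff2=0.95, decay1=0.004, decay2=0.001):
    """Total population immunity (sum of per-person efficacies) at day
    `horizon` when second doses are given `gap` days after the first,
    under a fixed daily dose supply. Efficacy decays exponentially from
    the day of the most recent dose; all rates here are assumptions."""
    first = {}   # dose day -> people whose latest dose is a first dose
    second = {}  # dose day -> people who received a second dose that day
    for day in range(horizon):
        supply = doses_per_day
        due_day = day - gap
        if due_day in first:  # second doses take priority once due
            given = min(supply, first[due_day])
            first[due_day] -= given
            second[day] = second.get(day, 0) + given
            supply -= given
        if supply > 0:        # remaining doses start new recipients
            first[day] = first.get(day, 0) + supply
    immunity = sum(n * eff1 * math.exp(-decay1 * (horizon - d))
                   for d, n in first.items())
    immunity += sum(n * eff2 * math.exp(-decay2 * (horizon - d))
                    for d, n in second.items())
    return immunity
```

Under these assumed rates, an 84-day deferral yields higher total immunity at day 90 than the default 21-day gap, in line with the record's conclusion that deferral is favoured unless one-dose protection decays implausibly fast.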
10.1101/2021.01.06.21249311
Burden is in the eye of the beholder: Sensitivity of yellow fever disease burden estimates to modeling assumptions
Geographically stratified estimates of disease burden play an important role in setting priorities for the management of different diseases and for targeting interventions against a single disease. Such estimates involve numerous assumptions, about which uncertainty is not always well accounted for. We developed a framework for estimating the burden of yellow fever in Africa and evaluated its sensitivity to assumptions about the interpretation of serological data and the choice of regression model. We addressed the latter with an ensemble approach, and we found that the former resulted in a nearly twentyfold difference in burden estimates (range of central estimates: 8.4x104-1.5x106 deaths in 2021-2030). Even so, statistical uncertainty made even greater contributions to variance in burden estimates (87%). Combined with estimates that most infections go unreported (range of 95% credible intervals: 99.65-99.99%), our results suggest that yellow fever's burden will remain highly uncertain without major improvements in surveillance.
epidemiology
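The ensemble approach mentioned in this record can be sketched minimally: pooling posterior draws from several regression models yields a central estimate and interval that propagate between-model uncertainty. Equal weighting is an assumption here; the study's actual weighting scheme is not reproduced:

```python
from statistics import mean

def ensemble_estimate(model_draws):
    """Pool posterior draws from several regression models with equal
    weights (an assumed weighting), returning a central estimate and an
    equal-tailed 95% interval that reflects between-model spread."""
    pooled = sorted(x for draws in model_draws for x in draws)
    n = len(pooled)
    lo = pooled[int(0.025 * n)]
    hi = pooled[min(int(0.975 * n), n - 1)]
    return mean(pooled), (lo, hi)
```

Because the pooled interval spans all models' draws, it is wider than any single model's interval whenever the models disagree, which is the point of the ensemble.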
10.1101/2021.01.05.21249283
Combining machine learning and mathematical models of disease dynamics to guide development of novel disease interventions
Background: Substantial research is underway to develop next-generation interventions that address current malaria control challenges. As there is limited testing in their early development, it is difficult to predefine intervention properties such as efficacy that achieve target health goals, and therefore challenging to prioritize selection of novel candidate interventions. Here, we present a quantitative approach to guide intervention development using mathematical models of malaria dynamics coupled with machine learning. Our analysis identifies requirements of efficacy, coverage, and duration of effect for five novel malaria interventions to achieve targeted reductions in malaria prevalence. This study highlights the role of mathematical models to support intervention development. Methods: A mathematical model of malaria transmission dynamics is used to simulate deployment and predict potential impact of new malaria interventions by considering operational, health-system, population, and disease characteristics. Our method relies on consultation with product development stakeholders to define the putative space of novel intervention specifications. We couple the disease model with machine learning to search this multi-dimensional space and efficiently identify optimal intervention properties that achieve specified health goals. We demonstrate the power of our approach by application to five malaria interventions in development. Results: Aiming for malaria prevalence reduction, we identify and quantify key determinants of intervention impact along with their minimal properties required to achieve the desired health goals. While coverage is generally identified as the largest driver of impact, higher efficacy, longer protection duration or multiple deployments per year are needed to increase prevalence reduction. We show that the efficacy and duration needs depend on the biological action of the interventions.
Interventions on multiple parasite or vector targets, as well as combinations of the new interventions with drug treatment, lead to significant burden reductions and lower efficacy or duration requirements. Conclusions: Our approach uses disease dynamic models and machine learning to support decision-making and resource investment, facilitating development of new malaria interventions. By evaluating intervention capabilities in relation to the targeted health goal, our analysis allows prioritization of interventions and of their specifications from an early stage in development, and subsequent investments to be channeled cost-effectively towards impact maximization. Although we focus on five malaria interventions, the analysis is generalizable to other new malaria interventions.
epidemiology
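The search over intervention properties described in this record can be illustrated with a toy sketch. The study couples a full transmission model with machine-learning emulators; here, a plain grid search over an assumed response surface stands in for both, returning the least demanding (coverage, efficacy, duration) profiles that meet a target prevalence reduction:

```python
import itertools

def prevalence_reduction(coverage, efficacy, duration_days, period_days=365):
    """Assumed toy response surface: impact scales with coverage, efficacy,
    and the fraction of the year the intervention protects."""
    protected_fraction = min(duration_days / period_days, 1.0)
    return coverage * efficacy * protected_fraction

def minimal_requirements(target, coverages, efficacies, durations):
    """All (coverage, efficacy, duration) combinations meeting the target
    prevalence reduction, least demanding profiles first."""
    feasible = [(c, e, d) for c, e, d in
                itertools.product(coverages, efficacies, durations)
                if prevalence_reduction(c, e, d) >= target]
    return sorted(feasible)
```

Replacing the toy response surface with calls to a transmission-model emulator turns this enumeration into the kind of multi-dimensional search the record describes.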
10.1101/2021.01.05.21249283
Combining machine learning and mathematical models of disease dynamics to guide development of novel malaria interventions
Background: Substantial research is underway to develop next-generation interventions that address current malaria control challenges. As there is limited testing in their early development, it is difficult to predefine intervention properties such as efficacy that achieve target health goals, and therefore challenging to prioritize selection of novel candidate interventions. Here, we present a quantitative approach to guide intervention development using mathematical models of malaria dynamics coupled with machine learning. Our analysis identifies requirements of efficacy, coverage, and duration of effect for five novel malaria interventions to achieve targeted reductions in malaria prevalence. This study highlights the role of mathematical models to support intervention development. Methods: A mathematical model of malaria transmission dynamics is used to simulate deployment and predict potential impact of new malaria interventions by considering operational, health-system, population, and disease characteristics. Our method relies on consultation with product development stakeholders to define the putative space of novel intervention specifications. We couple the disease model with machine learning to search this multi-dimensional space and efficiently identify optimal intervention properties that achieve specified health goals. We demonstrate the power of our approach by application to five malaria interventions in development. Results: Aiming for malaria prevalence reduction, we identify and quantify key determinants of intervention impact along with their minimal properties required to achieve the desired health goals. While coverage is generally identified as the largest driver of impact, higher efficacy, longer protection duration or multiple deployments per year are needed to increase prevalence reduction. We show that the efficacy and duration needs depend on the biological action of the interventions.
Interventions on multiple parasite or vector targets, as well as combinations of the new interventions with drug treatment, lead to significant burden reductions and lower efficacy or duration requirements. Conclusions: Our approach uses disease dynamic models and machine learning to support decision-making and resource investment, facilitating development of new malaria interventions. By evaluating intervention capabilities in relation to the targeted health goal, our analysis allows prioritization of interventions and of their specifications from an early stage in development, and subsequent investments to be channeled cost-effectively towards impact maximization. Although we focus on five malaria interventions, the analysis is generalizable to other new malaria interventions.
epidemiology
10.1101/2021.01.05.21249283
Leveraging mathematical models of disease dynamics and machine learning to improve development of novel malaria interventions
BackgroundSubstantial research is underway to develop next-generation interventions that address current malaria control challenges. As there is limited testing in their early development, it is difficult to predefine intervention properties such as efficacy that achieve target health goals, and therefore challenging to prioritize selection of novel candidate interventions. Here, we present a quantitative approach to guide intervention development using mathematical models of malaria dynamics coupled with machine learning. Our analysis identifies requirements of efficacy, coverage, and duration of effect for five novel malaria interventions to achieve targeted reductions in malaria prevalence. This study highlights the role of mathematical models to support intervention development. MethodsA mathematical model of malaria transmission dynamics is used to simulate deployment and predict potential impact of new malaria interventions by considering operational, health-system, population, and disease characteristics. Our method relies on consultation with product development stakeholders to define the putative space of novel intervention specifications. We couple the disease model with machine learning to search this multi-dimensional space and efficiently identify optimal intervention properties that achieve specified health goals. We demonstrate the power of our approach by application to five malaria interventions in development. ResultsAiming for malaria prevalence reduction, we identify and quantify key determinants of intervention impact along with their minimal properties required to achieve the desired health goals. While coverage is generally identified as the largest driver of impact, higher efficacy, longer protection duration or multiple deployments per year are needed to increase prevalence reduction. We show that the efficacy and duration needs depend on the biological action of the interventions. 
Interventions on multiple parasite or vector targets, as well as combinations of the new interventions with drug treatment, lead to significant burden reductions and lower efficacy or duration requirements. ConclusionsOur approach uses disease dynamics models and machine learning to support decision-making and resource investment, facilitating development of new malaria interventions. By evaluating intervention capabilities in relation to the targeted health goal, our analysis allows prioritization of interventions and their specifications from an early stage in development, and subsequent investments to be channeled cost-effectively towards impact maximization. Although we focus on five malaria interventions, the analysis is generalizable to other new malaria interventions.
epidemiology
10.1101/2021.01.05.21249269
Acceptability and efficacy of vaginal self-sampling for genital infection and bacterial vaginosis: A large, cross-sectional, non-inferiority trial
ObjectiveScreening for genital infection (GI), bacterial vaginosis (BV), sexually transmitted infection (STI) and asymptomatic carriage of group B streptococcus (GBS) in pregnant women is a common reason for medical appointments. The objectives were, first, to determine the non-inferiority of vaginal self-sampling compared with vaginal/cervical classical sampling to screen for GIs, BV, STIs, and asymptomatic GBS carriage in pregnant women; second, to determine the feasibility of vaginal self-sampling. MethodsVaginal self-sampling (VSS) and vaginal/cervical classical sampling (VCS) were carried out simultaneously on each of 1,027 women, with samples collected by health care professionals. Bacterial infection, yeast infection, Chlamydia trachomatis, Neisseria gonorrhoeae, Mycoplasma genitalium, Trichomonas vaginalis and Herpes simplex virus types were systematically screened in both paired VSS and VCS samples. ResultsStatistical tests supported the non-inferiority of VSS compared with VCS. Agreement between VCS and VSS remained high regardless of the type of infection studied. VSS had good diagnostic performance, with a negative predictive value (NPV) over 90% for all studied infections. Most participants (84%) recommended the use of VSS. ConclusionsThis study remains the most exhaustive in screening for GI, BV, STI agents and asymptomatic GBS carriage. Given its efficacy and acceptability, VSS seems to be a viable alternative to classic physician sampling among women in the general population. This study provides evidence that vaginal self-sampling can be used as a universal specimen for detection of lower genital tract infections in women. Study identification number: ID-RCB 2014-A01250-4
infectious diseases
10.1101/2021.01.05.20248768
Thinner cortex is associated with psychosis onset in individuals at Clinical High Risk for Developing Psychosis: An ENIGMA Working Group mega-analysis
ImportanceThe ENIGMA clinical high risk for psychosis (CHR) initiative, the largest pooled CHR-neuroimaging sample to date, aims to discover robust neurobiological markers of psychosis risk in a sample with known heterogeneous outcomes. ObjectiveWe investigated baseline structural neuroimaging differences between CHR subjects and healthy controls (HC), and between CHR participants who later developed a psychotic disorder (CHR-PS+) and those who did not (CHR-PS-). We assessed associations with age by group and conversion status, and similarities between the patterns of effect size maps for psychosis conversion and those found in other large-scale psychosis studies. Design, Setting, and ParticipantsBaseline T1-weighted MRI data were pooled from 31 international sites participating in the ENIGMA CHR Working Group. MRI scans were processed using harmonized protocols and analyzed within a mega- and meta-analysis framework from January-October 2020. Main Outcome(s) and Measure(s)Measures of regional cortical thickness (CT), surface area (SA), and subcortical volumes were extracted from T1-weighted MRI scans. Independent variables were group (CHR, HC) and conversion status (CHR-PS+, CHR-PS-, HC). ResultsThe final dataset consisted of 3,169 participants (CHR=1,792, HC=1,377, age range: 9.5 to 39.8 years, 45% female). Using longitudinal clinical information, we identified CHR-PS+ (N=253) and CHR-PS- (N=1,234). CHR exhibited widespread thinner cortex compared to HC (average d=-0.125, range: -0.09 to -0.17), but not SA or subcortical volume. Thinner cortex in the fusiform, superior temporal, and paracentral regions was associated with psychosis conversion (average d=-0.22). Age showed a stronger negative association with left fusiform and left paracentral CT in HC, compared to CHR-PS+. Regional CT psychosis conversion effect sizes resembled patterns of CT alterations observed in other ENIGMA studies of psychosis. 
Conclusions and RelevanceWe provide evidence for widespread subtle CT reductions in CHR. The pattern of regions displaying greater CT alterations in CHR-PS+ was similar to those reported in other large-scale investigations of psychosis. Additionally, a subset of these regions displayed abnormal age associations. Widespread CT disruptions coupled with abnormal age associations in CHR may point to disruptions in postnatal brain developmental processes. Key Points: Question: How do baseline brain morphometric features relate to later psychosis conversion in individuals at clinical high risk (CHR)? Findings: In the largest coordinated international analysis to date, reduced baseline cortical thickness, but not cortical surface area or subcortical volume, was more pronounced in CHR, in a manner highly consistent with thinner cortex in established psychosis. Regions that displayed greater cortical thinning in future psychosis converters additionally displayed abnormal associations with age. Meaning: CHR status and later transition to psychosis is robustly associated with reduced cortical thickness. Abnormal age associations and specificity to cortical thickness may point to aberrant postnatal brain development in CHR, including pruning and myelination.
psychiatry and clinical psychology
10.1101/2021.01.05.20249027
Adaptive immune responses to SARS-CoV-2 in recovered severe COVID-19 patients
ObjectivesThere is an imperative need to determine the durability of adaptive immunity to SARS-CoV-2. We enumerated SARS-CoV-2-reactive CD4+ and CD8+ T cells targeting the S1 and M proteins and measured RBD-specific serum IgG over a period of 2-6 months after symptom onset in a cohort of subjects who had recovered from severe clinical forms of COVID-19. MethodsWe recruited 58 patients (38 males and 20 females; median age, 62.5 years) who had been hospitalized with bilateral pneumonia, 60% with one or more comorbidities. IgG antibodies binding to SARS-CoV-2 RBD were measured by ELISA. SARS-CoV-2-reactive, CD69-expressing, IFN-γ-producing CD4+ and CD8+ T cells were enumerated in heparinized whole blood by flow cytometry with intracellular cytokine staining (ICS). ResultsDetectable SARS-CoV-2-S1/M-reactive CD69+ IFN-γ CD4+ and CD8+ T cells were found in 17 (29.3%) and 6 (10.3%) subjects, respectively, at a median of 84 days after onset of symptoms (range, 58-191 days). Concurrent comorbidities increased the risk (OR, 3.15; 95% CI, 1.03-9.61; P=0.04) of undetectable T-cell responses in models adjusted for age, sex and hospitalization ward. Twenty-one of the 35 patients (60%) had detectable RBD-specific serum IgG at a median of 118 days (range, 60 to 145 days) after symptom onset. SARS-CoV-2 RBD-specific IgG serum levels were found to drop significantly over time. ConclusionA relatively limited number of subjects who developed severe forms of COVID-19 had detectable SARS-CoV-2-S1/M IFN-γ CD4+ and CD8+ T cells at midterm after clinical diagnosis. Our data also indicate that serum levels of RBD-specific IgG decline over time, becoming undetectable in some patients.
infectious diseases
10.1101/2021.01.06.20240903
The spread of breathing air from wind instruments and singers using schlieren techniques
In this article, the spread of breathing air when playing wind instruments and singing was investigated and visualized using two methods: (1) schlieren imaging with a schlieren mirror and (2) background-oriented schlieren (BOS). These methods visualize airflow by revealing density gradients in transparent media. The playing of professional woodwind and brass instrument players, as well as the singing of professionally trained classical singers, was investigated to estimate the spread distances of the breathing air. For better comparison and a consistent measurement series, a single high note and a single low note, as well as an extract of a musical piece, were investigated. Additionally, anemometry was used to determine the velocity of the spreading breathing air and the extent to which it was still quantifiable. The results presented in this article show that no airflow escaping from the instruments is transported farther than 1.2 m into the room. However, differences among the various instruments have to be considered to properly assess the spread of the breathing air. The findings discussed below help to estimate the risk of cross-infection for wind instrument players and singers and to develop efficacious safety precautions, which is essential during critical health periods such as the current COVID-19 pandemic.
occupational and environmental health
10.1101/2021.01.06.20249035
Dynamics of antibodies to SARS-CoV-2 in convalescent plasma donors
The novel SARS-CoV-2 virus emerged in late 2019 and has caused a global health and economic crisis. Characterization of the human antibody response to SARS-CoV-2 infection is vital for serosurveillance purposes as well as for treatment options such as transfusion with convalescent plasma or immunoglobulin products derived from convalescent plasma. In this study, we measured antibody responses in 844 longitudinal samples from 151 RT-PCR-positive SARS-CoV-2 convalescent adults during the first 34 weeks after onset of symptoms. All donors were seropositive at the first sampling moment and only one donor seroreverted during follow-up. Anti-RBD IgG and anti-nucleocapsid IgG levels slowly declined, with median half-lives of 62 and 59 days, respectively, during 2-5 months after symptom onset. The rate of decline of antibody levels diminished during extended follow-up. In addition, the magnitude of the IgG response correlated with neutralization capacity measured in a classic plaque reduction assay as well as in our in-house-developed competition assay. The results of this study give valuable insight into the longitudinal antibody response to SARS-CoV-2.
infectious diseases
10.1101/2021.01.06.21249332
No association between the SARS-CoV-2 variants and mortality rates in the Eastern Mediterranean Region
As the novel coronavirus SARS-CoV-2 continues to spread in all countries, there is a growing interest in monitoring and understanding the impact of emerging strains on virus transmission and disease severity. Here, we analyzed SARS-CoV-2 genomic sequences reported in the Eastern Mediterranean Region (EMR) countries as of 1 January 2021. The majority (~75%) of these sequences originated from three of the 22 EMR countries, and 65.8% of all sequences belonged to GISAID clades GR, GH, G and GV. A delay of 30-150 days from sample collection to sequence submission was observed across all countries, limiting the utility of such data in informing public health policies. We identified ten common non-synonymous mutations represented among SARS-CoV-2 in the EMR and several country-specific ones. Two substitutions, spike_D614G and NSP12_P323L, were predominantly concurrent in most countries. While the occurrence of NSP12_P323L alone was positively correlated with higher case fatality rates in the EMR, no such association was established for the concurrent (spike_D614G and NSP12_P323L) variants across the region. Our study identified critical data gaps in the EMR, highlighting the importance of enhancing surveillance and sequencing capacities in the region.
epidemiology
10.1101/2021.01.05.20249082
An Ebola virus disease model with fear and environmental transmission dynamics
Recent Ebola virus disease (EVD) outbreaks have involved not only interactions between humans but also a complex interplay of environmental, human and socio-economic factors. Changes in human behaviour as a result of fear can also affect disease transmission dynamics. In this paper, a compartmental model is used to study the dynamics of EVD incorporating fear and environmental transmission. We formulate a fear-dependent contact rate function to measure the rate of person-to-person, as well as pathogen-to-person, transmission. The epidemic threshold and the model equilibria are determined, and their stabilities are analysed. The model is validated by fitting it to data from the 2019 and 2020 EVD outbreaks in the Democratic Republic of Congo. Our results suggest that the fear of death from EVD may reduce transmission and aid control of the disease, but it is not sufficient to eradicate the disease. Policymakers also need to implement other control measures, such as case finding, media campaigns, quarantine, an increase in the number of beds in Ebola treatment centers, good laboratory services, safe burials and social mobilisation, to eradicate the disease. Highlights: Due to its high case fatality rate, EVD undoubtedly instills fear in the inhabitants of any affected community. We propose an Ebola model with fear, which considers the pathogens in the environment, to quantify the effect of fear and environmental transmission on EVD dynamics. The fear of death from Ebola is proportional to the Ebola disease transmission rate. At high levels of fear, the number of EVD cases decreases.
epidemiology
10.1101/2021.01.06.21249339
Continued need for non-pharmaceutical interventions after COVID-19 vaccination in long-term-care facilities
Long-term care facilities (LTCFs) bear a disproportionate burden of COVID-19 and are prioritized for vaccine deployment. LTCF outbreaks could continue to occur during vaccine rollout due to incomplete population coverage, and the effect of vaccines on viral transmission is currently unknown. Declining adherence to non-pharmaceutical interventions (NPIs) against within-facility transmission could therefore limit the effectiveness of vaccination. We built a stochastic model to simulate outbreaks in LTCF populations with differing vaccination coverage and NPI adherence to evaluate their interacting effects. Vaccination combined with strong NPI adherence produced the least morbidity and mortality. Healthcare worker vaccination improved outcomes in unvaccinated LTCF residents but was less impactful with declining NPI adherence. To prevent further illness and deaths, there is a continued need for NPIs in LTCFs during vaccine rollout.
epidemiology
10.1101/2021.01.06.21249322
Indirect impacts of the COVID-19 pandemic at two tertiary neonatal units in Zimbabwe and Malawi: an interrupted time series analysis
BackgroundDeaths from COVID-19 have exceeded 1.8 million globally (January 2021). We examined trends in markers of neonatal care before and during the pandemic at two tertiary neonatal units in Zimbabwe and Malawi. MethodsWe analysed data collected prospectively via the NeoTree app at Sally Mugabe Central Hospital (SMCH), Zimbabwe, and Kamuzu Central Hospital (KCH), Malawi. Neonates admitted from 1 June 2019 to 25 September 2020 were included. We modelled the impact of the first cases of COVID-19 (Zimbabwe: 20 March 2020; Malawi: 3 April 2020) on the number of admissions, gestational age and birth weight, source of admission referrals, prevalence of neonatal encephalopathy, and overall mortality. FindingsThe study included 3,450 neonates at SMCH and 3,350 neonates at KCH. Admission numbers at SMCH did not initially change after the first case of COVID-19 but fell by 48% during a nurses' strike (relative risk (RR) 0.52, 95% CI 0.40-0.68, p<0.002). At KCH, admissions dropped by 42% (RR 0.58; 95% CI 0.48-0.70; p<0.001) soon after the first case of COVID-19. At KCH, gestational age and birth weight decreased slightly (1 week, 300 grams), outside referrals dropped by 28%, and there was a slight weekly increase in mortality. No changes in these outcomes were found at SMCH. InterpretationThe indirect impacts of COVID-19 are context-specific. While this study provides vital evidence to inform health providers and policy makers, national data are required to ascertain the true impacts of the pandemic on newborn health. FundingInternational Child Health Group, Wellcome Trust.
RESEARCH IN CONTEXT: Evidence before this study: We searched PubMed for evidence of the indirect impact of the COVID-19 pandemic on neonatal care in low-income settings using the search terms neonat* or newborn, and COVID-19 or SARS-CoV-2 or coronavirus, and the Cochrane low- and middle-income country (LMIC) filters, with no language limits, between 01.10.2019 and 21.11.20. While there has been a decrease in global neonatal mortality rates, the smaller improvements seen in low-income settings are threatened by the direct and indirect impacts of the COVID-19 pandemic. A modelling study of this threat predicted between 250,000 and 1.1 million extra neonatal deaths as a result of decreased service provision and access in LMICs. A webinar and survey of frontline maternal/newborn healthcare workers in more than 60 countries reported a decline in both service attendance and quality of service across the ante-, peri- and post-natal journey. Respondents reported fear of attending services, difficulty in access, and a decrease in service quality due to exacerbation of existing service weaknesses, confusion over guidelines and understaffing. Similar findings were reported in a survey of healthcare workers providing childhood and maternal vaccines in LMICs. One study to date has reported data from Nepal describing an increase in stillbirths and neonatal deaths, with institutional deliveries nearly halved during lockdown. Added value of this study: To our knowledge, this is the first and only study in Sub-Saharan Africa describing the impact of the COVID-19 pandemic on health service access and outcomes for newborns in two countries. We analysed data from the digital quality improvement and data collection tool, the NeoTree, to carry out an interrupted time series analysis of newborn admission rates, gestational age, birth weight, diagnosis of hypoxic ischaemic encephalopathy and mortality from two large hospitals in Malawi and Zimbabwe (n≈7,000 babies).
We found that the indirect impacts of COVID-19 were context-specific. At Sally Mugabe Central Hospital, Zimbabwe, initial resilience was demonstrated, in that there was no evidence of change in mortality, birth weight or gestational age. In comparison, at Kamuzu Central Hospital, Malawi, soon after the first case of COVID-19, the data revealed a fall in admissions (by 42%), gestational age (1 week), birth weight (300 grams) and outside referrals (by 28%), and a slight weekly increase in mortality (2%). In the Zimbabwean hospital, admission numbers did not initially change after the first case of COVID-19 but fell by 48% during a nurses' strike, which was itself a response to challenges exacerbated by the pandemic. Implications of all the available evidence: Our data confirm the reports from frontline healthcare workers of a perceived decline in neonatal service access and provision in LMICs. Digital routine healthcare data capture enabled rapid profiling of the indirect impacts of COVID-19 on newborn care and outcomes in two tertiary referral hospitals in Malawi and Zimbabwe. While a decrease in service access was seen in both countries, the impacts on care provided and outcomes differed by national context. Health systems strengthening, for example digital data capture, may assist in planning context-specific mitigation efforts.
pediatrics
10.1101/2021.01.06.21249341
Construction of a demand and capacity model for intensive care and hospital ward beds, and mortality from COVID-19
BackgroundThis paper describes the construction of a model used to estimate the number of excess deaths that could be expected as a direct consequence of a lack of hospital bed and intensive care unit (ICU) capacity. MethodsA series of compartmental models was used to estimate the number of deaths under different combinations of care required (ICU or ward) and care received (ICU, ward or no care) in England up to the end of April 2021. Model parameters were sourced from publicly available government information, organisations collating COVID-19 data, and calculations using existing parameters. A compartmental sub-model was used to estimate the mortality scalars that represent the increase in mortality that would be expected from a lack of provision of an ICU or general ward bed when one is required. Three illustrative scenarios for admission numbers (Optimistic, Middling and Pessimistic) are described, showing how the model can be used to estimate mortality rates under different scenarios of capacity. ResultsThe key output of our collaboration was the model itself rather than the results of any of the scenarios. The model allows a user to understand the excess mortality impact arising as a direct consequence of capacity being breached under various scenarios or forecasts of hospital admissions. The scenarios described in this paper are illustrative and are not forecasts. There were no excess deaths from a lack of capacity in any of the Optimistic scenario applications in sensitivity analysis. Several of the Middling scenario applications under sensitivity testing resulted in excess deaths directly attributable to a lack of capacity. Most excess deaths arose when we modelled a 20% reduction compared to the best-estimate ICU capacity; this led to 597 deaths (a 0.7% increase). All the Pessimistic scenario applications under sensitivity analysis had excess deaths. These ranged from 49,219 (a 19.4% increase) when we modelled a 20% increase in ward bed availability over the best estimate, to 103,845 (a 40.9% increase) when we modelled a 20% shortfall in ward bed availability below the best estimate. The emergence of a new, more transmissible variant (VOC 202012/01) increases the likelihood of real-world outcomes at, or beyond, those modelled in our Pessimistic scenario. The results can be explained by considering how capacity evolves in each of the scenarios. In the Middling scenario, whilst ICU capacity may be approached and possibly even breached, there remains sufficient ward capacity to accommodate patients who need either ward or ICU support, keeping excess deaths relatively low. However, the Pessimistic scenario sees ward capacity breached, in many scenarios for a period of several weeks, resulting in much higher mortality among those who require care but do not receive it. ConclusionsNo excess deaths from breaching capacity would be expected under the unadjusted Optimistic assumptions of demand. The Middling scenario could result in some excess deaths from breaching capacity, though these would be small (a 0.7% increase) relative to the total number of deaths in that scenario. The Pessimistic scenario would certainly result in significant excess deaths from breaching capacity; our sensitivity analysis indicated a range between 49,219 (19.4% increase) and 103,845 (40.9% increase) excess deaths. Without the new variant, exceeding capacity for hospital and ICU beds did not appear to be the most likely outcome, but given the new variant it now appears more plausible and, if so, would result in a substantial increase in the number of deaths from COVID-19.
public and global health