id | title | abstract | category
---|---|---|---|
10.1101/2021.01.08.21249413 | Polygenic Risk Scores for Alzheimer's Disease and Mild Cognitive Impairment in Hispanics/Latinos in the U.S.: The Study of Latinos - Investigation of Neurocognitive Aging | Introduction: Polygenic Risk Scores (PRS) are powerful summaries of genetic risk alleles that can potentially be used to predict disease outcomes and guide treatment decisions. Hispanics/Latinos suffer from higher rates of Alzheimer's Disease (AD) and Mild Cognitive Impairment (MCI) compared to non-Hispanic Whites, yet the strongest known genetic risk factor for AD, the APOE-ε4 allele, has a weak association with AD in Hispanics/Latinos. We evaluated PRS constructed from Genome-Wide Association Studies (GWAS) of AD for predicting MCI in Hispanics/Latinos when accounting for APOE alleles and variants.
Methods: We used summary statistics from four GWAS of AD to construct PRS that predict MCI in 4,189 diverse Hispanics/Latinos (mean age 63 years, 47% males) from the Study of Latinos-Investigation of Neurocognitive Aging. We assessed the PRS associations with MCI in the combined set of people and in groups defined by genetic ancestry and Hispanic/Latino background, and when including and excluding single nucleotide polymorphisms (SNPs) from the APOE gene region.
Results: A PRS constructed based on GWAS of AD in the FINNGEN Biobank was associated with MCI (OR = 1.34, 95% CI [1.15, 1.55]), and its association was mostly driven by 158 APOE region SNPs. A PRS constructed based on a multi-ethnic AD GWAS was associated with MCI (OR = 1.22, 95% CI [1.08, 1.37]) without including any APOE region SNPs. APOE-ε4 and APOE-ε2 alleles were not associated with MCI.
Discussion: A combination of APOE region SNPs is associated with MCI in Hispanics/Latinos despite APOE-ε4 and APOE-ε2 alleles not being associated with MCI. | genetic and genomic medicine |
10.1101/2021.01.08.21249450 | BinomiRare: A carriers-only test for association of rare genetic variants with a binary outcome for mixed models and any case-control proportion | Whole genome and exome sequencing studies have become increasingly available and are being used to identify rare genetic variants associated with health and disease outcomes. Investigators routinely use mixed models to account for genetic relatedness or other clustering variables (e.g. family or household) when testing genetic associations. However, no existing test of the association of a rare variant with a binary outcome in the presence of correlated data controls the Type 1 error when there are (1) few carriers of the rare allele, (2) a small proportion of cases relative to controls, and (3) covariates to adjust for. Here, we address all three issues by developing the carriers-only test framework for testing rare variant association with a binary trait. In this framework, we estimate outcome probabilities under the null hypothesis and then use them, within the carriers, to test variant associations. We extend the BinomiRare test, which was previously proposed for independent observations, develop the Conway-Maxwell-Poisson (CMP) test, and study their properties in simulations. We show that the BinomiRare test always controls the Type 1 error, while the CMP test sometimes does not. We then use the BinomiRare test to test the association of rare genetic variants in target genes with small-vessel disease stroke, short sleep, and venous thromboembolism, in whole-genome sequence data from the Trans-Omics for Precision Medicine program. | genetic and genomic medicine |
10.1101/2021.01.08.21249449 | The Effect of Temperature on Covid-19 Confirmed Cases: Evidence from US Counties | This paper studies the effect of air temperature on the transmission of COVID-19 in the U.S. using daily observations across counties. This study uses various ordinary least squares (OLS) models with a comprehensive set of fixed effects to overcome unobserved heterogeneity issues across counties, as well as generalized method of moments (GMM) estimators as dynamic models to address endogeneity. Our main results indicate that an increase of one degree in temperature is associated with a reduction of 0.041 cases per 100,000 population at the county level. We run several robustness tests, and all the models confirm the impact of temperature on COVID-19 confirmed new cases. These results help policymakers and economists in optimizing decisions and investments to reduce COVID-19 new cases.
JEL Codes: I10; Q51; Q54; H12 | health economics |
10.1101/2021.01.08.21249378 | How can the public health impact of vaccination be estimated? | Deaths due to vaccine-preventable diseases cause a notable proportion of mortality worldwide. To quantify the importance of vaccination, it is necessary to estimate the burden averted through vaccination. The Vaccine Impact Modelling Consortium (VIMC) was established to estimate the health impact of vaccination. We describe the methods implemented by the VIMC to estimate impact by calendar year, birth year and year of vaccination (YoV). The calendar and birth year methods estimate impact in a particular year and over the lifetime of a particular birth cohort, respectively. The YoV method estimates the impact of a particular year's vaccination activities through the use of impact ratios with either no stratification or stratification by activity type and/or birth cohort. Furthermore, we detail an impact extrapolation (IE) method for use between coverage scenarios. We compare the methods, focusing on YoV for hepatitis B, measles and yellow fever. We find that the YoV methods estimate similar impact with routine vaccinations but have greater yearly variation when campaigns occur with the birth cohort stratification. The IE performs well for the YoV methods, providing a time-efficient mechanism for updates to impact estimates. These methods provide a robust set of approaches to quantify vaccination impact. | infectious diseases |
10.1101/2021.01.08.20249024 | Tuberculosis biomarkers discovered using Diversity Outbred mice. | Background: Biomarker discovery for pulmonary tuberculosis (TB) may be accelerated by modeling human genotypic diversity and phenotypic responses to Mycobacterium tuberculosis (Mtb). To meet these objectives, we use the Diversity Outbred (DO) mouse population and apply novel classifiers to identify informative biomarkers from multidimensional data sets.
Method: To identify biomarkers, we infected DO mice with aerosolized Mtb, confirmed a human-like spectrum of phenotypes, and examined gene expression and inflammatory and immune mediators in the lungs. We measured 11 proteins in 453 Mtb-infected and 29 non-infected mice. We searched all combinations of six classification algorithms and 239 biomarker subsets and independently validated the selected classifiers. Finally, we selected two mouse lung biomarkers to test as candidate biomarkers of active TB, measuring their diagnostic performance in human sera acquired from the Foundation for Innovative New Diagnostics.
Findings: DO mice discovered two translationally relevant biomarkers, CXCL1 and MMP8, that accurately diagnosed active TB in humans with >90% sensitivity and specificity compared to controls. We identified them through the two classifiers that accurately diagnosed supersusceptible DO mice with early-onset TB: Logistic Regression using MMP8 as a single biomarker, and Gradient Tree Boosting using a panel of 4 biomarkers (CXCL1, CXCL2, TNF, IL-10).
Interpretation: This work confirms that the DO population models human responses and can accelerate discovery of translationally relevant TB biomarkers.
Funding: Support was provided by NIH R21 AI115038; NIH R01 HL145411; NIH UL1-TR001430; and the American Lung Association Biomedical Research Grant RG-349504. | infectious diseases |
10.1101/2021.01.08.20248149 | Antiviral drugs in hospitalized patients with COVID-19 - the DisCoVeRy trial | Background: The efficacy of lopinavir/ritonavir, lopinavir/ritonavir-interferon (IFN)-β-1a and hydroxychloroquine for COVID-19 has been evaluated, but detailed evaluation is lacking.
Objective: To determine the efficacy of lopinavir/ritonavir, lopinavir/ritonavir-IFN-β-1a, hydroxychloroquine or remdesivir for improving the clinical and virological outcomes in COVID-19 inpatients.
Design: Open-label, randomized, adaptive, controlled trial.
Setting: Multi-center trial with patients from France.
Participants: 583 COVID-19 inpatients requiring oxygen and/or ventilatory support.
Intervention: Standard of care (SoC, control), SoC plus lopinavir/ritonavir (400 mg lopinavir and 100 mg ritonavir every 12 h for 14 days), SoC plus lopinavir/ritonavir plus IFN-β-1a (44 µg of subcutaneous IFN-β-1a on days 1, 3, and 6), SoC plus hydroxychloroquine (400 mg twice on day 1 then 400 mg once daily for 9 days) or SoC plus remdesivir (200 mg intravenously on day 1 then 100 mg once daily for the duration of hospitalization or 10 days).
Measurements: The primary outcome was the clinical status at day 15, measured by the WHO 7-point ordinal scale. Secondary outcomes included SARS-CoV-2 quantification in respiratory specimens and safety analyses.
Results: Adjusted odds ratios (aOR) for the WHO 7-point ordinal scale were not in favor of the investigational treatments: lopinavir/ritonavir versus control, aOR 0.83, 95% CI 0.55 to 1.26, P=0.39; lopinavir/ritonavir-IFN-β-1a versus control, aOR 0.69, 95% CI 0.45 to 1.04, P=0.08; hydroxychloroquine versus control, aOR 0.93, 95% CI 0.62 to 1.41, P=0.75. No significant effect on SARS-CoV-2 RNA clearance in the respiratory tract was evidenced. Lopinavir/ritonavir-containing treatments were significantly associated with more serious adverse events (SAEs).
Limitations: Not placebo-controlled; no anti-inflammatory agents tested.
Conclusion: The studied drugs improved neither the clinical status at day 15 nor SARS-CoV-2 RNA clearance in respiratory tract specimens. This supports the recent Solidarity findings.
Registration: NCT04315948.
Funding: PHRC 2020, Dim OneHealth, REACTing | infectious diseases |
10.1101/2021.01.05.21249310 | Ivermectin as a potential treatment for mild to moderate COVID-19: A double blind randomized placebo-controlled trial | Objective: Ivermectin has been suggested as a treatment for COVID-19. This randomised controlled trial was conducted to test the efficacy of ivermectin in the treatment of mild and moderate COVID-19.
Design: Parallel, double-blind, randomised, placebo-controlled trial. Setting: A tertiary care dedicated COVID-19 hospital in Bihar, India.
Participants: Adult patients (>18 years) admitted with mild to moderate COVID-19 disease (saturation >90% on room air, respiratory rate <30, and no features of shock), with no contraindications to ivermectin and willing to participate in the study.
Intervention: Patients in the intervention arm were given ivermectin 12 mg on day 1 and day 2 of admission. Patients in the placebo arm were given identical-looking placebo tablets. The rest of the treatment was continued as per the existing protocol and the clinical judgment of the treating teams.
Outcome Measures: The primary outcome measure was a negative RT-PCR test for SARS-CoV-2 on day 6 of admission. The secondary outcome measures were symptom status on day 6, discharge status on day 10, admission to ICU, need for invasive mechanical ventilation and in-hospital mortality.
Results: A total of 115 patients were enrolled in the study, of whom 112 were included in the final analysis. Of them, 55 were randomised to the intervention arm while 57 were randomised to the placebo arm. There was no significant difference in the baseline characteristics of the two arms. There was no significant difference in the primary outcome, i.e. negative RT-PCR status on day 6, between the two groups. Similarly, there was no significant difference between the two groups in most of the secondary outcome measures, viz. symptom status on day 6, discharge status on day 10, admission to ICU, and need for invasive mechanical ventilation. However, while there was no in-hospital mortality in the intervention arm, there were 4 deaths in the placebo arm. As a result, all patients in the intervention arm (n=56) were successfully discharged as compared to 93.1% (n=54/58) in the placebo arm (RR 1.1, 95% CI 1.0 to 1.2, p=0.019).
Conclusion: There was no difference in the primary outcome, i.e. negative RT-PCR status on day 6 of admission, with the use of ivermectin. However, a significantly higher proportion of patients were discharged alive from the hospital when they received ivermectin.
Strengths and Limitations of the Study:
- This study was randomised and double blind, thereby minimizing the chance of bias.
- All outcome measures except symptom status on day 6 were objective, and placebo control was used for comparison.
- Only a single repeat RT-PCR was done, so the median time to viral clearance in the two groups could not be calculated.
- Severe cases were not included in the study. | infectious diseases |
10.1101/2021.01.08.20249017 | MassMark: A Highly Scalable Multiplex NGS-based Method for High-Throughput, Accurate and Sensitive Detection of SARS-CoV-2 for Mass Testing | Mass testing has been proposed as a strategy to address and contain the severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) pandemic [1,2]. We have developed MassMark, a novel and highly scalable multiplex method that employs next generation sequencing for high-throughput, accurate and sensitive detection of SARS-CoV-2, while minimizing handling complexity and resources by utilizing a serial pooling strategy to accommodate over 9,000 samples per workflow. Analytical validation showed that MassMark was able to detect SARS-CoV-2 RNA down to a level of 100 copies per reaction. We evaluated the clinical performance of MassMark in a simulated screening test with 22 characterized samples from three different sources (nasopharyngeal swabs, nasal swabs and saliva), comprising 12 SARS-CoV-2 positive samples with mid to late Ct values (range: 22.98-32.72) and 10 negative samples. There was one false negative and no false positives, giving an overall sensitivity and specificity of 91.67% and 100%, respectively, when compared against an optimized RT-PCR test with a target size within 70 bp (CDC 2019-nCoV Real-Time RT-PCR Diagnostic Panel [3]). | infectious diseases |
10.1101/2021.01.08.21249445 | COVID-19 seroprevalence among healthcare workers of a large COVID Hospital in Rome reveals strengths and limits of two different serological tests | In several hospitals worldwide, healthcare workers are currently at the forefront of the fight against coronavirus disease 2019 (COVID-19). Since Fondazione Policlinico Universitario A. Gemelli (FPG) IRCCS has been enlisted as a COVID hospital, healthcare workers deployed to COVID wards were separated from those with limited or no exposure, whereas administrative staff was assigned to work from home.
Between June 4 and July 3, 2020, an investigation was carried out to evaluate the seroprevalence of SARS-CoV-2 IgG antibodies among employees of the FPG using point-of-care (POC) and venous blood tests. Sensitivity, specificity and predictive values were determined with reverse-transcription polymerase chain reaction (RT-PCR) on nasal/oropharyngeal swabs as the gold standard.
Four thousand seven hundred seventy-seven participants were enrolled. Seroprevalence was 3.66% using the POC test and 1.19% using the venous blood test, with a significant difference between the two (p < 0.05).
POC sensitivity and specificity were, respectively, 63.64% (95% confidence interval (CI): 62.20% to 65.04%) and 96.64% (95% CI: 96.05% to 97.13%), while those of the venous blood test were, respectively, 78.79% (95% CI: 77.58% to 79.94%) and 99.36% (95% CI: 99.07% to 99.55%). In the low-risk population, the point-of-care test's predictive values were 58.33% (positive) and 98.23% (negative), whereas the venous blood test's were 92.86% (positive) and 98.53% (negative). In conclusion, point-of-care tests have low diagnostic accuracy, while venous blood tests seem to show an overall poor reliability. | epidemiology |
10.1101/2021.01.08.21249432 | Levels of SARS-CoV-2 population exposure are considerably higher than suggested by seroprevalence surveys | Accurate knowledge of levels of prior population exposure has critical ramifications for preparedness plans of subsequent SARS-CoV-2 epidemic waves and vaccine prioritization strategies. Serological studies can be used to estimate levels of past exposure and thus position populations in their epidemic timeline. To circumvent biases introduced by decaying antibody titers over time, population exposure estimation methods should account for seroreversion, to reflect that changes in seroprevalence measures over time are the net effect of increases due to recent transmission and decreases due to antibody waning. Here, we present a new method that combines multiple datasets (serology, mortality, and virus positivity ratios) to estimate seroreversion time and infection fatality ratios and simultaneously infer population exposure levels. The results indicate that the average time to seroreversion is six months, and that true exposure may be more than double the current seroprevalence levels reported for several regions of England. | epidemiology |
10.1101/2021.01.08.21249455 | Whole-Cell Dissociated Suspension Analysis in Human Brain Neurodegenerative Disease: A Pilot Study | Biochemical analysis of human brain tissue is typically done by homogenizing whole pieces of brain and separately characterizing the proteins, RNA, DNA, and other macromolecules within. While this has been sufficient to identify substantial changes, there is little ability to identify small changes or alterations that may occur in subsets of cells. To effectively investigate the biochemistry of disease in the brain, with its different cell types, we must first separate the cells and study them as phenotypically defined populations or even as individuals. In this project, we developed a new method for the generation of whole-cell-dissociated-suspensions (WCDS) in fresh human brain tissue that could be shared as a resource with scientists to study single human cells or populations. Characterization of WCDS was done in paraffin-embedded sections stained with H&E, and by phenotyping with antibodies using immunohistochemistry and fluorescence-activated cell sorting (FACS). Additionally, we compared extracted RNA from WCDS with RNA from adjacent intact cortical tissue, using RT-qPCR for cell-type-specific RNA for the same markers as well as whole transcriptome sequencing. More than 11,626 gene transcripts were successfully sequenced and classified using an external database either as being mainly expressed in neurons, astrocytes, microglia, oligodendrocytes, endothelial cells, or mixed (in two or more cell types). This demonstrates that we are currently capable of producing WCDS with a full representation of different brain cell types combined with RNA quality suitable for use in biochemical analysis. | pathology |
10.1101/2021.01.08.21249452 | Mathematical Model of a Personalized Neoantigen Cancer Vaccine and the Human Immune System: Evaluation of Efficacy | Cancer vaccines are an important component of the cancer immunotherapy toolkit, enhancing immune response to malignant cells by activating CD4+ and CD8+ T cells. Multiple successful clinical applications of cancer vaccines have shown good safety and efficacy. Despite the notable progress, significant challenges remain in obtaining consistent immune responses across heterogeneous patient populations, as well as various cancers. We present as a proof of concept a mechanistic mathematical model describing key interactions of a personalized neoantigen cancer vaccine with an individual patient's immune system. Specifically, the model considers the vaccine concentration of tumor-specific antigen peptides and adjuvant, the patient's major histocompatibility complexes I and II copy numbers, tumor size, T cells, and antigen presenting cells. We parametrized the model using patient-specific data from a recent clinical study in which individualized cancer vaccines were used to treat six melanoma patients. Model simulations predicted both immune responses, represented by T cell counts, to the vaccine as well as clinical outcome (determined as change of tumor size). These kinds of models have the potential to lay the foundation for generating in silico clinical trial data and aid the development and efficacy assessment of personalized cancer vaccines.
Author summary: Personalized cancer vaccines have gained attention in recent years due to the advances in sequencing techniques that have facilitated the identification of multiple tumor-specific mutations. This type of individualized immunotherapy has the potential to be specific, efficacious, and safe since it induces an immune response to protein targets not found on normal cells. This work focuses on understanding and analyzing important mechanisms involved in the activity of personalized cancer vaccines using a mechanistic mathematical model. This model describes the interactions of a personalized neoantigen peptide cancer vaccine, the human immune system and tumor cells operating at the molecular and cellular level. The molecular level captures the processing and presentation of neoantigens by dendritic cells to the T cells using cell surface proteins. The cellular level describes the differentiation of dendritic cells due to peptide and adjuvant concentrations in the vaccine, activation and proliferation of T cells in response to treatment, and tumor growth. The model captures immune response behavior to a vaccine associated with patient-specific factors (e.g., different initial tumor burdens). Our model serves as a proof of concept displaying its utility in predicting clinical outcomes, lays a foundation for developing in silico clinical trials, and aids in the efficacy assessment of personalized vaccines. | pharmacology and therapeutics |
10.1101/2021.01.08.21249453 | Weighted burden analysis in 200,000 exome-sequenced subjects characterises rare variant effects on risk of type 2 diabetes | Type 2 diabetes (T2D) is a disease for which both common genetic variants and environmental factors influence risk. A few genes have been identified in which very rare variants have large effects on risk and here we carry out a weighted burden analysis of rare variants in a sample of over 200,000 exome-sequenced participants in the UK Biobank project, of whom over 13,000 have T2D. Variant weights were allocated based on allele frequency and predicted effect, as informed by a previous analysis of hyperlipidaemia. There was an exome-wide significant increased burden of rare, functional variants in three genes, GCK, HNF4A and GIGYF1. GIGYF1 has not previously been identified as a diabetes risk gene but its product is plausibly involved in the modification of insulin signalling. A number of other genes did not attain exome-wide significance but were highly ranked and potentially of interest, including ALAD, PPARG, GYG1 and GHRL. Loss of function (LOF) variants were associated with T2D in GCK and GIGYF1 whereas nonsynonymous variants annotated as probably damaging were associated in GCK and HNF4A. Overall, fewer than 1% of T2D cases carried one of these variants. In two genes previously implicated in diabetes aetiology, HNF1A and HNF1B, there was an excess of LOF variants among cases but the small numbers of these fell well short of statistical significance, suggesting that even larger datasets will be helpful for more fully elucidating the contribution of rare genetic variants to T2D risk. This research has been conducted using the UK Biobank Resource. | genetic and genomic medicine |
10.1101/2021.01.05.21249253 | DAGM: a novel modelling framework to assess the risk of HER2-negative breast cancer based on germline rare coding mutations | Background: Breast cancers can be divided into HER2-negative and HER2-positive subtypes according to the status of the HER2 gene. Despite extensive studies connecting germline mutations with possible risk of HER2-negative breast cancer, the main category of breast cancer, it remains challenging to accurately assess its potential risk and to understand the potential mechanisms.
Methods: We developed a novel framework named Damage Assessment of Genomic Mutations (DAGM), which projects rare coding mutations and gene expressions into Activity Profiles of Signalling Pathways (APSPs).
Findings: We characterized and validated the DAGM framework at multiple levels. Based on an input of germline rare coding mutations, we obtained the corresponding APSP spectrum to calculate the APSP risk score, which was capable of distinguishing HER2-negative from HER2-positive cases. These findings were validated using breast cancer data from TCGA (AUC = 0.7). DAGM revealed that the HER2 signalling pathway was up-regulated in the germline of HER2-negative patients, and that those with high APSP risk scores had suppressed immunity. These findings were validated using RNA sequencing, phosphoproteome analysis, and CyTOF. Moreover, using germline mutations, DAGM could evaluate the risk of developing HER2-negative breast cancer, not only in women carrying BRCA1/2 mutations, but also in those without known disease-associated mutations.
Interpretation: The DAGM can facilitate the screening of subjects at high risk of HER2-negative breast cancer for primary prevention. This study also provides new insights into the potential mechanisms of developing HER2-negative breast cancer. The DAGM has the potential to be applied in the prevention, diagnosis, and treatment of HER2-negative breast cancer.
Funding: This work was supported by the National Key Research and Development Program of China (grant no. 2018YFC0910406 and 2018AAA0103302 to CZ); the National Natural Science Foundation of China (grant no. 81202076 and 82072939 to MY, 81871513 to KW); the Guangzhou Science and Technology Program key projects (grant no. 2014J2200007 to MY, 202002030236 to KW); the National Key R&D Program of China (grant no. 2017YFC1309100 to CL); and the Natural Science Foundation of Guangdong Province (grant no. 2017A030313882 to KW)
Research in context. Evidence before this study: The majority of hereditary breast cancers are caused by BRCA1/2 mutations, and the presence of these mutations is strongly associated with an increased risk of breast cancer. Meanwhile, BRCA1/2 gene mutations are rarely found in sporadic breast cancers and only account for a modest percentage of all breast cancer patients. Polygenic risk score (PRS), a widely-used approach for stratifying individuals according to their risk of a certain kind of complex disease, has been used to predict subjects at high risk for breast cancer. However, relying on SNPs from genome-wide association studies (GWAS) without including gene expressions or pathway activities, PRS is not very suitable for cross-population prediction and describes disease risk in terms of genomic mutations without alluding to the underlying pathogenic mechanism(s). Therefore, there is still an urgent need for a population-independent comprehensive method to accurately assess the risk of breast cancer and to gain insights on potential mechanism(s).
Added value of this study: Subjecting germline rare coding mutations (gRCMs) to the DAGM framework yields the corresponding APSP and APSP risk score. Both the APSP and the APSP risk score can distinguish HER2-negative from HER2-positive breast cancers. These findings suggest HER2-negative breast cancer does not develop accidentally, but rather is defined by a genomic evolutionary strategy. Furthermore, this study also revealed the up-regulation of the HER2 signalling pathway in germlines of HER2-negative breast cancers and the immune suppression in subjects with a high APSP risk score, shedding new light on the potential mechanisms of developing HER2-negative breast cancer. Moreover, our APSP risk score was able to evaluate, with relative accuracy, the risk of developing HER2-negative breast cancer for each female, including not only BRCA1/2 carriers, but also non-carriers.
Implications of all the available evidence: The present study suggests that HER2 signalling pathway activity, as an aggressive factor, contributes to the development of different types of breast cancers, either via the combined effects of multiple germline mutations in HER2-negative germlines or via amplification of the gene itself in HER2-positive tumour cells. This provides a theoretical basis for the prevention, diagnosis, and treatment of breast cancers. At the same time, the study provides preliminary methods for assessing the relative risk of HER2-negative breast cancer for females with or without BRCA1/2 mutations. Finally, our findings provide a new perspective and theoretical basis for identifying high-risk female subjects, based on a high APSP risk score, for early screening and prevention of HER2-negative breast cancer. | oncology |
10.1101/2021.01.08.20249019 | Quality of life and health-related utility after head & neck cancer surgery | Purpose: This work describes the methodology adopted and the results obtained in a utility elicitation task. The purpose was to elicit the utility coefficients (UCs) needed to calculate quality-adjusted life years for a cost/utility analysis of TORS (Trans-Oral Robotic Surgery) versus TLM (Trans-oral Laser Microsurgery), which are two minimally-invasive trans-oral surgery techniques for head & neck cancers.
Methods: Since the economic evaluation would be conducted from the point of view of the Swiss healthcare system, Swiss people (healthy volunteers) were interviewed in order to tailor the model to that specific country. The utility elicitation was performed using a computerized tool (UceWeb). Standard gamble and rating scale methods were used.
Results: UCs were elicited from 47 individuals, each one providing values for 18 health states, for a total of 1692 elicited values. Health states, described using graphical factsheets, ranged from remission to palliative care. Elicited UCs differed among states, ranging from 0.980 to 0.213. These values were comparable to previously published results from a Canadian population, except for states related to recurrent disease (local, regional, and distant) and palliation, where the Swiss population showed lower utility values.
Conclusion: From a methodological point of view, our study shows that the UceWeb tool can be profitably used for utility elicitation from healthy volunteers. From an application point of view, the study provides utility values that can be used not only for a specific cost-utility analysis, but also for future studies involving health states following trans-oral head & neck surgery. Moreover, the study confirms that some UCs vary among countries, calling for tailored elicitation tasks. | otolaryngology |
10.1101/2021.01.08.21249466 | HIV-infected patients on combined antiretroviral treatment had similar level of arterial stiffness to the patients with ST-segment elevation myocardial infarction. | Purpose: Cardiovascular disease has become very common among HIV-infected patients. The aim was to compare arterial stiffness and endothelial dysfunction in HIV-infected patients and non-HIV-infected patients at week 4 after ST-segment elevation myocardial infarction (STEMI).
Methods: Arterial stiffness was calculated with the Endo-PAT 2000 (ITAMAR®) and endothelial function by peripheral arterial tonometry (PAT®). Correct endothelial function was defined as a natural logarithm of the reactive hyperaemia index (lnRHI) > 0.51. Arterial stiffness was assessed as the augmentation index (AI) and corrected for a heart rate of 75 bpm (AI@75).
Results: Sixty-three patients were recruited to this study: 34 patients with HIV infection (18 on cART) and 29 HIV-negative patients after recent STEMI. No statistically significant differences in AI and AI@75 were found between the STEMI and HIV-on-cART groups. We observed p<0.05 for AI and AI@75 for patients without cART compared to STEMI and on-cART patients. The observed lnRHI results were significantly different (p<0.05) between STEMI and on-cART patients. We found similar endothelial dysfunction (p>0.05) for patients without cART compared to STEMI and on-cART patients.
Conclusions: Assessing cardiovascular risk, including with non-invasive methods, among HIV-infected patients is very important, especially in HIV patients on cART. Endothelial dysfunction is connected with HIV infection and can be similar for STEMI patients and HIV-infected patients without cART. | hiv aids |
10.1101/2021.01.08.21249467 | An Augmented SEIR Model with Protective and Hospital Quarantine Dynamics for the Control of COVID-19 Spread | In this work, an attempt is made to analyse the dynamics of the COVID-19 outbreak mathematically using a modified SEIR model with additional compartments and a nonlinear incidence rate, with the help of bifurcation theory. The existence of a forward bifurcation point is presented by deriving conditions, in terms of parameters, for the existence of disease-free and endemic equilibrium points. The significance of having two additional compartments, viz., protective and hospital quarantine compartments, is then illustrated via numerical simulations. From the analysis and results, it is observed that, by properly selecting transfer functions to place exposed and infected individuals in protective and hospital quarantine compartments, respectively, and with apt governmental action, it is possible to contain the COVID-19 spread effectively. Finally, the capability of the proposed model in predicting/representing the COVID-19 dynamics is presented by comparing with real-time data. | infectious diseases |
10.1101/2021.01.08.21249379 | Detection of SARS-CoV-2 variants in Switzerland by genomic analysis of wastewater samples | The emergence of SARS-CoV-2 mutants with altered transmissibility, virulence, or immunogenicity emphasizes the need for early detection and epidemiological surveillance of genomic variants. Wastewater samples provide an opportunity to assess circulating viral lineages in the community. We performed genomic sequencing of 122 wastewater samples from three locations in Switzerland to analyze the B.1.1.7, B.1.351, and P.1 variants of SARS-CoV-2 on a population level. We called variant-specific signature mutations and monitored variant prevalence in the local population over time. To enable early detection of emerging variants, we developed a bioinformatics tool that uses read pairs carrying multiple signature mutations as a robust indicator of low-frequency variants. We further devised a statistical approach to estimate the transmission fitness advantage, a key epidemiological parameter indicating the speed at which a variant spreads through the population, and compared the wastewater-based findings to those derived from clinical samples. We found that the local outbreak of the B.1.1.7 variant in two Swiss cities was observable in wastewater up to 8 days before its first detection in clinical samples. We detected a high prevalence of the B.1.1.7 variant in an alpine ski resort popular among British tourists in December 2020, a time when the variant was still very rare in Switzerland. We found no evidence of local spread of the B.1.351 and P.1 variants at the monitored locations until the end of the study (mid-February), which is consistent with clinical samples. Estimation of local variant prevalence performs equally well or better for wastewater samples as for a much larger number of clinical samples. We found that the transmission fitness advantage of B.1.1.7, i.e. the relative change of its reproductive number, can be estimated earlier and based on substantially fewer wastewater samples as compared to using clinical samples. Our results show that genomic sequencing of wastewater samples can detect, monitor, and evaluate genetic variants of SARS-CoV-2 on a population level. Our methodology provides a blueprint for rapid, unbiased, and cost-efficient genomic surveillance of SARS-CoV-2 variants. | infectious diseases |
10.1101/2021.01.08.21249443 | Polymorphisms affecting expression of the vaccine antigen factor H binding protein influence invasiveness of Neisseria meningitidis | Many bacterial diseases are caused by organisms that ordinarily are harmless components of the human microbiome. Effective interventions against these conditions require an understanding of the processes whereby symbiosis or commensalism breaks down. Here, we performed bacterial genome-wide association studies (GWAS) of Neisseria meningitidis, a common commensal of the human respiratory tract despite being a leading cause of meningitis and sepsis. GWAS discovered single nucleotide polymorphisms (SNPs) and other bacterial genetic variants associated with invasive meningococcal disease (IMD) versus carriage in several loci across the genome, revealing the polygenic nature of this phenotype. Of note, we detected a significant peak around fHbp, which encodes factor H binding protein (fHbp); fHbp promotes bacterial immune evasion of human complement by recruiting complement factor H (CFH) to the meningococcal surface. We confirmed the association around fHbp with IMD in a validation GWAS, and found that SNPs identified in the validation affecting the 5′ region of fHbp mRNA alter secondary RNA structures, increase fHbp expression, and enhance bacterial escape from complement-mediated killing. This finding mirrors the known link between complement deficiencies and CFH variation with human susceptibility to IMD, highlighting the central importance of human and bacterial genetic variation across the fHbp:CFH interface in IMD susceptibility, virulence, and the transition from carriage to disease. | infectious diseases |
10.1101/2021.01.08.20248932 | Efficacy of propolis as an adjunct treatment for hospitalized COVID-19 patients: a randomized, controlled clinical trial | Among candidate treatment options for COVID-19, propolis, produced by honey bees from bioactive plant exudates, has shown potential against viral targets and has demonstrated immunoregulatory properties. We conducted a randomized, controlled, open-label, single-center trial with a standardized propolis product (EPP-AF) on hospitalized adult COVID-19 patients. Patients received standard care plus propolis at an oral dose of 400 mg/day (n=40) or 800 mg/day (n=42) for seven days, or standard care alone (n=42). Standard care included all necessary interventions, as determined by the attending physician. The primary end point was the time to clinical improvement, defined as the length of hospital stay or oxygen therapy dependency. Secondary outcomes included acute kidney injury and need for intensive care or vasoactive drugs. Time in the hospital after intervention was significantly shortened in both propolis groups compared to the controls; median 7 days with 400 mg/day and 6 days with 800 mg/day, versus 12 days for standard care alone. Propolis did not significantly affect the need for oxygen supplementation. With the higher dose, significantly fewer patients developed acute kidney injury than in the controls (2 versus 10 of 42 patients). Propolis as an adjunct treatment was safe and reduced hospitalization time. The registration number for this clinical trial is: NCT04480593 (20/07/2020). | infectious diseases |
10.1101/2021.01.09.21249480 | The importance of non-pharmaceutical interventions during the COVID-19 vaccine rollout | The promise of efficacious vaccines against SARS-CoV-2 is fulfilled and vaccination campaigns have started worldwide. However, the fight against the pandemic is far from over. Here, we propose an age-structured compartmental model to study the interplay of disease transmission, vaccine rollout, and behavioural dynamics. We investigate, via in-silico simulations, individual and societal behavioural changes, possibly induced by the start of the vaccination campaigns, and manifested as a relaxation in the adoption of non-pharmaceutical interventions. We explore different vaccine efficacies, vaccination rollout speeds, and prioritization strategies, as well as multiple behavioural responses. We apply our model to six countries worldwide (Egypt, Peru, Serbia, Ukraine, Canada, and Italy) selected to sample diverse socio-demographic and socio-economic contexts. To isolate the effects of age structures and contact patterns from the particular pandemic history of each location, we first study the model considering the same hypothetical initial epidemic scenario in all countries. We then calibrate the model using real epidemiological and mobility data for the different countries. Our findings suggest that early relaxation of safe behaviours can jeopardize the benefits brought by the vaccine in the short term: a fast vaccine distribution and policies aimed at keeping high compliance of individual safe behaviours are key to mitigate disease resurgence. | epidemiology |
10.1101/2021.01.08.21249473 | Migration and Outbreaks of Vaccine-Preventable Disease in Europe: A Systematic Review | Background: Migrant populations (defined as foreign-born) are one of several under-immunised groups in the EU/EEA, yet little is known about how they are affected by outbreaks of vaccine-preventable diseases (VPDs). This information is vital to develop targeted strategies to improve the health of diverse migrant communities and to assess risk factors and correlations with major European peaks in incidence of key VPDs over time.
Methods: We did a systematic review (PROSPERO CRD42019157473; Medline, EMBASE, and Global Health, January 2000 to October 2019), adhering to PRISMA guidelines, to identify studies on VPD outbreaks (measles, mumps, rubella, diphtheria, pertussis, polio, hepatitis A, N. meningitidis, and H. influenzae) in migrants residing in the EU/EEA and Switzerland.
Results: 45 studies were included, reporting on 47 distinct VPD outbreaks across 13 countries (26 [55%] were reported between 2010 and 2020, including 16 [34%] since 2015). Most reported outbreaks involving migrants were of measles (n=24; 6578 total cases), followed by varicella (n=11; 596 cases), hepatitis A (n=7; 1510 cases), rubella (n=3; 487 cases) and mumps (n=2; 295 cases). 19 (40%) of the outbreaks, predominantly varicella and measles, were reported in temporary camps or shelters for asylum seekers and refugees. Of the 11 varicella outbreaks, 82% were associated with adult migrants. Half of the measles outbreaks (n=12) were associated with migrants from Eastern European countries, often involving migrants of Roma ethnicity.
Conclusions: Migrants represent one of several under-immunised groups involved in VPD outbreaks in Europe, with adult and child refugees and asylum seekers residing in shelters or temporary camps at particular risk, alongside specific nationality groups. Vulnerability varies by disease, setting, and individual demographics, highlighting the importance of tailoring strategies for implementing catch-up vaccination to specific groups, alongside the strengthening of routine data collection, in order to meet regional and global vaccination targets. Better understanding vaccine uptake and demand issues in migrant groups, and reducing the barriers they face to accessing vaccination services, is urgently needed, with direct implications for COVID-19 vaccine delivery at the current time. Strengthening vaccine delivery to migrant populations will require a greater focus on co-designing vaccine uptake strategies in close collaboration with affected communities.
FunderNIHR | public and global health |
10.1101/2021.01.08.21249468 | Women in Health Care Experiencing Occupational Stress and Burnout during COVID-19: A Review | ContextCOVID-19 has had an unprecedented impact on physicians, nurses, and other health professionals around the world, and a serious health care burnout crisis is emerging as a result of this pandemic.
ObjectivesWe aim to identify, through a rapid review, the causes of occupational stress and burnout in women in medicine, nursing, and other health professions during the COVID-19 pandemic, as well as interventions that can support female health professionals in dealing with this crisis.
MethodsWe searched MEDLINE, Embase, CINAHL, PsycINFO, and ERIC from December 2019 through September 30, 2020. The review protocol was registered in PROSPERO and is available online. We selected all empirical studies that discussed stress and burnout in women health care workers during the COVID-19 pandemic.
ResultsThe literature search identified 6148 citations. A review of abstracts led to the retrieval of 721 full-text articles for assessment, of which 47 articles were included for review. Our findings show that concerns about safety (65%), staff and resource adequacy (43%), workload and compensation (37%), and job roles and security (41%) appeared as common triggers of stress in the literature.
Conclusions and RelevanceThe current literature focuses primarily on initiatives ranging from self-focused measures, such as wellness activities, coping strategies, and reliance on family, friends and work colleagues, to organization-led initiatives such as access to psychological support and training. Very limited evidence exists on organizational interventions such as work modification, financial security, and systems improvement. | health systems and quality improvement
10.1101/2021.01.08.21249474 | Deep learning-based detection of COVID-19 using wearables data | BackgroundCOVID-19 is an infectious disease caused by SARS-CoV-2 that is primarily diagnosed using laboratory tests, which are frequently not administered until after symptom onset. However, SARS-CoV-2 is contagious multiple days before symptom onset and diagnosis, thus enhancing its transmission through the population.
MethodsIn this retrospective study, we collected heart rate and step interval data at 15-second to one-minute resolution from Fitbit devices during the COVID-19 period (February 2020 until June 2020). Resting heart rate was computed by selecting heart rate intervals where the step count was zero for the 12 minutes preceding an interrogated time point. Data for each participant were divided into training (baseline) data, taken from days in the non-infectious period, and test data, taken from days during the COVID-19 infectious period. Data augmentation was used to increase the number of training days. Here, we developed a deep learning approach based on a Long Short-Term Memory Network-based autoencoder, called LAAD, to predict COVID-19 infection by detecting abnormal resting heart rate in test data relative to the users baseline.
FindingsWe detected an abnormal resting heart rate during the period of viral infection (7 days before the symptom onset and 21 days after) in 92% (23 out of 25 cases) of patients with laboratory-confirmed COVID-19. In 56% (14) of cases, LAAD detection identified cases in their pre-symptomatic phase, whereas 36% (9 cases) were detected after the onset of symptoms, with an average precision score of 0.91 (SD 0.13, 95% CI 0.854-0.967), a recall score of 0.36 (0.295, 0.232-0.487), and an F-beta score of 0.79 (0.226, 0.693-0.888). In COVID-19 positive patients, abnormal RHR patterns start 5 days before symptom onset (6.9 days in pre-symptomatic cases and 1.9 days later in post-symptomatic cases). COVID-19+ patients have longer abnormal resting heart rate periods (89 hours or 3.7 days) as compared to healthy individuals (25 hours or 1.1 days).
InterpretationThese findings show that deep learning neural networks and wearables data are an effective method for the early detection of COVID-19 infection. Additional validation data will help guide the use of this and similar techniques in real-world infection surveillance and isolation policies to reduce transmission and end the pandemic.
FundingThis work was supported by NIH grants and gifts from the Flu Lab, as well as departmental funding from the Stanford Genetics department. The Google Cloud Platform costs were covered by Google for Education academic research and COVID-19 grant awards.
Research in context - Evidence before the study: COVID-19 resulted in up to 1.7 million deaths worldwide in 2020. COVID-19 detection using laboratory tests is usually performed after symptom onset. This delay can allow the spread of viral infection and can cause outbreaks. We searched PubMed, Google, and Google Scholar for research articles published in English up to Dec 1, 2020, using common search terms including "COVID-19 and wearables", "Resting heart rate and viral infection", "Resting heart rate and COVID-19", "machine learning and COVID-19" and "deep-learning and COVID-19". Previous studies have attempted to use an elevated resting heart rate as an indicator of viral infection. Although these studies have investigated the early prediction of COVID-19 using resting heart rate and other wearables data, studies investigating a deep learning-based prediction model with performance evaluation metrics at the user level have not been reported.
Added value of this studyIn this study, we created a deep-learning system that used wearables data such as abnormal resting heart rate to predict COVID-19 before the symptom onset. The deep-learning system was created using retrospective time-series datasets collected from 25 COVID-19+ patients, 11 non-COVID-19, and 70 healthy individuals. To our knowledge, this is the first deep-learning model to identify an early viral infection using wearables data at the user level. This study also greatly extends our previous phase-1 study and factors in the unpredictable behavior and time-series nature of the data, the limited data size, and the lack of data labels when evaluating performance metrics. The use of a real-time version of this model using more data along with user feedback may help to scale early detection as the number of patients with COVID-19 continues to grow.
Implications of all the available evidenceIn the future, wearable devices may provide high-resolution sleep, temperature, oxygen saturation, respiration rate, and electrocardiogram data, which could be used to further characterize an individuals baseline and improve the deep-learning model performance for infectious disease detection. Using multi-sensor data with a real-time deep-learning model has the potential to alert individuals of illness prior to symptom onset and may greatly reduce viral spread. | infectious diseases
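The Methods above state that resting heart rate was taken from readings preceded by 12 minutes of zero steps, which then feed the LAAD anomaly detector. The pandas sketch below illustrates only that filtering step on fabricated one-minute data; the column names, sampling interval and aggregation are assumptions, not the study's actual pipeline.

```python
import numpy as np
import pandas as pd

# Fabricated one-minute Fitbit-style data: heart rate and step counts.
idx = pd.date_range("2020-03-01", periods=3 * 24 * 60, freq="1min")
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "heart_rate": rng.normal(70, 5, len(idx)),
    "steps": rng.choice([0, 0, 0, 20], size=len(idx)),
}, index=idx)

# Resting heart rate: keep readings whose preceding 12 minutes contain zero steps
# (one reading of the abstract's "12 minutes ahead of an interrogated time point").
still = df["steps"].rolling("12min").sum() == 0
rhr = df.loc[still, "heart_rate"]

# Daily summaries like this could then be compared against a personal baseline,
# e.g. by an LSTM autoencoder such as LAAD, to flag abnormal days.
daily_rhr = rhr.resample("1D").median()
print(daily_rhr.head())
```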
10.1101/2021.01.08.21249188 | Visual Impairment and Risk of Dementia: the UK Biobank Study | INTRODUCTIONThe association between visual impairment (VI) and the risk of dementia has been poorly understood. We sought to investigate the VI-dementia relationship in the UK Biobank Study.
METHODSA total of 117,187 volunteers (aged 40-69 years) deemed free of dementia at baseline were included. Habitual distance visual acuity worse than 0.3 logMAR units in the better-seeing eye was used to define VI. Incident dementia was ascertained from electronically linked hospital inpatient and death records.
RESULTSDuring a median follow up of 5.96 years, the presence of VI was significantly associated with incident dementia (HR=1.78, 95% CI: 1.18-2.68, P=0.006). There was a clear trend between the severity of VI and the risk of dementia (P for trend=0.002).
DISCUSSIONVisually impaired individuals were more likely to develop incident dementia, with a progressively greater risk among those with worse visual acuity. Our findings highlight the value of regular vision screening and elimination of VI.
HIGHLIGHTS:
- The association between VI and dementia has been poorly understood.
- VI is associated with incident dementia in non-demented adults.
- There is a clear trend between the severity of VI and the risk of dementia.
- VI may be a marker of increased dementia risk.
RESEARCH IN CONTEXT - SYSTEMATIC REVIEW: We searched and reviewed the literature using traditional sources (e.g., PubMed and Google Scholar). While the association between VI and cognitive function/decline is increasingly studied, investigation of the association between VI and the risk of dementia has been largely overlooked.
INTERPRETATIONWe found that visually impaired individuals were more likely to develop incident dementia, with a progressively greater risk among those with worse visual acuity. Our findings imply that VI may be an important marker of dementia.
FUTURE DIRECTIONSThese findings call for more studies to investigate (a) the role of visual acuity changes on the risk of dementia; (b) the relationship between other components of visual function and incident dementia; (c) the relationship between eye diseases and incident dementia; and (d) the potential benefits of vision rehabilitation on dementia prevention. | ophthalmology |
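The Methods above report hazard ratios from Cox proportional hazards models of incident dementia on baseline visual impairment. As a generic illustration of how such a model is commonly fitted in Python, the sketch below uses the lifelines package on synthetic data; the variable names, covariates and effect sizes are assumptions, not the UK Biobank analysis.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Synthetic cohort: visual impairment (VI) at baseline and time to incident dementia.
rng = np.random.default_rng(1)
n = 5000
vi = rng.binomial(1, 0.05, n)
age = rng.uniform(40, 69, n)
hazard = 0.001 * np.exp(0.8 * vi + 0.05 * (age - 55))   # planted VI effect
time_to_event = rng.exponential(1 / hazard)
follow_up = 6.0                                          # ~6 years, as in the study

df = pd.DataFrame({
    "duration": np.minimum(time_to_event, follow_up),
    "event": (time_to_event < follow_up).astype(int),
    "vi": vi,
    "age": age,
})

cph = CoxPHFitter()
cph.fit(df, duration_col="duration", event_col="event")
print(cph.summary[["coef", "exp(coef)", "p"]])           # exp(coef) = hazard ratio
```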
10.1101/2021.01.10.20248831 | Factors indicating intention to vaccinate with a COVID-19 vaccine among older U.S. Adults | BACKGROUNDThe success of vaccination efforts to curb the COVID-19 pandemic will require broad public uptake of immunization and highlights the importance of understanding factors associated with willingness to receive a vaccine.
METHODSAdults enrolled in the Heartline clinical study were invited to complete a COVID-19 vaccine assessment through the Heartline mobile application between November 6-20, 2020. Factors associated with willingness to receive a COVID-19 vaccine were evaluated using an ordered logistic regression as well as a Random Forest classification algorithm.
RESULTSAmong 9,106 study participants, 81.3% (n=7402) responded and had available demographic data. The majority (91.3%) reported a willingness to be vaccinated. Factors most strongly associated with vaccine willingness were beliefs about the safety and efficacy of COVID-19 vaccines and vaccines in general. Women and Black or African American respondents reported lower willingness to vaccinate. Among those less willing to get vaccinated, 66.2% said that they would talk with their health provider before making a decision. During the study, positive results from the first COVID-19 vaccine outcome study were released; vaccine willingness increased after this report.
CONCLUSIONSEven among older adults at high-risk for COVID-19 complications who are participating in a longitudinal clinical study, 1 in 11 reported lack of willingness to receive COVID-19 vaccine in November 2020. Variability in vaccine willingness by gender, race, education, and income suggests the potential for uneven vaccine uptake. Education by health providers directed toward assuaging concerns about vaccine safety and efficacy can help improve vaccine acceptance among those less willing.
Clinicaltrials.gov NCT04276441 | infectious diseases |
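The Methods above mention an ordered logistic regression and a Random Forest classifier for identifying predictors of vaccine willingness. The snippet below sketches only the Random Forest half on synthetic survey-style data, reading off feature importances; the feature names, Likert coding and simulated effect sizes are invented for illustration and do not come from the Heartline study.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

# Synthetic stand-in for survey responses (hypothetical variables and coding).
rng = np.random.default_rng(2)
n = 7402
X = pd.DataFrame({
    "belief_vaccine_safe": rng.integers(1, 6, n),        # 1-5 Likert scale
    "belief_vaccine_effective": rng.integers(1, 6, n),
    "age": rng.integers(65, 90, n),
    "female": rng.integers(0, 2, n),
})
# Willingness driven mainly by safety/efficacy beliefs in this toy example.
logit = -4.0 + 1.0 * X["belief_vaccine_safe"] + 0.6 * X["belief_vaccine_effective"]
willing = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, willing)
importances = pd.Series(clf.feature_importances_, index=X.columns)
print(importances.sort_values(ascending=False))
```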
10.1101/2021.01.06.21249342 | Indoor dust as a matrix for surveillance of COVID-19 outbreaks | Ongoing disease surveillance is a critical tool to mitigate viral outbreaks, especially during a pandemic. Environmental monitoring has significant promise even following widespread vaccination among high-risk populations. The goal of this work is to demonstrate molecular SARS-CoV-2 monitoring in bulk floor dust and related samples as a proof-of-concept of a non-invasive environmental surveillance methodology for COVID-19 and potentially other viral diseases. Surface swab, passive sampler, and bulk floor dust samples were collected from rooms of individuals infected with COVID-19, and SARS-CoV-2 was measured with quantitative reverse transcription polymerase chain reaction (RT-qPCR) and two digital PCR (dPCR) methods. Bulk dust samples had a geometric mean concentration of 159 copies/mg-dust and ranged from non-detects to 23,049 copies/mg-dust detected using ddPCR. An average of 88% of bulk dust samples were positive for the virus across detection methods, compared to 55% of surface swabs and fewer on the passive sampler (19% carpet, 29% polystyrene). In bulk dust, SARS-CoV-2 was detected in 76%, 93%, and 97% of samples measured by qPCR, chip-based dPCR, and droplet dPCR, respectively. Detectable viral RNA in the bulk vacuum bags did not measurably decay over 4 weeks, despite the application of a disinfectant before room cleaning. Future monitoring efforts should further evaluate RNA persistence and heterogeneity in dust. This study did not measure virus viability in dust or potential transmission associated with dust. Overall, this work demonstrates that bulk floor dust is a potentially useful matrix for long-term monitoring of viral disease outbreaks in high-risk populations and buildings.
ImportanceEnvironmental surveillance to assess pathogen presence within a community is proving to be a critical tool to protect public health, and it is especially relevant during the ongoing COVID-19 pandemic. Importantly, environmental surveillance tools also allow for the detection of asymptomatic disease carriers and for routine monitoring of a large number of people as has been shown for SARS-CoV-2 wastewater monitoring. However, additional monitoring techniques are needed to screen for outbreaks in high-risk settings such as congregate care facilities. Here, we demonstrate that SARS-CoV-2 can be detected in bulk floor dust collected from rooms housing infected individuals. This analysis suggests that dust may be a useful and efficient matrix for routine surveillance of viral disease outbreaks. | infectious diseases |
10.1101/2021.01.06.20248486 | Salivary DNA loads for human herpes viruses 6 and 7 are correlated with disease phenotype in Myalgic Encephalomyelitis/ Chronic Fatigue Syndrome | Myalgic Encephalomyelitis/ Chronic Fatigue Syndrome (ME/CFS) is a complex chronic condition affecting multiple body systems, with unknown cause, unclear pathogenesis mechanisms, and fluctuating symptoms which may lead to severe debilitation. It is frequently reported to have been triggered by an infection, particularly with herpes virus family members; however, there are no clear differences in exposure to, or seroprevalence of, any herpes virus in people with ME/CFS and healthy individuals. Herpes viruses exist in lytic and latent forms, and it is possible that ME/CFS is associated with viral reactivation, which has not been detectable previously due to insensitive testing methods.
Saliva samples were collected from 30 people living with ME/CFS at monthly intervals for six months and at times when they experienced symptom exacerbation, as well as from 14 healthy control individuals. The viral DNA load of the nine human herpes viruses was determined by digital droplet PCR. Symptoms were assessed by questionnaire at each time point.
Human herpes virus (HHV) 6B, HHV-7, herpes simplex virus 1 and Epstein-Barr virus were detectable within the saliva samples, with higher HHV-6B and HHV-7 viral loads detected in people with ME/CFS than in healthy controls. Participants with ME/CFS could be broadly separated into two groups: one group displayed fluctuating patterns of herpes viruses detectable across the six months, while the second group displayed more stable viral presentation. In the first group, there was a positive correlation between HHV-6B and HHV-7 viral loads and the severity of symptom scores, including pain, neurocognition and autonomic dysfunction.
The results indicate that fluctuating viral load, related to herpesvirus reactivation state, may play a role in ME/CFS pathogenesis, or might be a consequence of dysregulated immune function. The sampling strategy and molecular tools developed permit large-scale epidemiological investigations.
Contribution to the FieldThe cause of ME/CFS and the mechanisms underlying disease pathogenesis are not known, although symptoms are often triggered by infection. Human herpes virus (HHV) family members have been implicated, although there is no difference in the seroprevalence of any HHV in people with ME/CFS and healthy controls, indicating a similar prior infection rate. We have used droplet digital PCR, a sensitive and specific method, to measure the prevalence and DNA concentration of HHVs in the saliva of people with ME/CFS and controls, and analysed the correlation with disease over a six-month time course. We found that two HHVs, HHV-7 and HHV-6B, were elevated in saliva from people with ME/CFS, and that in people who were severely affected by ME/CFS, the concentration of HHV DNA correlated with symptom severity over time in a subgroup of patients with a fluctuating salivary HHV repertoire. Our study demonstrates the feasibility of measuring HHV concentration in readily acquired samples, enabling future large-scale studies aimed at testing the causal role of HHV reactivation in ME/CFS disease. | infectious diseases
10.1101/2021.01.06.21249313 | Cohort Profile: a Prospective Household cohort study of Influenza, Respiratory Syncytial virus, and other respiratory pathogens community burden and Transmission dynamics in South Africa (PHIRST), 2016-2018 | PurposeThe PHIRST study (Prospective Household cohort study of Influenza, Respiratory Syncytial virus, and other respiratory pathogens community burden and Transmission dynamics in South Africa) aimed to estimate the community burden of influenza and respiratory syncytial virus (RSV) including the incidence of infection, symptomatic fraction, and disease severity, and to assess household transmission. We further aimed to estimate the impact of HIV infection and age on disease burden and transmission, and to assess the burden of Bordetella pertussis and Streptococcus pneumoniae.
ParticipantsWe enrolled 1684 individuals in 327 randomly selected households in two sites (rural Agincourt subdistrict, Mpumalanga Province and urban Jouberton Township, North West Province) over 3 consecutive influenza and RSV seasons. A new cohort of households was enrolled each year. Eligible households included those with >2 household members where ≥80% of household members provided consent (and assent for children aged 7-17 years). Enrolled household members were sampled with nasopharyngeal swabs twice weekly during the RSV and influenza seasons of the year of enrolment. Serology samples were collected at enrolment and before and after the influenza season annually.
Findings to dateThere were 122,113 potential individual follow-up visits over the 3 years, and participants were interviewed for 105,783 (87%) of these. Out of 105,683 nasopharyngeal swabs from follow-up visits, 1,258 (1%), 1,026 (1%), 273 (<1%), and 38,829 (37%) tested positive by PCR for influenza viruses, respiratory syncytial virus, pertussis and pneumococcus, respectively.
Future plansFuture planned analyses include analysis of influenza serology results and RSV burden and transmission. Households enrolled in the PHIRST study during 2016-2018 were eligible for inclusion in a study of SARS-CoV-2 transmission initiated in July 2020. This study uses similar testing frequency and household selection methods to assess the community burden of SARS-CoV-2 infection and the role of asymptomatic infection in virus transmission.
RegistrationClinicalTrials.gov NCT02519803
Article summary - Strengths and limitations of this study:
- PHIRST was conducted in urban and rural African settings with high HIV prevalence, allowing assessment of the effect of HIV on community burden and transmission dynamics of respiratory pathogens.
- Households were selected randomly to provide a representative sample of the community. Twice-weekly sampling from each cohort of individuals for 6-10 months irrespective of symptoms allows estimation of community burden, household secondary infection risk, and serial interval, including asymptomatic or paucisymptomatic episodes.
- Polymerase chain reaction testing of >100,000 nasopharyngeal swab samples for multiple pathogens (influenza, respiratory syncytial virus, pertussis and Streptococcus pneumoniae) allows detailed examination of disease burden, transmission, and pathogen interactions.
- PHIRST was not powered to assess severe outcomes (i.e. hospitalisation and death).
- We only examined four pathogens, but other micro-organisms may be important. Samples have been stored, which could allow us to implement broader multi-pathogen testing in the future. | infectious diseases
10.1101/2021.01.10.21249520 | Viral mutation, contact rates and testing: a DCM study of fluctuations | This report considers three mechanisms that might underlie the course of the secondary peak of coronavirus infections in the United Kingdom. It considers: (i) fluctuations in transmission strength; (ii) seasonal fluctuations in contact rates and (iii) fluctuations in testing. Using dynamic causal modelling, we evaluated the contribution of all combinations of these three mechanisms using Bayesian model comparison. We found overwhelming evidence for the combination of all mechanisms, when explaining 16 types of data. Quantitatively, there was clear evidence for an increase in transmission strength of 57% over the past months (e.g., due to viral mutation), in the context of increased contact rates (e.g., rebound from national lockdowns) and increased test rates (e.g., due to the inclusion of lateral flow tests). Models with fluctuating transmission strength outperformed models with fluctuating contact rates. However, the best model included all three mechanisms suggesting that the resurgence during the second peak can be explained by an increase in effective contact rate that is the product of a rebound of contact rates following a national lockdown and increased transmission risk due to viral mutation. | infectious diseases |
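The report above compares all combinations of three mechanisms with Bayesian model comparison. The snippet below illustrates only the generic last step of such a comparison: turning log model evidences for the eight candidate models into posterior model probabilities under a uniform prior. The log-evidence values are invented placeholders; this is not the dynamic causal modelling machinery itself.

```python
import itertools
import numpy as np

mechanisms = ("transmission", "contacts", "testing")
models = list(itertools.product([0, 1], repeat=3))          # 8 on/off combinations

# Hypothetical (free-energy) log evidences, one per model; invented for illustration.
log_evidence = np.array([-120.0, -95.0, -110.0, -80.0, -90.0, -60.0, -85.0, -40.0])

# Posterior model probabilities under a uniform prior: a softmax of log evidences.
p = np.exp(log_evidence - log_evidence.max())
p /= p.sum()

for combo, prob in sorted(zip(models, p), key=lambda t: -t[1]):
    included = [m for m, flag in zip(mechanisms, combo) if flag] or ["none"]
    print(f"{'+'.join(included):32s} P(model | data) = {prob:.3f}")
```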
10.1101/2021.01.10.21249550 | Acute Metabolic Emergencies in Diabetes and COVID-19: a systematic review and meta-analysis of case reports | IntroductionCOVID-19 is associated with DKA (Diabetic Ketoacidosis), HHS (Hyperglycaemic Hyperosmolar State) and EDKA (Euglycaemic DKA). High mortality has been observed in COVID-19-related diabetic ketoacidosis; however, evidence is scarce.
Patients and MethodsA systematic literature review was conducted using EMBASE, PubMed/Medline, and Google Scholar from January to December 2020 to identify all case reports describing DKA, HHS, and EDKA, in COVID-19 patients. The Joanna Briggs Institute critical appraisal checklist for case reports was used for quality assessment. Univariate and multivariate analysis assessed correlations of study origin, combined DKA/HHS, age, BMI, HbA1c, administered antidiabetics, comorbidities, symptoms onset, disease status, CRP, ferritin, d-dimers, glucose, osmolarity, pH, bicarbonates, ketones, lactates, {beta}-hydroxybutyric acid, anion gap, and acute kidney injury (AKI) with outcome.
ResultsFrom 312 identified publications, 41, including 71 cases, were analyzed qualitatively and quantitatively. Multivariate analysis demonstrated that COVID-19 disease status 4 (P<0.001), AKI (P<0.001), pH ≤7.12 (P=0.032), and osmolarity ≥324 (P=0.034) are all independently correlated with death. COVID-19 disease status 4 (P=3×10^-8), combined DKA/HHS (P=0.021), and AKI (P=0.037) are independently correlated with death.
ConclusionCOVID-19 intertwines with acute metabolic emergencies in diabetes leading to increased mortality; key determinants are critical COVID-19 illness, co-presence of ketoacidosis and hyperosmosis and AKI. | endocrinology |
10.1101/2021.01.10.21249523 | Trends in lipid-modifying agent use in 83 countries | AimsLipid-modifying agents (LMAs) are increasingly used to reduce lipid levels and prevent cardiovascular events, but the magnitude of their consumption in different world regions is unknown. We aimed to describe recent global trends in LMA consumption and to explore the relationship between country-level LMA consumption and cholesterol concentrations.
Methods and resultsThis cross-sectional and ecological study used monthly pharmaceutical sales data from January 2008 to December 2018, for 83 countries from the IQVIA Multinational Integrated Data Analysis System, and total and non-high-density lipoprotein (non-HDL) cholesterol concentrations from the NCD Risk Factor Collaboration. The compound annual growth rate (CAGR) was used to assess changes in LMA consumption over time. From 2008 to 2018, the use of LMAs increased from 7,468 to 11,197 standard units per 1000 inhabitants per year (CAGR 4.13%). Statins were the most used class of LMA and their market share increased in 75% of countries between 2008 and 2018. Fibrates were the most consumed nonstatin drugs. From 2013 to 2018, consumption of low-density lipoprotein lowering therapies increased (statins 3.99%; ezetimibe 4.01%; proprotein convertase subtilisin/kexin type 9 (PCSK9) inhibitors 104.47%). Limited evidence supports a clear relationship between country-level changes in LMA consumption and total and non-HDL cholesterol concentrations in 2008 versus 2018.
ConclusionSince 2008, global access to LMAs, especially statins, has improved. In line with international lipid guideline recommendations, recent trends indicate growth in the use of statins, ezetimibe, and PCSK9 inhibitors. Country-level patterns of LMA use and total and non-HDL cholesterol varied considerably. | epidemiology |
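As a worked check of the growth figure quoted above, the compound annual growth rate follows directly from the 2008 and 2018 consumption values over the ten-year span:

```latex
\mathrm{CAGR} = \left(\frac{V_{2018}}{V_{2008}}\right)^{1/n} - 1
             = \left(\frac{11\,197}{7\,468}\right)^{1/10} - 1
             \approx 0.0413 = 4.13\% .
```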
10.1101/2021.01.10.21249537 | Excess mortality during the 1918-20 influenza pandemic in Czechia | This research letter provides a replicable estimate of the mortality that the 1918-20 influenza pandemic caused in Czechia. A monthly all-cause excess mortality model identified clear periods of pandemic mortality in September 1918 through May 1919 and January 1920 through May 1920. The total excess mortality in those months implies a pandemic death toll of 71,967 and a national death rate of 0.75%. | epidemiology |
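The letter above rests on a monthly all-cause excess mortality model but does not spell it out. A common, simple construction, sketched here on fabricated numbers, is to build an expected-deaths baseline from the same calendar months of non-pandemic years and take excess = observed minus expected; the death counts, baseline years and assumed population below are placeholders, not the Czech data.

```python
import numpy as np
import pandas as pd

# Fabricated monthly all-cause deaths, 1915-1920 (placeholder data only).
months = pd.period_range("1915-01", "1920-12", freq="M")
rng = np.random.default_rng(3)
seasonal = 9000 + 1500 * np.cos(2 * np.pi * (months.month - 1) / 12)
deaths = pd.Series(rng.poisson(seasonal).astype(float), index=months)
deaths.loc["1918-10":"1919-03"] += 4000              # injected "pandemic" mortality

# Expected baseline: mean deaths for the same calendar month in 1915-1917.
pre = deaths.loc[:"1917-12"]
expected_by_month = pre.groupby(pre.index.month).mean()
expected = pd.Series(deaths.index.month.map(expected_by_month), index=deaths.index)

excess = deaths - expected
pandemic_excess = excess.loc["1918-09":"1920-05"].clip(lower=0).sum()
population = 10_000_000                              # assumed population for the rate
print(f"excess deaths: {pandemic_excess:,.0f} "
      f"({pandemic_excess / population:.2%} of the population)")
```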
10.1101/2021.01.09.21249187 | Association of Visual Impairment with Risk for Future Parkinson's Disease | BackgroundAlthough visual dysfunction is one of the most common non-motor symptoms among patients with Parkinsons disease (PD), it is not known whether such dysfunction predates the onset of clinical PD.
ObjectivesTo examine the association of visual impairment (VI) with the future development of PD in the UK Biobank Study.
MethodsThe UK Biobank Study is one of the largest prospective cohort studies of health, enrolling over 500,000 participants aged 40-69 years between 2006 and 2010 across the UK. VI was defined as a habitual distance visual acuity (VA) worse than 0.3 LogMAR in the better-seeing eye. Incident cases of PD were determined by self report data, hospital admission records or death records, whichever came first. Multivariable Cox proportional hazard regression models were used to investigate the association between VI and the risk of incident PD.
ResultsA total of 117,050 participants were free of PD at the baseline assessment. During the median observation period of 5.96 (interquartile range [IQR]: 5.77-6.23) years, PD occurred in 222 (0.19%) participants. Visually impaired participants were at a higher risk of developing PD than non-VI participants (p<0.001). Compared with the non-VI group, the adjusted hazard ratio was 2.28 (95% CI 1.29-4.04, p=0.005) in the VI group. These results were consistent in the sensitivity analysis, where incident PD cases diagnosed within one year after the baseline assessment were excluded.
ConclusionsThis prospective cohort study found that VI was associated with an increased risk of incident PD, suggesting that VI may represent a prodromal feature of PD. | epidemiology |
10.1101/2021.01.09.21249522 | A Fractional Order Dengue Fever Model in the Context of Protected Travellers | Climate changes are affecting the control of many vector-borne diseases, particularly in Africa. In this work, a dengue fever model with protected travellers is formulated. The Caputo-Fabrizio operator is utilized to obtain some qualitative information about the disease. The basic properties and the reproduction number are studied. The two steady states are determined and are found to be locally asymptotically stable. Fixed point theory is used to establish the existence and uniqueness of solutions of the model. The numerical simulation suggests that the fractional order affects the dynamics of dengue fever. | epidemiology
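For readers unfamiliar with the operator named above, the Caputo-Fabrizio fractional derivative is usually written with a non-singular exponential kernel; one commonly quoted form, for order 0 < α < 1 and a normalisation function M(α) with M(0) = M(1) = 1, is:

```latex
{}^{\mathrm{CF}}\!D_t^{\alpha} f(t)
  = \frac{M(\alpha)}{1-\alpha}
    \int_{0}^{t} f'(\tau)\,
    \exp\!\left[-\frac{\alpha\,(t-\tau)}{1-\alpha}\right] \mathrm{d}\tau ,
  \qquad 0 < \alpha < 1 .
```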
10.1101/2021.01.09.21249145 | Plasma β-amyloid in mild behavioural impairment - neuropsychiatric symptoms on the Alzheimer's continuum | IntroductionSimple markers are required to recognize older adults at higher risk for neurodegenerative disease. Mild behavioural impairment (MBI) and plasma β-amyloid (Aβ) have been independently implicated in the development of incident cognitive decline and dementia. Here we studied the associations between MBI and plasma Aβ42/Aβ40.
MethodsParticipants with normal cognition (n = 86) or mild cognitive impairment (n = 53) were selected from the Alzheimers Disease Neuroimaging Initiative. MBI scores were derived from Neuropsychiatric Inventory items. Plasma Aβ42/Aβ40 ratios were assayed using mass spectrometry. Linear regressions were fitted to assess the associations of MBI total score and MBI domain scores with plasma Aβ42/Aβ40.
ResultsLower plasma Aβ42/Aβ40 was associated with higher MBI total score (p = 0.04) and greater affective dysregulation (p = 0.04), but not with the impaired drive/motivation (p = 0.095) or impulse dyscontrol (p = 0.29) MBI domains.
ConclusionIn persons with normal cognition or mild cognitive impairment, MBI was associated with low plasma Aβ42/Aβ40. Incorporating MBI into case detection can help capture preclinical and prodromal Alzheimers disease. | psychiatry and clinical psychology
10.1101/2021.01.07.20248981 | Polygenic risk for immuno-metabolic markers and specific depressive symptoms: A multi-sample network analysis study | BackgroundAbout every fourth patient with major depressive disorder (MDD) shows evidence of systemic inflammation. Previous studies have shown inflammation-depression associations of multiple serum inflammatory markers and multiple specific depressive symptoms. It remains unclear, however, if these associations extend to genetic/lifetime predisposition to higher inflammatory marker levels and what role metabolic factors such as Body Mass Index (BMI) play. It is also unclear whether inflammation-symptom associations reflect direct or indirect associations, which can be disentangled using network analysis.
MethodsThis study examined associations of polygenic risk scores (PRSs) for immuno-metabolic markers (C-reactive protein [CRP], interleukin [IL]-6, IL-10, tumour necrosis factor [TNF]-α, BMI) with seven depressive symptoms in one general population sample, the UK Biobank study (n=110,010), and two patient samples, the Munich Antidepressant Response Signature (MARS, n=1,058) and Sequenced Treatment Alternatives to Relieve Depression (STAR*D, n=1,143) studies. Network analysis was applied jointly for these samples using fused graphical least absolute shrinkage and selection operator (FGL) estimation as the primary analysis and, individually, using unregularized model search estimation. Stability of results was assessed using bootstrapping and three consistency criteria were defined to appraise robustness and replicability of results across estimation methods, network bootstrapping, and samples.
ResultsNetwork analysis results displayed to-be-expected PRS-PRS and symptom-symptom associations (termed edges), respectively, that were mostly positive. Using FGL estimation, results further suggested 28, 29, and six PRS-symptom edges in MARS, STAR*D, and UK Biobank samples, respectively. Unregularized model search estimation suggested three PRS-symptom edges in the UK Biobank sample. Applying our consistency criteria to these associations indicated that only the association of higher CRP PRS with greater changes in appetite fulfilled all three criteria.
Four additional associations fulfilled at least two consistency criteria; specifically, higher CRP PRS was associated with greater fatigue and reduced anhedonia, higher TNF-α PRS was associated with greater fatigue, and higher BMI PRS with greater changes in appetite and anhedonia. Associations of the BMI PRS with anhedonia, however, showed an inconsistent valence across estimation methods.
ConclusionsGenetic predisposition to higher systemic inflammatory markers is primarily associated with somatic/neurovegetative symptoms of depression such as changes in appetite and fatigue, consistent with previous studies based on circulating levels of inflammatory markers. We extend these findings by providing evidence that associations are direct (using network analysis) and extend to genetic predisposition to immuno-metabolic markers (using PRSs). Our findings can inform selection of patients with inflammation-related symptoms into clinical trials of immune-modulating drugs for MDD. | psychiatry and clinical psychology
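The fused graphical lasso (FGL) used above estimates sparse partial-correlation networks jointly across several samples. scikit-learn does not ship the fused variant, so the sketch below uses the ordinary single-sample graphical lasso purely to illustrate the underlying idea of reading network edges off a regularised precision matrix; the variables and planted effects are synthetic placeholders, not the MARS, STAR*D or UK Biobank data.

```python
import numpy as np
from sklearn.covariance import GraphicalLassoCV

# Synthetic stand-in for PRS and symptom variables with two planted edges.
rng = np.random.default_rng(4)
n = 1000
crp_prs = rng.normal(size=n)
appetite = 0.3 * crp_prs + rng.normal(size=n)      # PRS-symptom edge
fatigue = 0.4 * appetite + rng.normal(size=n)      # symptom-symptom edge
mood = rng.normal(size=n)                          # unconnected node
X = np.column_stack([crp_prs, appetite, fatigue, mood])
names = ["CRP_PRS", "appetite", "fatigue", "mood"]

# Precision-matrix entries near zero are treated as absent (conditional) edges.
model = GraphicalLassoCV().fit(X)
prec = model.precision_
for i in range(len(names)):
    for j in range(i + 1, len(names)):
        if abs(prec[i, j]) > 0.05:
            print(f"edge: {names[i]} -- {names[j]} (precision {prec[i, j]:.2f})")
```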
10.1101/2021.01.10.21249298 | Accurate Covid-19 prevalence measurement in the field | Timely, accurate epidemic figures are necessary for informed policy. In the Covid-19 pandemic, mismeasurement can lead to tremendous waste, in health or economic output. "Random" testing is commonly used to estimate virus prevalence, reporting daily positivity rates. However, since testing is necessarily voluntary, all "random" tests done in the field suffer from selection bias. This bias, unlike standard polling biases, goes beyond demographic representativeness and cannot be corrected by oversampling (i.e. selecting people without symptoms to test). Using controlled, incentivized experiments on a sample of all ages, we show that people who feel symptoms are up to 33 times more likely to seek testing. The bias in testing propensities leads to sizable prevalence bias: test positivity is inflated by up to five times, even if testing is costless. This effect varies greatly across time and age groups, making comparisons over time and across countries misleading. We validate our results using the REACT study in the UK and find that positivity figures have indeed a very large and time-varying bias. We present calculations to debias positivity rates, but importantly, suggest a parsimonious way to sample the population bypassing the bias altogether. Our estimation is both real-time and consistently close to true values. These results are relevant for all epidemics, beyond Covid-19, whenever carriers have informative beliefs about their own status. | public and global health
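A small simulation makes the selection-bias mechanism described above concrete: if symptomatic people are far more likely to come forward for "random" testing, observed positivity can sit several-fold above true prevalence. The prevalence, symptom rates and the 33-fold propensity ratio below are illustrative assumptions loosely echoing the abstract, not the paper's estimates.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 1_000_000
true_prev = 0.02                                   # assumed true infection prevalence

infected = rng.random(n) < true_prev
# Assumed symptom rates among infected and uninfected people.
symptomatic = np.where(infected, rng.random(n) < 0.60, rng.random(n) < 0.08)

# Testing propensity: symptomatic people 33x more likely to volunteer for a test.
p_test = np.where(symptomatic, 0.33, 0.01)
tested = rng.random(n) < p_test

observed_positivity = infected[tested].mean()
print(f"true prevalence    : {true_prev:.1%}")
print(f"observed positivity: {observed_positivity:.1%}")
print(f"inflation factor   : {observed_positivity / true_prev:.1f}x")
```

With these made-up inputs the observed positivity comes out around five times the true prevalence, the same order of inflation the authors report.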
10.1101/2021.01.10.21249545 | Association of Chronic Acid Suppression and Social Determinants of Health with COVID-19 Infection | Acid suppressants are a widely-used class of medications previously linked to an increased risk of aerodigestive infections. However, prior studies of these medications as potentially reversible risk factors for COVID-19 have been conflicting. We performed a case-control study involving clinician-abstracted data from 900 health records across 3 US medical centers. We incorporated sociobehavioral predictors of infectious exposure using geomapping to publicly-available data. We found no evidence for an association between chronic acid suppression and incident COVID-19 (adjusted odds ratio 1.04, 95% CI: 0.92-1.17, P=0.515). However, we identified several medical and social features as positive (Latinx ethnicity, BMI ≥ 30, dementia, public transportation use, month of the pandemic) and negative (female sex, concurrent solid tumor, alcohol use disorder) predictors of new-onset infection. These results place both medical and social factors on the same scale within the context of the COVID-19 pandemic, and underscore the importance of comprehensive models of disease. | public and global health
10.1101/2021.01.10.21249538 | Vulnerabilities in child wellbeing among primary school children: a cross-sectional study in Bradford, UK | ObjectiveTo describe the prevalence of factors related to wellbeing among primary school children in a deprived multi-ethnic community.
Design and participantsCross-sectional survey of 15,641 children aged 7-10 years in Born in Bradfords Primary School Years study: whole-classroom samples in 89 Bradford primary schools between 2016 and 2019.
Main outcome measuresPrevalence estimates by ethnicity (%, 95% CI) of single and multiple vulnerabilities in child wellbeing within and across four domains (home, family, relationships; material resources; friends and school; subjective wellbeing).
ResultsOnly 10% of children have no vulnerabilities in any domain of wellbeing; 10% have one or more vulnerabilities in all four domains. The highest prevalence estimates were for being bullied some or all of the time (52.7%, 51.9 to 53.4%), keeping worries to oneself (31.2%, 30.5 to 31.9%), having no park near home (30.8%, 30.1 to 31.5%) and worrying all the time about how much money their family has (26.3%, 25.6 to 27%).
Boys were consistently significantly more likely than girls to report all of the vulnerabilities in the Home, Family and Family Relationships domain, and the majority of indicators in the other domains, and in all domains except Friends and School, boys were significantly more likely to have at least one vulnerability. Girls were significantly more likely to report not having many friends (16.7%, 95% CI: 15.9 to 17.6% vs. 12.5%, 95% CI: 11.8 to 13.2%), being bullied some or all of the time (55.8%, 95% CI: 54.7 to 56.9% vs. 49.7%, 95% CI: 48.6 to 50.8%) and feeling left out all the time (12.1%, 95% CI: 11.4 to 12.8% vs. 10.3%, 95% CI: 9.7 to 11.0%).
Variations in vulnerabilities by ethnicity were complex, with children in Black, Asian and Minority Ethnic groups sometimes reporting more vulnerabilities and sometimes fewer than White British children. For example, compared to children of Pakistani heritage, White British children were more likely to say that their family never gets along well (6.3%, 5.6 to 7.1% vs. 4.1%, 3.6 to 4.6%) and to have no access to the internet at home (22.3%, 21 to 23.6% vs. 18%, 17 to 18.9%). Children with Pakistani heritage were more likely than White British children to say they had no park near their home where they can play with friends (32.7%, 31.6 to 33.9% vs. 29.9%, 28.6 to 31.3%), to report not having three meals a day (17.9%, 16.9 to 18.8% vs. 11.9%, 10.9 to 12.9%) and to worry all the time about how much money their families have (29.3%, 28.2 to 30.3% vs. 21.6%, 20.4 to 22.9%). Gypsy/Irish Traveller children were less likely than White British children to say they were bullied some or all of the time (42.2%, 35.4 to 49.4% vs. 53.8%, 52.3 to 55.3%), but more likely to say they were mean to others all the time (9.9%, 6.3 to 15.2% vs. 4%, 3.5 to 4.7%) and can never work out what to do when things are hard (15.2%, 10.6 to 21.2% vs. 9%, 8.2 to 9.9%).
We considered six vulnerabilities to be of particular concern during the current Covid-19 pandemic and associated national and local lockdowns: family never gets along well together; no garden where child can play; no nearby park where they can play; not having 3 meals a day; no internet at home; worried about money all the time. Pre-pandemic, 37.4% (36.6 to 38.3%) of Bradford children had one of these vulnerabilities and a further 29.6% (28.9 to 30.4%) had more than one.
ConclusionsAlthough most primary school children aged 7-10 in our study have good levels of wellbeing on most indicators across multiple domains, fewer than 10% have no vulnerabilities at all, a worrying 10% have at least one vulnerability in all four domains we studied, and two thirds have vulnerabilities of concern during the Covid-19 lockdowns. | public and global health
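As a rough guide to how the confidence intervals quoted above behave, a normal-approximation interval for a simple proportion reproduces the reported ranges closely; the survey's exact method (including any adjustment for clustering by school) is not stated here, so this is only an approximation.

```python
import math

def approx_ci(p, n, z=1.96):
    """Normal-approximation 95% CI for a proportion (no clustering adjustment)."""
    se = math.sqrt(p * (1 - p) / n)
    return p - z * se, p + z * se

# Example: 52.7% of 15,641 children reported being bullied some or all of the time.
low, high = approx_ci(0.527, 15641)
print(f"52.7% (95% CI {low:.1%} to {high:.1%})")   # roughly 51.9% to 53.5%
```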
10.1101/2021.01.10.21249532 | Effectiveness of regional restrictions in reducing SARS-CoV-2 transmission during the second wave of COVID-19, Italy. | To counter the second COVID-19 wave in autumn 2020, the Italian government introduced a system of physical distancing measures organized in progressively restrictive tiers (coded as yellow, orange, and red) and imposed on a regional basis according to epidemiological risk assessments. Attendance at locations outside residential settings progressively declined with the tiers, but less than during the national lockdown against the first COVID-19 wave in the spring. The reproduction number Rt decreased below the epidemic threshold in 85 out of 107 provinces after the introduction of the tier system, reaching average values of about 0.99, 0.89 and 0.77 in the yellow, orange and red tiers, respectively. We estimate that the reduced transmissibility resulted in averting about 37% of the hospitalizations between November 5 and November 25, 2020. These results are instrumental to inform public health efforts aimed at preventing future resurgence of cases. | public and global health
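The reproduction number Rt quoted above is typically estimated from case incidence through the renewal equation, dividing new cases by the infectiousness contributed by recent cases weighted by the generation-interval distribution. The sketch below is a bare-bones version of that calculation on fabricated counts and an assumed generation-interval distribution; it is not the surveillance pipeline used for the Italian provinces.

```python
import numpy as np

# Assumed discretised generation-interval weights (lags of 1..9 days).
w = np.array([0.05, 0.15, 0.22, 0.20, 0.15, 0.10, 0.07, 0.04, 0.02])
w = w / w.sum()

# Fabricated daily cases: growth, then decline after restrictions tighten.
cases = np.concatenate([
    100 * 1.05 ** np.arange(30),
    100 * 1.05 ** 29 * 0.97 ** np.arange(1, 31),
]).astype(int)

def rt_renewal(incidence, weights):
    """Crude Rt: cases today divided by weighted infectiousness of recent cases."""
    rt = np.full(len(incidence), np.nan)
    for t in range(len(weights), len(incidence)):
        past = incidence[t - len(weights):t][::-1]   # I[t-1], I[t-2], ...
        denom = np.dot(weights, past)
        if denom > 0:
            rt[t] = incidence[t] / denom
    return rt

rt = rt_renewal(cases, w)
print("mean Rt during growth :", round(np.nanmean(rt[15:30]), 2))
print("mean Rt during decline:", round(np.nanmean(rt[45:60]), 2))
```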
10.1101/2021.01.10.21249348 | Evaluation of Ultrafast Wave-CAIPI 3D FLAIR in the Visualization and Volumetric Estimation of Cerebral White Matter Lesions | BACKGROUND AND PURPOSETo evaluate an ultrafast 3D-FLAIR sequence using Wave-CAIPI encoding (Wave-FLAIR) compared to standard 3D-FLAIR in the visualization and volumetric estimation of cerebral white matter lesions in a clinical setting.
MATERIALS AND METHODS42 consecutive patients underwent 3T brain MRI including standard 3D-FLAIR (acceleration factor R=2, scan time TA=7:15 minutes) and resolution-matched ultrafast Wave-FLAIR sequences (R=6, TA=2:45 minutes for the 20-ch coil; R=9, TA=1:50 minutes for the 32-ch coil) as part of clinical evaluation for demyelinating disease. Automated segmentation of cerebral white matter lesions was performed using the Lesion Segmentation Tool in SPM. Students t-test, intra-class correlation coefficient (ICC), relative lesion volume difference (LVD) and Dice similarity coefficients (DSC) were used to compare volumetric measurements between sequences. Two blinded neuroradiologists evaluated the visualization of white matter lesions, artifact and overall diagnostic quality using a predefined 5-point scale.
RESULTSStandard and Wave-FLAIR sequences showed excellent agreement of lesion volumes with an ICC of 0.99 and DSC of 0.97±0.05 (range 0.84 to 0.99). Wave-FLAIR was non-inferior to standard-FLAIR for visualization of lesions and motion. The diagnostic quality for Wave-FLAIR was slightly greater than standard-FLAIR for infratentorial lesions (p<0.001), and there was less pulsation artifact on Wave-FLAIR compared to standard FLAIR (p<0.001).
CONCLUSIONSUltrafast Wave-FLAIR provides superior visualization of infratentorial lesions while preserving overall diagnostic quality and yields comparable white matter lesion volumes to those estimated using standard-FLAIR. The availability of ultrafast Wave-FLAIR may facilitate the greater use of 3D-FLAIR sequences in the evaluation of patients with suspected demyelinating disease. | radiology and imaging |
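The Dice similarity coefficient used above is defined as twice the volume of overlap divided by the sum of the two lesion volumes. The toy snippet below shows that computation on fabricated binary masks; it stands in for, and is not, the study's SPM-based segmentation comparison.

```python
import numpy as np

def dice(mask_a, mask_b):
    """Dice similarity coefficient between two binary masks: 2*|A and B| / (|A| + |B|)."""
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

# Fabricated 3D "lesion" masks standing in for standard- vs Wave-FLAIR segmentations.
rng = np.random.default_rng(6)
standard = rng.random((32, 32, 32)) > 0.97
wave = standard.copy()
flip = rng.random(standard.shape) > 0.999          # perturb a few voxels
wave[flip] = ~wave[flip]

print(f"DSC = {dice(standard, wave):.3f}")
```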
10.1101/2021.01.10.21249333 | Proteomic and Metabolomic Investigation of COVID-19 Patients with Elevated Serum Lactate Dehydrogenase | Serum lactate dehydrogenase (LDH) has been established as a prognostic indicator given its differential expression in COVID-19 patients. However, the underlying molecular mechanisms remain poorly understood. In this study, 144 COVID-19 patients were enrolled to monitor the clinical and laboratory parameters over three weeks. Serum LDH was elevated in the COVID-19 patients on admission and declined throughout the disease course, and its ability to classify patient severity outperformed other biochemical indicators. A threshold of 247 U/L serum LDH on admission was determined for severity prognosis. Next, we classified a subset of 14 patients into high- and low-risk groups based on serum LDH expression and compared their quantitative serum proteomic and metabolomic differences. The results showed that COVID-19 patients with high serum LDH exhibited differentially expressed blood coagulation and immune responses, including acute inflammatory responses, platelet degranulation and the complement cascade, as well as multiple altered metabolic responses including lipid metabolism, protein ubiquitination and pyruvate fermentation. Specifically, activation of hypoxia responses was highlighted in patients with high LDH expression. Taken together, our data show that serum LDH levels are associated with COVID-19 severity, and that elevated serum LDH might be a consequence of hypoxia and tissue injury induced by inflammation. | respiratory medicine
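The 247 U/L admission cut-off above is the kind of threshold often chosen by maximising Youden's J (sensitivity + specificity − 1) along an ROC curve; whether the authors used exactly that criterion is not stated, so the snippet below is only a generic illustration on synthetic LDH values.

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

# Synthetic admission LDH values (U/L) for non-severe vs severe patients.
rng = np.random.default_rng(7)
ldh = np.concatenate([rng.normal(210, 40, 100), rng.normal(330, 80, 44)])
severe = np.concatenate([np.zeros(100, dtype=int), np.ones(44, dtype=int)])

fpr, tpr, thresholds = roc_curve(severe, ldh)
youden_j = tpr - fpr
best_threshold = thresholds[np.argmax(youden_j)]
print(f"AUC = {roc_auc_score(severe, ldh):.2f}; "
      f"threshold maximising Youden's J = {best_threshold:.0f} U/L")
```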
10.1101/2021.01.07.21249362 | Impaired sensory evidence accumulation and network function in Lewy body dementia | Deficits in attention underpin many of the cognitive and neuropsychiatric features of Lewy body dementia. These attention-related symptoms remain difficult to treat and there are many gaps in our understanding of their neurobiology. An improved understanding of attention-related impairments can be achieved via mathematical modelling approaches, which identify cognitive parameters to provide an intermediate level between observed behavioural data and its underlying neural correlate. Here, we apply this approach to identify the role of impaired sensory evidence accumulation in the attention deficits that characterise Lewy body dementia. In 31 people with Lewy body dementia (including 13 Parkinsons disease dementia and 18 dementia with Lewy bodies cases), 16 people with Alzheimers disease, and 23 healthy controls, we administered an attention task whilst they underwent functional 3T MRI. Using hierarchical Bayesian estimation of a drift diffusion model, we decomposed task performance into drift rate and decision boundary parameters. We tested the hypothesis that the drift rate - a measure of the quality of sensory evidence accumulation - is specifically impaired in Lewy body dementia, compared to Alzheimers disease. We further explored whether trial-by-trial variations in the drift rate related to activity within the default and dorsal attention networks, to determine whether altered activity in these networks was associated with slowed drift rates in Lewy body dementia. Our results revealed slower drift rates in the Lewy body dementia compared to the Alzheimers disease group, whereas the patient groups were equivalent for their decision boundaries. The patient groups were reduced relative to controls for both parameters. This highlights sensory evidence accumulation deficits as a key feature that distinguishes attention impairments in Lewy body dementia, consistent with impaired ability to efficiently process information from the environment to guide behaviour. We also found that the drift rate was strongly related to activity in the dorsal attention network across all three groups, whereas the Lewy body dementia group showed a divergent relationship relative to the Alzheimers disease and control groups for the default network, consistent with altered default network modulation being associated with impaired evidence accumulation. Together, our findings reveal impaired sensory evidence accumulation as a specific marker of attention problems in Lewy body dementia, which may relate to large-scale network abnormalities. By identifying impairments in a specific sub-process of attention, these findings will inform future exploratory and intervention studies that aim to understand and treat attention-related symptoms that are a key feature of Lewy body dementia. | neurology |
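The drift-diffusion decomposition above separates the quality of evidence accumulation (drift rate) from response caution (decision boundary). Below is a toy simulation, not the hierarchical Bayesian model fitted in the study, showing how a lower drift rate alone slows responses and lowers accuracy while the boundary is held fixed; all parameter values are assumptions.

```python
# Toy two-boundary drift-diffusion simulation; parameters are illustrative.
import numpy as np

def simulate_ddm(v, a, dt=0.001, noise_sd=1.0, n_trials=400, seed=2):
    """Return mean response time (s) and accuracy for drift v and boundary separation a."""
    rng = np.random.default_rng(seed)
    rts, correct = [], []
    for _ in range(n_trials):
        x, t = 0.0, 0.0                         # start midway between the boundaries
        while abs(x) < a / 2:
            x += v * dt + noise_sd * np.sqrt(dt) * rng.standard_normal()
            t += dt
        rts.append(t)
        correct.append(x > 0)                   # upper boundary = correct response
    return np.mean(rts), np.mean(correct)

print("higher drift rate :", simulate_ddm(v=2.0, a=2.0))
print("reduced drift rate:", simulate_ddm(v=1.0, a=2.0))
```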
10.1101/2021.01.09.21249317 | Intermittent radiotherapy as alternative treatment for recurrent high grade glioma: A modelling study based on longitudinal tumor measurements | Recurrent high grade glioma patients face a poor prognosis for which no curative treatment option currently exists. In contrast to prescribing high dose hypofractionated stereotactic radiotherapy (HFSRT, [≥] 6 Gyx5 in daily fractions) with debulking intent, we suggest a personalized treatment strategy to improve tumor control by delivering intermittent high dose treatment (iRT, [≥] 6 Gyx1 every six weeks). We performed a simulation analysis to compare HFSRT, iRT and iRT plus boost ([≥] 6 Gyx3 in daily fractions at time of progression) based on a mathematical model of tumour growth, radiation response and patient-specific evolution of resistance to additional treatments (pembrolizumab and bevacizumab). Model parameters were fitted from tumour growth curves of 16 patients enrolled in the phase 1 NCT02313272 trial that combined HFSRT with bevacizumab and pembrolizumab. Then, iRT +/-boost treatments were simulated and compared to HFSRT based on time to tumor regrowth. The modelling results demonstrated that iRT+boost(-boost) treatment was equal or superior to HFSRT in 15(11) out of 16 cases and that patients that remained responsive to pembrolizumab and bevacizumab would benefit most from iRT. Time to progression could be prolonged through the application of additional, intermittently delivered fractions. iRT hence provides a promising treatment option for recurrent high grade glioma patients. | oncology |
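A heavily simplified sketch of how such fractionation schedules can be compared in silico, assuming daily exponential regrowth and linear-quadratic cell kill per fraction. This is not the trial's fitted, patient-specific model (which also captured evolving resistance to pembrolizumab and bevacizumab), and every parameter below is arbitrary.

```python
import numpy as np

def simulate(schedule, growth_rate=0.02, alpha=0.05, beta=0.005, v0=1.0, days=260):
    """schedule maps treatment day -> dose (Gy); returns relative tumour volume per day."""
    v = np.empty(days)
    v[0] = v0
    for day in range(1, days):
        v[day] = v[day - 1] * np.exp(growth_rate)                 # exponential regrowth
        dose = schedule.get(day, 0.0)
        if dose:
            v[day] *= np.exp(-(alpha * dose + beta * dose ** 2))  # LQ survival fraction
    return v

hfsrt = {d: 6.0 for d in range(1, 6)}        # 6 Gy x 5 in daily fractions
irt = {d: 6.0 for d in range(1, 260, 42)}    # 6 Gy x 1 every six weeks
for name, sched in (("HFSRT", hfsrt), ("iRT", irt)):
    v = simulate(sched)
    print(f"{name}: nadir {v.min():.2f}, relative volume at day {len(v) - 1}: {v[-1]:.2f}")
```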
10.1101/2021.01.09.21249189 | Association of Visual Impairment with Brain Structure | ObjectiveTo investigate the association of visual impairment (VI) with brain structures in the UK Biobank Study.
MethodsThe UK Biobank Study is a large prospective study that recruited more than 500,000 participants aged 40-69 from 2006 to 2010 across the UK. Visual acuity (VA) worse than 0.3 LogMAR units (Snellen 20/40) was defined as VI. Structural magnetic resonance imaging (MRI) data were obtained using a 3.0-T MRI imager. Volumetric measures of five global brain volumes (total brain volume, total grey matter, total white matter, cerebrospinal fluid (CSF), brain stem) and the volumes of seven specific brain regions (thalamus, caudate nucleus, basal ganglia, pallidum, hippocampus, amygdala and nucleus accumbens) were included in the present analysis. Multivariable linear regression was used to investigate the association of VI with global and specific brain volumes.
ResultsA total of 8976 participants free of neurological disorders at baseline assessment were included in the present analysis. The prevalence of VI was 2.0% (n=181). After adjusting for a range of confounding factors, VI was significantly associated with decreased volumes of the total brain ({beta} = -0.12, 95% confidence interval (CI) -0.23 to 0.00, P = 0.049), thalamus ({beta} = -0.16, 95% CI -0.18 to -0.04, P = 0.010), caudate nucleus ({beta} = -0.14, 95% CI -0.27 to 0.00, P = 0.046), pallidum ({beta} = -0.15, 95% CI -0.27 to -0.02, P = 0.028) and amygdala ({beta} = -0.18, 95% CI -0.31 to -0.04, P = 0.012).
InterpretationWe found that VI is associated with a decrease in total brain volumes and the volumes of specific brain regions implicated in neurodegenerative diseases. | ophthalmology |
10.1101/2021.01.10.21249548 | Atmospheric PM2.5 before and after Lockdown in relation to COVID-19 Evolution and daily Viral Counts: Could Viral Natural Selection have occurred due to changes in the Airborne Pollutant PM2.5 acting as a Vector for SARS-CoV-2? | BackgroundGenes coding for SARS-CoV-2 have been detected on the microscopic airborne pollutant particulate matter, which has been suggested as a vector for COVID-19 transmission. Lockdown in China was associated with a significant reduction in pollution, including the particulate matter component, which coincided with the appearance of a viral mutant (Clade G) that steadily displaced the original Clade D after lockdown. The reason why Clade G developed a fitness advantage is as yet unknown. This paper examines the possible role of the airborne particulate matter PM2.5 as a selective pressure determining viral Clade predominance, further shedding light on the mode of SARS-CoV-2 transmission.
MethodsThe average levels of PM2.5 in a number of cities were obtained from the Air Quality Index (AQI), a real-time assessment of atmospheric pollution. Daily average PM2.5 levels were assessed between January 23rd and April 29th 2020, a window determined by the period for which viral counts in Beijing and other cities were available. Daily viral counts of Clades D and G were available from 12th February, as reported in the scientific literature published in August 2020. The cities chosen were Beijing, Sheffield, Nottingham, Sydney and Cambridge because of their substantially elevated viral counts compared to other cities. Cities, as opposed to vaster areas/nations, were chosen because PM2.5 levels vary across regions and countries.
ResultsFor the time period assessed, Beijing initially showed highly elevated mean PM2.5 levels of 155.8{micro}g/m3 (SD+/-73.6) during high viral counts, which fell to 82.1{micro}g/m3 (SD+/-44.9) (p<0.04) when the viral counts decreased. In all the other cities assessed, the pattern differed, with PM2.5 levels increasing significantly over the preceding baseline contemporaneously with the rise in viral counts. The changes in these cities PM2.5 levels were on average 31.5{micro}g/m3 before viral counts rose and 56.35{micro}g/m3 contemporaneous with the viral count rise. The average levels of PM2.5 in these cities started to decrease one week after lockdown, to 46{micro}g/m3 when measured over 2 weeks post-lockdown.
As regards the viral counts retrieved from Beijing, the latter part of the bell-shaped curve and a subsequent smaller curve of the viral count were available for evaluation. The average viral count for Clade D in Beijing was 11.1 (SD+/-13.5), and the mean viral count for Clade G was 13.8 (SD+/-9.2). Conversely, in all the other cities besides Beijing, the viral counts averaged 45.8 for Clade D and 161 for Clade G. The variation in viral counts between cities suggests the strong possibility of variation in the availability of sampling between cities.
The newer variant, Clade G, first appeared in the viral counts in mid-February in Beijing and later displaced Clade D as the dominant viral Clade. The appearance of Clade G coincided with the decreasing gradient of PM2.5 levels. A number of significant correlations were obtained between PM2.5 levels and the viral count in all the cities reviewed.
ConclusionCOVID-19 viral counts appear to increase concomitantly with increasing PM2.5 levels. Viral counts of both Clades correlated differentially with PM2.5 levels in all the cities assessed. The markedly elevated PM2.5 levels in Beijing correlated mainly with Clade D; however, Clade G began to appear as PM2.5 levels decreased, suggesting the beginnings of the initial SARS-CoV-2 Clade evolution. Clade G, the newer variant, was able to flourish at lower levels of PM2.5 than Clade D. Clade G may possibly have utilized other sources of particulate matter as a viral vector, such as that derived from tobacco smoking, given that 66% of Chinese males are smokers and 70% of the Chinese non-smoking population is exposed to second-hand smoke. | infectious diseases |
10.1101/2021.01.10.21249151 | Robotic RNA extraction for SARS-CoV-2 surveillance using saliva samples | Saliva is an attractive specimen type for asymptomatic surveillance of COVID-19 in large populations due to its ease of collection and its demonstrated utility for detecting RNA from SARS-CoV-2. Multiple saliva-based viral detection protocols use a direct-to-RT-qPCR approach that eliminates nucleic acid extraction but can reduce viral RNA detection sensitivity. To improve test sensitivity while maintaining speed, we developed a robotic nucleic acid extraction method for detecting SARS-CoV-2 RNA in saliva samples with high throughput. Using this assay, the Free Asymptomatic Saliva Testing (IGI-FAST) research study on the UC Berkeley campus conducted 11,971 tests on supervised self-collected saliva samples and identified rare positive specimens containing SARS-CoV-2 RNA during a time of low infection prevalence. In an attempt to increase testing capacity, we further adapted our robotic extraction assay to process pooled saliva samples. We also benchmarked our assay against the gold standard, nasopharyngeal swab specimens. Finally, we designed and validated a RT-qPCR test suitable for saliva self-collection. These results establish a robotic extraction-based procedure for rapid PCR-based saliva testing that is suitable for samples from both symptomatic and asymptomatic individuals. | infectious diseases |
10.1101/2021.01.10.21249528 | Distinct temporal characteristics of circulating alveolar epithelial and endothelial injury markers in ARDS with COVID-19: a preliminary retrospective report | The time course and specific contributions of alveolar epithelial and endothelial injury to the pathogenesis of acute respiratory distress syndrome (ARDS) with coronavirus disease (COVID-19) remain unclear. Here, we evaluated the characteristics of circulating markers of alveolar epithelial and endothelial injury in serum samples from eleven ARDS patients and ten non-ARDS patients, all with COVID-19. Our results indicate that alveolar epithelial injury at the very early disease stage and endothelial injury that continues to worsen during the later disease stage appear to be the hallmarks of ARDS with COVID-19. | intensive care and critical care medicine |
10.1101/2021.01.06.20249030 | Convergence of Comorbidity and COVID-19 Infection to Fatality: An Investigation Based on Health Assessment and Vaccination among Older Adults in Kerala | ObjectiveTo investigate the impact of age, comorbidity, and vaccination in the fatality of older COVID-19 patients in the state of Kerala, India, based on their comorbidity and vaccination status.
MethodsThis is a cross-sectional study adopting a mixed-methods approach, conducted among the older population in Kerala. To study the health profile, 405 older people were surveyed and 102 people were interviewed in depth at their households between June and November 2020. The results of the study were triangulated with elderly COVID-19 fatality data available from the citizen-science dashboards of the research team and the Department of Health, Kerala. Vaccination data were retrieved from cowin.gov.in to study its impact. The data were analysed using IBM SPSS version 22.0.
ResultsAge is a predictor of COVID-19 fatality. Diabetes, hypertension, heart diseases, CKD and COPD are significant predictors of elderly COVID-19 fatality. The current comorbidity profile of the total older population matches the comorbidities of the elderly COVID-19 death cases. Vaccination impacted COVID-19 mortality after 65 percent of the elderly had received a first dose.
ConclusionsAge and comorbidities can predict potential fatality among older COVID-19 patients. Timely and accurate health data and better knowledge of high-risk factors such as comorbidity can easily guide the healthcare system and authorities to efficient prevention and treatment methodologies. Knowledge on prevailing NCDs can drive early preparedness before it converges with an epidemic like the present zoonotic disease. Priority must be given for elderly vaccination to bring down the mortality rates. | public and global health |
10.1101/2021.01.10.20249079 | Altered kidney function and acute kidney damage markers predict survival outcomes of COVID-19 patients: A prospective pilot study. | BackgroundA central role in the pathogenesis of infection with severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2), the cause of coronavirus disease 2019 (COVID-19), is attributed to angiotensin-converting enzyme 2 (ACE-2). Involvement of the ACE-2-expressing respiratory system is the main clinical manifestation of the infection. However, literature on the association between SARS-CoV-2 infection and the kidney, which expresses ACE-2 at even higher levels, is very limited. In this study, we primarily aimed to investigate whether kidney injury occurs during the course of SARS-CoV-2 infection. The predictive value of kidney injury for survival was also determined.
MethodsA total of 47 participants who met the inclusion criteria were included in the study. The participants were classified as COVID-19 patients before treatment, COVID-19 patients after treatment, COVID-19 patients under treatment in the ICU, and controls. The parameters compared statistically between the groups were comorbidity, serum creatinine and cystatin C levels, CKD-EPI eGFR levels, KIM-1 and NGAL levels, and urine KIM-1/creatinine and NGAL/creatinine ratios. The associations between covariates, including kidney disease indicators, and death from COVID-19 were examined using Cox proportional hazards regression analysis.
ResultsSerum creatinine and cystatin C levels, urine KIM-1/creatinine levels, and CKD-EPI, CKD-EPI cystatin C and CKD-EPI creatinine-cystatin C eGFR levels differed significantly between the groups. These differences reflected more altered kidney function and increased acute kidney damage in COVID-19 patients before treatment and in those under treatment in the ICU. Additionally, the incidences of comorbidity and of proteinuria on urine analysis were higher in the group of COVID-19 patients under treatment in the ICU. The urine KIM-1/creatinine ratio and proteinuria were associated with COVID-19-specific death.
ConclusionsWe found that COVID-19 patients under treatment in the ICU exhibited markedly higher levels of serum cystatin C and higher urine KIM-1/creatinine and urine NGAL/creatinine ratios. These results describe acute kidney damage in COVID-19 using molecular kidney damage markers for the first time in the literature. Lowered CKD-EPI, CKD-EPI cystatin C and CKD-EPI creatinine-cystatin C eGFR levels were also observed in these patients. The urine KIM-1/creatinine ratio and proteinuria were associated with COVID-19-specific death. In this regard, kidney function and kidney damage markers must not be ignored in COVID-19 patients, and their serial monitoring should be considered. | infectious diseases |
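The survival association reported above (urine KIM-1/creatinine and proteinuria versus COVID-19-specific death) is the kind of result a Cox proportional-hazards fit provides. A hedged sketch on synthetic data with assumed column names, using the lifelines package rather than whatever software the authors used:

```python
# Synthetic illustration of a Cox proportional-hazards fit; not the study data.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 47
df = pd.DataFrame({
    "time_days": rng.exponential(30, n).round(1),     # follow-up time (hypothetical)
    "died": rng.integers(0, 2, n),                    # event indicator
    "kim1_creatinine": rng.lognormal(0.0, 0.8, n),    # urine KIM-1/creatinine ratio
    "proteinuria": rng.integers(0, 2, n),             # present / absent
})
cph = CoxPHFitter()
cph.fit(df, duration_col="time_days", event_col="died")
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%", "p"]])
```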
10.1101/2021.01.09.21249263 | Pragmatic, open-label, single-center, randomized, phase II clinical trial to evaluate the efficacy and safety of methylprednisolone pulses and tacrolimus in patients with severe pneumonia secondary to COVID-19: the TACROVID trial protocol | IntroductionSome COVID-19 patients evolve to severe lung injury and a systemic hyperinflammatory syndrome triggered by both the coronavirus infection and the subsequent host immune response. Accordingly, the use of immunomodulatory agents has been suggested but still remains controversial. Our working hypothesis is that methylprednisolone pulses and tacrolimus may be an effective and safe drug combination for treating severe COVID-19 patients.
Methods and analysisTACROVID is a randomized, open-label, single-center, phase II trial to evaluate the efficacy and safety of methylprednisolone pulses and tacrolimus plus standard of care (SoC) versus SoC alone, in patients at advanced stage of COVID-19 disease with lung injury and systemic hyperinflammatory response. Patients are randomly assigned (1:1) to one of two arms (42 patients in each group). The primary aim is to assess the time to clinical stability after initiating randomization. Clinical stability is defined as body temperature [≤] 37.5{degrees}C, and PaO2/FiO2 > 400 and/or SatO2/FiO2 > 300, and respiratory rate [≤]24 rpm; for 48 consecutive hours.
DiscussionMethylprednisolone and tacrolimus might be beneficial for treating COVID-19 patients progressing to severe pulmonary failure and systemic hyperinflammatory syndrome. The rationale for their use is the fast effect of methylprednisolone pulses and the ability of tacrolimus to inhibit both SARS-CoV-2 replication and the secondary cytokine storm. Interestingly, both drugs are low-cost and can be manufactured on a large scale; thus, if effective and safe, a large number of patients could be treated in developed and developing countries.
Trial registration numberNCT04341038 / EudraCT: 2020-001445-39 | infectious diseases |
10.1101/2021.01.09.21249494 | Assessments of heavy lift UAV quadcopter drone to support COVID 19 vaccine cold chain delivery for indigenous people in remote areas in South East Asia | Vaccine delivery is one important aspect that needs to be strengthened within health systems. One of the main challenges in COVID 19 vaccine delivery is how to cover indigenous populations in remote and isolated forests in South East Asia. Another issue in COVID 19 cold chain delivery is the requirement for a carrier that can maintain a suitable storage temperature. Consequently, COVID 19 vaccines must be delivered in heavy vaccine cooler boxes, which demands a delivery system with heavy-lift capacity. Here, this study proposes and assesses the potential use of a heavy-lift UAV quadcopter to expand COVID 19 vaccine delivery to indigenous people living in villages impeded by rugged terrain. The landscape and terrain analysis shows that access to the villages is dominated by 15%-45% slopes and the only available access is a 1.5 m wide trail. Transporting 500 vials in a 10 kg carrier along the 2 km trail requires 2 persons walking for 1 hour. Using a drone, a straight-line route with a length of 1.5 km can be developed. At least 3 drone types are commercially available that can lift a 10 kg load, along with several drones with payload capacities below 10 kg. For carrying 100 vials to a village using drones, the estimated required delivery time is 1.23-1.38 minutes. Delivery times of around 1.57-1.66 minutes are required to transport 250 vials. For carrying the maximum full load of 500 vials, equal to a 10 kg load, a drone requires an average delivery time of 3.13 minutes. This required drone delivery time is significantly below the nearly 1 hour required on foot. Drones are limited by flight operational times, yet all delivery times required of the drones assessed in this study were below their operational times. The lowest drone operational time was 16 minutes, which is still higher than the time required for a drone to deliver the vaccine. Considering this effectiveness and in anticipation of mass vaccination, a UAV quadcopter drone is a feasible option to support COVID 19 vaccine delivery to reach indigenous people in isolated areas. | health systems and quality improvement |
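A back-of-the-envelope check of the delivery-time comparison above; the cruise speeds and handling overhead are assumptions for illustration, not figures taken from the study:

```python
def delivery_time_min(distance_km: float, cruise_speed_kmh: float,
                      handling_overhead_min: float = 1.0) -> float:
    """One-way flight time plus loading/unloading overhead, in minutes."""
    return distance_km / cruise_speed_kmh * 60.0 + handling_overhead_min

for speed in (40, 60, 80):   # plausible quadcopter cruise speeds in km/h (assumed)
    t = delivery_time_min(1.5, speed)
    print(f"{speed} km/h over 1.5 km -> {t:.1f} min (versus ~60 min on foot over 2 km)")
```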
10.1101/2021.01.09.21249498 | Risk factors for bacterial infections in patients with moderate to severe COVID-19: A case control study | ObjectiveBacterial infections are known to complicate respiratory viral infections and are associated with adverse outcomes in COVID-19 patients. A case control study was conducted to determine risk factors for bacterial infections, where cases were defined as moderate to severe/critical COVID-19 patients with bacterial infection and those without bacterial infection were included as controls. Logistic regression analysis was performed.
ResultsOut of a total of 50 cases and 50 controls, a greater proportion of cases had severe or critical disease at presentation compared to controls (80% vs 30%, p<0.001). Hospital-acquired pneumonia (72%) and Gram negative organisms (82%) were predominant. Overall antibiotic utilization was 82%, and was 64% in patients who had no evidence of bacterial infection. The median length of stay was significantly longer among cases compared to controls (12.5 versus 7.5 days) (p=0.001). The overall mortality was 30%, with a comparatively higher proportion of deaths among cases (42% versus 18%) (p=0.009). Severe or critical COVID-19 at presentation (AOR 4.42; 95% CI 1.63-11.9) and use of steroids (AOR 4.60; 95% CI 1.24-17.05) were independently associated with risk of bacterial infections. These findings have implications for antibiotic stewardship, as antibiotics can be reserved for those at higher risk of bacterial superinfections. | infectious diseases |
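The adjusted odds ratios above come from a multivariable logistic model. A minimal sketch on simulated data with assumed variable names (statsmodels here, which may differ from the authors' software):

```python
# Simulated data only; mirrors the structure of an adjusted odds-ratio analysis.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 100
df = pd.DataFrame({
    "bacterial_infection": rng.integers(0, 2, n),
    "severe_at_presentation": rng.integers(0, 2, n),
    "steroids": rng.integers(0, 2, n),
    "age": rng.normal(55, 12, n).round(),
})
fit = smf.logit("bacterial_infection ~ severe_at_presentation + steroids + age",
                data=df).fit(disp=False)
aor = np.exp(fit.params).rename("AOR")
ci = np.exp(fit.conf_int()).rename(columns={0: "2.5%", 1: "97.5%"})
print(pd.concat([aor, ci], axis=1))
```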
10.1101/2021.01.05.21249237 | Enisamium is an inhibitor of the SARS-CoV-2 RNA polymerase and shows improvement of recovery in COVID-19 patients in an interim analysis of a clinical trial | Pandemic SARS-CoV-2 causes a mild to severe respiratory disease called Coronavirus Disease 2019 (COVID-19). Control of SARS-CoV-2 spread will depend on vaccine-induced or naturally acquired protective herd immunity. Until then, antiviral strategies are needed to manage COVID-19, but approved antiviral treatments, such as remdesivir, can only be delivered intravenously. Enisamium (laboratory code FAV00A, trade name Amizon(R)) is an orally active inhibitor of influenza A and B viruses in cell culture and is clinically approved in countries of the Commonwealth of Independent States. Here we show that enisamium can inhibit SARS-CoV-2 infections in NHBE and Caco-2 cells. In vitro, the previously identified enisamium metabolite VR17-04 directly inhibits the activity of the SARS-CoV-2 RNA polymerase. Docking and molecular dynamics simulations suggest that VR17-04 prevents GTP and UTP incorporation. To confirm enisamiums antiviral properties, we conducted a double-blind, randomized, placebo-controlled trial in adult, hospitalized COVID-19 patients who needed medical care either with or without supplementary oxygen. Patients received either enisamium (500 mg per dose) or placebo for 7 days. A pre-planned interim analysis showed that, in the subgroup of patients needing supplementary oxygen (n = 77), the enisamium group had a mean recovery time of 11.1 days, compared to 13.9 days for the placebo group (log-rank test; p=0.0259). No significant difference was found for all patients (n = 373) or for those only needing medical care without oxygen (n = 296). These results thus suggest that enisamium is an inhibitor of SARS-CoV-2 RNA synthesis and that enisamium treatment shortens the time to recovery for COVID-19 patients needing oxygen.
Significance statementSARS-CoV-2 is the causative agent of COVID-19. Although vaccines are now becoming available to prevent SARS-CoV-2 spread, the development of antivirals remains necessary for treating current COVID-19 patients and combating future coronavirus outbreaks. Here, we report that enisamium, which can be administered orally, can prevent SARS-CoV-2 replication and that its metabolite VR17-04 can inhibit the SARS-CoV-2 RNA polymerase in vitro. Moreover, we find that COVID-19 patients requiring supplementary oxygen recover more quickly when treated with enisamium than with a placebo. Enisamium may therefore be an accessible treatment for COVID-19 patients. | infectious diseases |
10.1101/2021.01.06.21249363 | Geographical and temporal variation in reduction of malaria infection among children under five years of age throughout Nigeria | BackgroundGlobal progress in reducing malaria has stalled since 2015. Analysis of the situation is particularly needed in Nigeria, the country with by far the largest share of the burden, where approximately a quarter of all cases in the world are estimated to occur.
MethodsWe analysed data from three nationwide surveys (Malaria Indicator Surveys in 2010 and 2015, and a National Demographic and Health Survey in 2018), with malaria parasite prevalence in children under five years of age determined by sampling from all 36 states of Nigeria, and blood slide microscopy performed in the same accredited laboratory for all samples. Changes over time were evaluated by calculating prevalence ratio (PR) values with 95% confidence intervals (CI) for each state, together with Mantel-Haenszel adjusted prevalence ratios (PRadj) for each of the six major geopolitical zones of the country.
ResultsBetween 2010 and 2018 there were significant reductions in parasite prevalence in 25 states, but not in the remaining 11 states. Prevalence decreased most in southern zones of the country (South West PRadj = 0.53; South East PRadj = 0.59; South South PRadj = 0.51) and the North Central zone (PRadj = 0.36). Changes in the north were less marked, but were significant and indicated overall reductions by more than 20% (North West PRadj = 0.74; North East PRadj = 0.76). Changes in the south occurred mostly between 2010 and 2015, whereas those in the north were more gradual and most continued after 2015. Recent changes were not correlated with survey-reported variation in use of preventive measures.
ConclusionReductions in malaria infection in children under five have occurred in most individual states in Nigeria since 2010, but substantial geographical variation in the timing and extent indicate challenges to be overcome to enable global malaria reduction. | infectious diseases |
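The state-level comparisons above are built on prevalence ratios with 95% confidence intervals. A minimal sketch of that calculation from two survey counts (the counts below are made up for illustration; the Mantel-Haenszel pooling across states is not shown):

```python
import math

def prevalence_ratio(pos_ref, n_ref, pos_new, n_new):
    """PR of the later survey versus the reference, with a Wald 95% CI on the log scale."""
    p_ref, p_new = pos_ref / n_ref, pos_new / n_new
    pr = p_new / p_ref
    se_log = math.sqrt((1 - p_ref) / pos_ref + (1 - p_new) / pos_new)
    lo, hi = (math.exp(math.log(pr) + s * 1.96 * se_log) for s in (-1, 1))
    return pr, lo, hi

# hypothetical counts for one state: 2010 survey vs 2018 survey
pr, lo, hi = prevalence_ratio(pos_ref=320, n_ref=800, pos_new=180, n_new=850)
print(f"PR = {pr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```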
10.1101/2021.01.04.21249249 | Role of pollution and weather indicators in the COVID-19 outbreak: A brief study on Delhi, India | The present study examines the impact of environmental pollution indicators and weather indicators on the COVID-19 outbreak in the capital city of India. In this study, we hypothesize that certain weather conditions, combined with an atmosphere having a high content of air pollutants, might impact the transmission of COVID-19 in addition to direct human-to-human diffusion. The Kendall and Spearman rank correlation tests were chosen as the empirical methodology for the statistical analysis. In this regard, we compiled a daily dataset of COVID-19 cases (Confirmed, Recovered, Deceased), weather indicators (temperature and relative humidity) and pollution indicators (PM 2.5, PM 10, NO2, CO, and SO2) in the Delhi state of India. The effects of each parameter within three time frames of the same day, 7 days ago, and 14 days ago are evaluated. This study reveals a significant correlation between the transmission of COVID-19 outbreaks and atmospheric pollutants in combination with specific climatic conditions. The findings of this research will help policymakers to identify risky geographic areas and enforce timely preventive measures. | infectious diseases |
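A small sketch of the lagged rank-correlation analysis described above, on synthetic series; the series, the lag convention and the variable names are assumptions:

```python
import numpy as np
from scipy.stats import kendalltau, spearmanr

rng = np.random.default_rng(5)
days = 120
pm25 = 150 + 60 * np.sin(np.arange(days) / 9.0) + rng.normal(0, 15, days)     # synthetic PM2.5
cases = np.maximum(0, 40 + 0.3 * np.roll(pm25, 7) + rng.normal(0, 10, days))  # synthetic daily cases

for lag in (0, 7, 14):
    x = pm25[: days - lag] if lag else pm25   # pollutant measured `lag` days earlier
    y = cases[lag:]
    tau, p_tau = kendalltau(x, y)
    rho, p_rho = spearmanr(x, y)
    print(f"lag {lag:2d} d: tau = {tau:+.2f} (p = {p_tau:.3f}), rho = {rho:+.2f} (p = {p_rho:.3f})")
```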
10.1101/2021.01.04.20248661 | Association of homelessness with COVID-19 positivity among individuals visiting a testing center | We conducted a chart audit of all patients attending an inner-city COVID-19 testing centre in Toronto, Canada between March and April 2020. Of the 2050 unique individuals tested, 214 (10.4%) were homeless. People experiencing homelessness were more likely to test positive for COVID-19 compared to those not experiencing homelessness even after adjustment for age, sex, and the presence of any medical co-morbidity (15.4% vs. 6.7%, p<0.001; OR 2.41, 95% CI 1.51 to 3.76, p<0.001). | infectious diseases |
10.1101/2021.01.10.21249534 | Reverse causal effect of atrial fibrillation on 17 site-specific cancer risk: A Mendelian randomization study | BackgroundBidirectional association between atrial fibrillation (AF) and cancer was reported by observational investigations. Additional study is warranted to investigate the causal effects of AF on risk of cancer.
MethodsThis study was a summary-level Mendelian randomization (MR) analysis. The genetic instrument for AF was developed from a genome-wide association study (GWAS) meta-analysis of AF comprising 537,409 individuals of European ancestry, including 55,144 AF cases. Outcome data for the risk of 17 site-specific cancers were taken from a previous GWAS meta-analysis of the UK Biobank (48,961/359,825 cases/controls) and Genetic Epidemiology Research on Adult Health and Aging (16,001/50,525 cases/controls) cohorts of European ancestry individuals. The inverse-variance weighted method was the main MR method, supported by pleiotropy-robust sensitivity analyses including MR-Egger regression and the penalized weighted median method.
ResultsThe causal estimates indicated that AF was causally linked to a higher risk of cancers of the lung, breast, cervix and endometrium, and of melanoma. The MR-Egger test for directional pleiotropy indicated no evidence of pleiotropy in the identified causal estimates, and MR-Egger regression and median-based methods provided similarly significant findings. On the other hand, genetic predisposition to AF was significantly associated with a lower risk of esophageal/gastric cancer, but the possibility of directional pleiotropy remained for this association.
ConclusionsAF is a causal factor for certain types of cancer. Appropriate cancer screening should be suggested in clinical guidelines for AF patients. Future trials are necessary to confirm whether appropriate management of AF may reduce the risk of cancer, which is a major cause of death in AF patients. | cardiovascular medicine |
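The main estimator above is the inverse-variance weighted (IVW) combination of per-SNP Wald ratios. A minimal fixed-effect sketch with invented SNP-level summary statistics (the real analysis used many more instruments plus MR-Egger and weighted-median sensitivity methods):

```python
import numpy as np

def ivw_estimate(beta_exposure, beta_outcome, se_outcome):
    """Fixed-effect IVW causal estimate and its standard error."""
    bx, by, se = map(np.asarray, (beta_exposure, beta_outcome, se_outcome))
    w = bx ** 2 / se ** 2                 # weights for the per-SNP Wald ratios
    ratio = by / bx
    est = np.sum(w * ratio) / np.sum(w)
    return est, 1.0 / np.sqrt(np.sum(w))

bx = [0.12, 0.09, 0.15, 0.07, 0.11]        # SNP-AF betas (hypothetical)
by = [0.030, 0.018, 0.040, 0.012, 0.026]   # SNP-cancer betas (hypothetical)
se = [0.010, 0.009, 0.012, 0.008, 0.011]   # SEs of the outcome betas (hypothetical)
est, se_est = ivw_estimate(bx, by, se)
print(f"IVW OR = {np.exp(est):.2f} (95% CI {np.exp(est - 1.96 * se_est):.2f}-"
      f"{np.exp(est + 1.96 * se_est):.2f})")
```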
10.1101/2021.01.09.21249490 | Nutritional Status in Patients with Acute heart failure: A systematic review and meta-analysis with trial sequential analysis. | A critical review of the prognostic impact of malnutrition in patients admitted with acute heart failure (AHF) has never been performed. We systematically reviewed the observational epidemiology literature to determine the all-cause mortality (ACM) in patients with acute heart failure who were undernourished or at risk of malnutrition, through a meta-analysis of observational studies.
A systematic search of PubMed, Scopus, and Web of Science was performed for articles, published before December 2019, reporting an association between malnutrition and mortality in patients with acute heart failure. We included original data from observational cohort studies of patients with acute heart failure at baseline whose nutritional state was evaluated at admission using screening or assessment tools. The outcome of interest was mortality, independent of the timeframe for follow-up. The characteristics of the included studies were collected, and data quality was assessed using the Newcastle-Ottawa Quality Assessment Scale. The hazard ratios (HRs) and corresponding 95% confidence intervals (CIs) were extracted. For the meta-analysis, a random-effects model was considered.
Heterogeneity between studies was assessed using the Cochran Q statistic and the I2 statistic. Subgroup analyses were used to identify the source of heterogeneity. A sensitivity analysis was performed to assess the influence of individual data sets on the pooled HR. Publication bias was detected using the Doi plot and the Luis Furuya-Kanamori asymmetry index (LFK index). The influence of potential publication bias on results was explored by using the trim-and-fill procedure. To assess the risk of random errors, trial sequential analysis (TSA) was performed.
Seven studies were eligible for review and meta-analysis. There were 9053 participants and over 1536 events occurred. The prevalence of malnutrition varied from 33% to 78.8%. Mean follow-up varied between 189 and 951 days. ACM rates varied between 7% and 42.6%. Nutritional status was significantly associated with mortality in patients with AHF (pooled HR=1.15; 95% CI [1.08-1.23]). Considerable between-study heterogeneity was observed (I2=83%, P=0.001). Heterogeneity was partially explained by the different tools used to screen malnutrition risk and the follow-up durations of the included studies. There was evidence of major publication bias regarding the association of malnutrition risk with ACM. The obtained LFK index was 6.12, suggesting major asymmetry. The recalculated pooled HR incorporating the hypothetical missing studies was 1.15; 95% CI (1.08-1.22). However, the accumulated number of participants has not yet reached the required information size, so the trial sequential monitoring boundary remains inconclusive.
This first meta-analysis of the association between nutritional status and all-cause mortality in patients with acute heart failure indicated that malnutrition risk in patients with acute heart failure was associated with increased all-cause mortality. The prognostic impact of malnutrition is real despite heterogeneity in the tools and cut-offs used to define malnutrition and in mean follow-up duration. This review underlines the peremptory need for multicenter studies, for uniform guidelines for assessing nutritional status, and for reporting guidelines for prognostic studies in the acute cardiovascular setting. Better nutritional practice to improve patient care is emphasized in international and national health care guidelines. | cardiovascular medicine |
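The pooling step described above (random-effects model, Cochran Q, I2) can be sketched with a DerSimonian-Laird estimate on the log-HR scale; the seven hazard ratios below are invented placeholders, not the included studies' results:

```python
import numpy as np

hr = np.array([1.10, 1.25, 1.08, 1.40, 1.05, 1.18, 1.12])       # hypothetical study HRs
ci_lo = np.array([1.01, 1.05, 0.98, 1.10, 0.95, 1.02, 1.00])
ci_hi = np.array([1.20, 1.49, 1.19, 1.78, 1.16, 1.36, 1.25])

y = np.log(hr)
se = (np.log(ci_hi) - np.log(ci_lo)) / (2 * 1.96)               # SE recovered from the 95% CI
w = 1 / se ** 2

fixed = np.sum(w * y) / np.sum(w)
q = np.sum(w * (y - fixed) ** 2)                                # Cochran's Q
dof = len(y) - 1
i2 = max(0.0, (q - dof) / q) * 100
tau2 = max(0.0, (q - dof) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))  # DerSimonian-Laird

w_re = 1 / (se ** 2 + tau2)
pooled, se_pooled = np.sum(w_re * y) / np.sum(w_re), 1 / np.sqrt(np.sum(w_re))
print(f"pooled HR = {np.exp(pooled):.2f} (95% CI {np.exp(pooled - 1.96 * se_pooled):.2f}-"
      f"{np.exp(pooled + 1.96 * se_pooled):.2f}), I2 = {i2:.0f}%")
```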
10.1101/2021.01.09.21249456 | The Ecological Structure of Mosquito Population Seasonal Dynamics | Understanding the temporal dynamics of mosquito populations underlying malaria transmission is key to optimising control strategies. We collate mosquito time-series catch data spanning 40 years and 117 locations across India to understand the factors driving these dynamics. Our analyses reveal pronounced variation in dynamics across locations and between species. Many mosquito populations lacked the often-assumed positive relationship with rainfall, instead displaying patterns of abundance that were only weakly or even negatively correlated with precipitation and highlighting the role of temperature, proximity to perennial bodies of water and patterns of land use in shaping the dynamics and seasonality of mosquito populations. We show that these diverse dynamics can be clustered into "dynamical archetypes", each characterised by distinct temporal properties and driven by a largely unique set of environmental factors. These results highlight that a complex interplay of factors, rather than rainfall alone, shape the timing and extent of mosquito population seasonality. | epidemiology |
10.1101/2021.01.09.21249489 | Impact of COVID-19 pandemic on use of Pediatric Emergency Health Services in a Tertiary Care Pediatric Hospital in North India | ObjectiveTo compare Pediatric Emergency attendance before the COVID-19 pandemic with that during the COVID-19 pandemic, and to study changes in the profiles of patients attending the Pediatric Emergency Department during the COVID-19 pandemic.
MethodsWe conducted a retrospective cross-sectional observational study and collected data from the Medical Record Section during the COVID-19 pandemic from January to June 2020, and compared it with data from similar months in 2019. The data collected were analyzed to determine the impact of COVID-19 on the use of pediatric emergency health services with respect to patient attendance, age and clinical profile before and during COVID-19 in a tertiary care hospital in New Delhi.
ResultsWe observed a 43% decline in PED visits, which increased to 75% during the period of lockdown (p value = 0.005). There was a significant decrease in children of the 1-5 year age group attending the PED. The mortality rate during lockdown rose to nearly 3 times the average monthly mortality.
ConclusionsWhile children might not have been directly affected by the COVID-19 pandemic, the fear of COVID-19 and the measures taken to control the pandemic have affected the health-seeking behavior of patients to an extent that indirectly caused more damage than anticipated. | pediatrics |
10.1101/2021.01.09.21249515 | Impact of Residential Neighborhood and Race/Ethnicity on Outcomes of Hospitalized Patients with COVID-19 in the Bronx | The socially vulnerable have been most affected by the COVID-19 pandemic, as in the aftermath of any major disaster. Racial and social minorities are experiencing a disproportionate burden of morbidity and mortality.
The aim of this study was to evaluate the impact of residential location/community and race/ethnicity on outcomes of COVID-19 infection among hospitalized patients within the Bronx. This was a single-center retrospective observational cohort study that included SARS-CoV-2-positive adult residents of the Bronx (stratified as residents of the South Bronx vs the rest of the Bronx) hospitalized between March and May 2020. Data extracted from hospital electronic medical records included residential addresses, race, comorbidities, and insurance details. Comorbidity burden and other clinical and laboratory details were also assessed to determine their correlation with COVID-19 severity of illness and the outcomes of mortality and length of stay.
As expected, the COVID-19 pandemic differentially affected outcomes in those in the more socially disadvantaged area of the South Bronx versus the rest of the Bronx borough. Residents of the South Bronx had a significantly higher comorbidity burden and were more likely to rely on public insurance for access to medical care in comparison with the remainder of the Bronx. Interestingly, in the patient population studied, there was no observed difference in 30-day mortality by race/ethnicity among those infected with COVID-19, in spite of the increased disease burden observed.
This adds an interesting perspective to the current literature, and highlights the need to address the social/economic factors contributing to health access disparity to reduce the adverse impact of COVID-19 in these communities. | public and global health |
10.1101/2021.01.09.21249505 | Wastewater Virus Detection Complements Clinical COVID-19 Testing to Limit Spread of Infection at Kenyon College | In-person college instruction during the 2020 pandemic required effective and economical monitoring of COVID-19 prevalence. Kenyon College and the Village of Gambier conducted measurement of SARS-CoV-2 RNA from the village wastewater plant and from an on-campus sewer line. Wastewater RNA detection revealed virus prevalence leading to individual testing and case identification. Wastewater surveillance also showed when case rates had subsided, thus limiting the need for individual clinical testing. Overall, wastewater virus surveillance allows more targeted use of individual testing and increases community confidence in student population management. | public and global health |
10.1101/2021.01.10.21249533 | Effects of surgical masks on droplet and aerosol dispersion under various oxygen delivery modalities | RationaleAerosol dispersion under various oxygen delivery modalities, including high flow nasal cannula, is a critical concern for healthcare workers who treat acute hypoxemic respiratory failure during the coronavirus disease 2019 pandemic. The effects of surgical masks on droplet and aerosol dispersion under these oxygen delivery modalities have not yet been clarified.
ObjectivesTo visualize and quantify dispersion particles under various oxygen delivery modalities and examine the protective effect of surgical masks on particle dispersion.
MethodsThree and five healthy men were enrolled for video recording and quantification of particles, respectively. Various oxygen delivery modalities including high flow nasal cannula were used in this study. Particle dispersions during rest breathing, speaking, and coughing were recorded and automatically counted in each condition and were evaluated with or without surgical masks.
Measurements and Main ResultsCoughing led to the maximum amount and distance of particle dispersion, regardless of modality. Droplet dispersion was not visually increased by the oxygen delivery modalities compared to breathing at room air. With surgical masks over the nasal cannula or high-flow nasal cannula, droplet dispersion was barely visible. Oxygen modalities did not increase the particle dispersion counts regardless of breathing patterns. Wearing surgical masks significantly decreased particle dispersion in all modalities while speaking and coughing, regardless of particle size, and reduction rates were approximately 95% and 80-90% for larger (> 5 {micro}m) and smaller (> 0.5 {micro}m) particles, respectively.
ConclusionsA surgical mask over a high flow nasal cannula may be safely used for acute hypoxemic respiratory failure, including in coronavirus disease 2019 patients.
Subject Category List4.13 Ventilation: Non-Invasive/Long-Term/Weaning
*This article has an online data supplement, which is accessible from this issues table of content online at www.atsjournals.org. | respiratory medicine |
10.1101/2021.01.09.21249508 | Exploration of interethnic variation in the ibuprofen metabolizing enzyme CYP2C9: a genetic-based cautionary guide for treatment of COVID-19 symptoms | Coronavirus disease 2019 (COVID-19) is a rapidly spreading infectious illness that causes a debilitating respiratory syndrome. While non-steroidal anti-inflammatory drugs (NSAIDs) may be prescribed for the management of pain and fever, there was early controversy over the use of ibuprofen for symptomatic treatment of COVID-19. The P450 enzyme CYP2C9 is known to be involved in the metabolism of NSAIDs. Although no population-genetics study has yet been conducted in patients with COVID-19, there are plausible mechanisms by which genetic determinants may play a role in adverse drug reactions (ADRs). In this work, we adjusted expected CYP2C9 phenotype frequencies using ancestry-dependent demographic models to estimate differences in drug response and in toxicity events associated with ibuprofen treatment. A cohort of 101 Jordanian Arab samples was retrospectively selected and genotyped using the Affymetrix DMET Plus Premier Package, and analyzed within the context of over 100,000 global subjects from 417 published reports. European populations are 7.2x more likely to show impaired ibuprofen metabolism than Sub-Saharan African populations, and 4.5x more likely than populations of East Asian ancestry. Hence, a proactive assessment of the most likely gene-drug candidates will lead to more effective treatments and a better understanding of the role of pharmacogenetics in COVID-19. | genetic and genomic medicine |
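The abstract above describes adjusting expected metabolizer-phenotype frequencies from ancestry-specific allele frequencies. The sketch below is only an illustration of that kind of calculation under Hardy-Weinberg assumptions, not the study's actual pipeline; the allele frequencies, activity scores, and phenotype cutoffs are placeholder values, not figures taken from the paper.

```python
# Illustrative sketch: expected CYP2C9 metabolizer-phenotype frequencies from
# population allele frequencies under Hardy-Weinberg equilibrium.
# All numeric inputs below are hypothetical placeholders.
from itertools import combinations_with_replacement

def phenotype_frequencies(allele_freqs, activity_scores):
    """allele_freqs: {allele: frequency}; activity_scores: {allele: score}.
    Returns the expected population frequency of each phenotype class."""
    phenotypes = {}
    for a1, a2 in combinations_with_replacement(allele_freqs, 2):
        # HWE diplotype frequency: p^2 for homozygotes, 2pq for heterozygotes
        f = allele_freqs[a1] * allele_freqs[a2]
        if a1 != a2:
            f *= 2
        score = activity_scores[a1] + activity_scores[a2]
        # Assumed activity-score cutoffs (one common convention, not the paper's)
        if score >= 2:
            label = "normal"
        elif score >= 1:
            label = "intermediate"
        else:
            label = "poor"
        phenotypes[label] = phenotypes.get(label, 0.0) + f
    return phenotypes

# Hypothetical example population (allele frequencies must sum to 1)
freqs = {"*1": 0.80, "*2": 0.12, "*3": 0.08}
scores = {"*1": 1.0, "*2": 0.5, "*3": 0.0}  # assumed per-allele activity values
print(phenotype_frequencies(freqs, scores))
```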
10.1101/2021.01.09.21249499 | Performance and Implementation Evaluation of the Abbott BinaxNOW Rapid Antigen Test in a High-throughput Drive-through Community Testing Site in Massachusetts | BackgroundRapid diagnostic tests (RDTs) for SARS-CoV-2 antigens (Ag) that can be performed at point-of-care (POC) can supplement molecular testing and help mitigate the COVID-19 pandemic. Deployment of an Ag RDT requires an understanding of its operational and performance characteristics under real-world conditions and in relevant subpopulations. We evaluated the Abbott BinaxNOW COVID-19 Ag Card in a high-throughput, drive-through, free community testing site in Massachusetts (MA) using anterior nasal (AN) swab RT-PCR for clinical testing.
MethodsIndividuals presenting for molecular testing in two of seven lanes were offered the opportunity to also receive BinaxNOW testing. Dual AN swabs were collected from symptomatic and asymptomatic children (≤18 years) and adults. BinaxNOW testing was performed in a testing pod with temperature/humidity monitoring. One individual performed testing and official result reporting for each test, but most tests had a second independent reading to assess inter-operator agreement. Positive BinaxNOW results were scored as faint, medium, or strong. Positive BinaxNOW results were reported to patients by phone, and they were instructed to isolate pending RT-PCR results. The paired RT-PCR result was the reference for sensitivity and specificity calculations.
ResultsOf 2482 participants, 1380 adults and 928 children had paired RT-PCR/BinaxNOW results and complete symptom data. 974/1380 (71%) adults and 829/928 (89%) children were asymptomatic. BinaxNOW had 96.5% (95% confidence interval [CI] 90.0-99.3) sensitivity and 100% (98.6-100.0) specificity in adults within 7 days of symptoms, and 84.6% (65.1-95.6) sensitivity and 100% (94.5-100.0) specificity in children within 7 days of symptoms. Sensitivity and specificity in asymptomatic adults were 70.2% (56.6-81.6) and 99.6% (98.9-99.9), respectively, and in asymptomatic children were 65.4% (55.6-74.4) and 99.0% (98.0-99.6), respectively. By cycle threshold (Ct) value cutoff, sensitivity in all subgroups combined (n=292 RT-PCR-positive individuals) was 99.3% with Ct ≤25, 95.8% with Ct ≤30, and 81.2% with Ct ≤35. Twelve false positive BinaxNOW results (out of 2308 tests) were observed; in all twelve, the test bands were faint but otherwise normal, and were noted by both readers. One invalid BinaxNOW result was identified. Inter-operator agreement (positive versus negative BinaxNOW result) was 100% (n = 2230/2230 double reads). Each operator was able to process 20 RDTs per hour. In a separate set of 30 specimens (from individuals with symptoms ≤7 days) run at temperatures below the manufacturer's recommended range (46-58.5°F), sensitivity was 66.7% and specificity 95.2%.
ConclusionsBinaxNOW had very high specificity in both adults and children and very high sensitivity in newly symptomatic adults. Overall, 95.8% sensitivity was observed with Ct ≤30. These data support public health recommendations for use of the BinaxNOW test in adults with symptoms for ≤7 days without RT-PCR confirmation. Excellent inter-operator agreement indicates that an individual can perform and read the BinaxNOW test alone. A skilled laboratorian can perform and read 20 tests per hour. Careful attention to temperature is critical.
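The accuracy metrics in the record above are sensitivity and specificity against an RT-PCR reference with exact 95% confidence intervals. As a minimal sketch of that computation (the 2x2 counts below are hypothetical placeholders, not the study's data), Clopper-Pearson intervals can be derived from the beta distribution:

```python
# Minimal sketch: sensitivity and specificity against a reference standard,
# with exact (Clopper-Pearson) 95% CIs. Counts are hypothetical placeholders.
from scipy.stats import beta

def clopper_pearson(successes, n, alpha=0.05):
    lower = beta.ppf(alpha / 2, successes, n - successes + 1) if successes > 0 else 0.0
    upper = beta.ppf(1 - alpha / 2, successes + 1, n - successes) if successes < n else 1.0
    return lower, upper

def accuracy(tp, fp, fn, tn):
    sens, sens_ci = tp / (tp + fn), clopper_pearson(tp, tp + fn)
    spec, spec_ci = tn / (tn + fp), clopper_pearson(tn, tn + fp)
    return sens, sens_ci, spec, spec_ci

# Hypothetical 2x2 table: antigen test result cross-tabulated against RT-PCR
print(accuracy(tp=90, fp=2, fn=10, tn=500))
```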
10.1101/2021.01.10.21249440 | Serologic Surveillance and Phylogenetic Analysis of SARS-CoV-2 Infection in Hospital Health Care Workers | BACKGROUNDIt is unclear how, when and where health care workers (HCW) working in hospitals are infected with SARS-CoV-2.
METHODSProspective cohort study comprising 4-weekly measurement of SARS-CoV-2 specific antibodies and questionnaires from March to June 2020. We compared SARS-CoV-2 incidence between HCW working in Covid-19 patient care, HCW working in non-Covid-19 patient care and HCW not in patient care. Phylogenetic analyses of SARS-CoV-2 samples from patients and HCW were performed to identify potential transmission clusters.
RESULTSWe included 801 HCW: 439 in the Covid-19 patient care group, 164 in the non-Covid-19 patient care group and 198 in the no patient care group. SARS-CoV-2 incidence was highest in HCW working in Covid-19 patient care (13.2%), as compared with HCW in non-Covid-19 patient care (6.7%, hazard ratio [HR] 2.2, 95% confidence interval [CI] 1.2 to 4.3) and in HCW not working in patient care (3.6%, HR 3.9, 95% CI 1.8 to 8.6). Within the group of HCW caring for Covid-19 patients, SARS-CoV-2 cumulative incidence was highest in HCW working on Covid-19 wards (25.7%), as compared with HCW working on intensive care units (7.1%, HR 3.6, 95% CI 1.9 to 6.9), and HCW working in the emergency room (8.0%, HR 3.3, 95% CI 1.5 to 7.1). Phylogenetic analyses on Covid-19 wards identified multiple potential HCW-to-HCW transmission clusters while no patient-to-HCW transmission clusters were identified.
CONCLUSIONSHCW working on Covid-19 wards are at increased risk for nosocomial SARS-CoV-2 infection, with an important role for HCW-to-HCW transmission.
(Funded by the Netherlands Organization for Health Research and Development ZonMw & the Corona Research Fund Amsterdam UMC; Netherlands Trial Register number NL8645) | infectious diseases |
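The cohort study above reports hazard ratios for seroconversion between exposure groups. Purely as an illustration of how such a hazard ratio could be estimated (this is not the authors' analysis; the data are synthetic and the `lifelines` package is assumed to be installed), a Cox proportional-hazards fit looks like this:

```python
# Illustrative sketch with synthetic data: hazard ratio for SARS-CoV-2
# seroconversion between HCW exposure groups via a Cox proportional-hazards model.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 600
covid_ward = rng.integers(0, 2, n)                    # 1 = works on a Covid-19 ward
weekly_hazard = 0.004 * np.where(covid_ward == 1, 3.0, 1.0)  # arbitrary rates
event_time = rng.exponential(1 / weekly_hazard)       # time to seroconversion
followup = np.minimum(event_time, 14.0)               # administrative censoring at 14 weeks
seroconverted = (event_time <= 14.0).astype(int)

df = pd.DataFrame({"weeks": followup,
                   "seroconverted": seroconverted,
                   "covid_ward": covid_ward})
cph = CoxPHFitter()
cph.fit(df, duration_col="weeks", event_col="seroconverted")
cph.print_summary()   # exp(coef) for covid_ward approximates the hazard ratio
```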
10.1101/2021.01.10.21249382 | Epidemiological impact of prioritizing SARS-CoV-2 vaccination by antibody status: Mathematical modeling analyses | BackgroundVaccines against SARS-CoV-2 have been developed, but their availability falls far short of global needs. This study aimed to investigate the impact of prioritizing available doses on the basis of recipient antibody status, that is by exposure status, using Qatar as an example.
MethodsVaccination impact was assessed under different scale-up scenarios using a deterministic meta-population mathematical model describing SARS-CoV-2 transmission and disease progression in the presence of vaccination.
ResultsFor a vaccine that protects against infection with an efficacy of 95%, prioritizing antibody-negative individuals for vaccination halved the number of vaccinations needed to avert one infection, disease outcome, or death. Prioritization by antibody status reduced incidence at a faster rate and led to faster elimination of infection and return to normalcy. Further prioritization by age group amplified the gains of prioritization by antibody status. Gains from prioritization by antibody status were largest in settings where 30-60% of the population had already been infected at the commencement of vaccination, which is perhaps where most countries will be by the time vaccination programs are up and running. For a vaccine that protects only against disease and not against infection, vaccine impact was reduced by half, whether measured in terms of averted infections or disease outcomes, but the relative gains from using antibody status to prioritize vaccination recipients were similar.
ConclusionsMajor health, societal, and economic gains can be achieved more quickly by prioritizing those who are antibody-negative while doses of the vaccine remain in short supply. | epidemiology |
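The qualitative point of the record above, that doses given to already-immune people are effectively wasted while supply is constrained, can be seen even in a toy compartmental model. The sketch below is a deliberately simple discrete-time SIR with daily forward-Euler steps and arbitrary parameters; it is far cruder than the paper's deterministic meta-population model and is offered only as an illustration.

```python
# Toy discrete-time SIR sketch (daily steps, proportions of a closed population):
# compare spending a fixed daily dose supply on antibody-negative (susceptible)
# people only versus allocating doses without regard to serostatus.
def run(prioritize_seronegative, days=300, R0=2.0, recovery_rate=1/7,
        already_infected=0.40, daily_doses=0.002, efficacy=0.95):
    beta = R0 * recovery_rate
    S = 1.0 - already_infected - 1e-4     # susceptible
    I = 1e-4                              # infectious seed
    R = already_infected                  # antibody-positive (previously infected)
    cumulative_infections = 0.0
    for _ in range(days):
        new_infections = beta * S * I
        recoveries = recovery_rate * I
        if prioritize_seronegative:
            doses_to_S = min(daily_doses, S)                 # no doses "wasted" on R
        else:
            doses_to_S = daily_doses * S / max(S + R, 1e-9)  # doses split by group size
        protected = efficacy * doses_to_S                    # all-or-nothing vaccine
        S = max(S - new_infections - protected, 0.0)
        I += new_infections - recoveries
        R += recoveries
        cumulative_infections += new_infections
    return cumulative_infections

print("prioritize antibody-negative:", run(True))
print("ignore antibody status      :", run(False))
```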
10.1101/2021.01.09.21249384 | International risk of the new variant COVID-19 importations originating in the United Kingdom | A fast-spreading SARS-CoV-2 variant identified in the United Kingdom in December 2020 has raised international alarm. We estimate that, in all 15 countries analyzed, there is at least a 50% chance the variant was imported by travelers from the United Kingdom by December 7th. | epidemiology |
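For intuition about the kind of estimate reported above, the probability of at least one importation can be framed as a simple per-traveler Bernoulli calculation. The paper's actual estimation approach may differ (for example, using time-varying flight volumes and variant prevalence), and the inputs below are hypothetical placeholders.

```python
# Generic illustration (not the authors' model): probability that at least one
# infected traveler carrying the variant arrived, given traveler volume,
# infection prevalence among travelers, and the variant's share of infections.
def importation_probability(n_travelers, prevalence, variant_share):
    p_case = prevalence * variant_share          # per-traveler import probability
    return 1.0 - (1.0 - p_case) ** n_travelers   # P(at least one importation)

print(importation_probability(n_travelers=20_000, prevalence=0.005, variant_share=0.2))
```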
10.1101/2021.01.10.21249524 | Using excess deaths and testing statistics to improve estimates of COVID-19 mortalities | Factors such as non-uniform definitions of mortality, uncertainty in disease prevalence, and biased sampling complicate the quantification of fatality during an epidemic. Regardless of the employed fatality measure, the infected population and the number of infection-caused deaths need to be consistently estimated for comparing mortality across regions. We combine historical and current mortality data, a statistical testing model, and an SIR epidemic model to improve estimation of mortality. We find that excess deaths across the entire US are, on average, 13% higher than the number of reported COVID-19 deaths. In some areas, such as New York City, the number of weekly deaths is about eight times higher than in previous years. Other countries such as Peru, Ecuador, Mexico, and Spain exhibit excess deaths significantly higher than their reported COVID-19 deaths. Conversely, we find negligible or negative excess deaths, both for parts of 2020 and for the full year, in Denmark, Germany, and Norway. | epidemiology |
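The excess-death bookkeeping described in the record above amounts to subtracting a prior-years baseline from observed all-cause deaths and comparing the total with reported COVID-19 deaths. The sketch below uses synthetic weekly counts purely to illustrate the arithmetic; it does not reproduce the paper's data or its statistical testing and SIR components.

```python
# Sketch of the excess-death calculation (synthetic numbers, not the paper's data):
# excess = observed all-cause deaths minus a baseline built from the same weeks
# in previous years, then compared with reported COVID-19 deaths.
import numpy as np

prior_years = np.array([[1000, 1020, 990, 1010, 1005, 995, 1000, 1015, 990, 1000],
                        [1010, 1000, 995, 1020, 1000, 1010, 985, 1005, 1000, 995]])
observed_2020 = np.array([1005, 1100, 1250, 1400, 1350, 1300, 1200, 1150, 1100, 1080])
reported_covid = np.array([0, 60, 180, 300, 280, 250, 160, 120, 90, 70])

baseline = prior_years.mean(axis=0)          # expected deaths per week
excess = observed_2020 - baseline            # weekly excess deaths
total_excess = excess.sum()
ratio = total_excess / reported_covid.sum()  # >1 means excess exceeds reported deaths
print(f"total excess deaths: {total_excess:.0f}, excess / reported = {ratio:.2f}")
```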
10.1101/2021.01.10.20249014 | Cerebrospinal fluid in COVID-19 neurological complications: no cytokine storm or neuroinflammation. | BACKGROUNDNeurological complications occur in COVID-19. We aimed to examine the cerebrospinal fluid (CSF) of COVID-19 subjects with neurological complications and to determine the presence of neuroinflammatory changes implicated in pathogenesis.
METHODSCross-sectional study of CSF neuroinflammatory profiles from 18 COVID-19 subjects with neurological complications, categorized by diagnosis (stroke, encephalopathy, headache) and illness severity (critical, severe, moderate, mild). COVID-19 CSF was compared with CSF from healthy controls and from controls with infectious or neuroinflammatory disorders or stroke (n=82). Cytokines (IL-6, TNF, IFNγ, IL-10, IL-12p70, IL-17A), inflammation and coagulation markers (high-sensitivity C-reactive protein [hsCRP], ferritin, fibrinogen, D-dimer, Factor VIII), and neurofilament light chain (NF-L) were quantified. SARS-CoV-2 RNA and SARS-CoV-2 IgG and IgA antibodies in CSF were tested with RT-PCR and ELISA, respectively.
RESULTSCSF from COVID-19 subjects showed a paucity of neuroinflammatory changes, with an absence of pleocytosis and no specific increases in pro-inflammatory markers or cytokines (IL-6, ferritin, or D-dimer). Anti-SARS-CoV-2 antibodies were observed in the CSF of 77% of COVID-19 subjects despite no evidence of SARS-CoV-2 viral RNA. A similar increase of pro-inflammatory cytokines (IL-6, TNF, IL-12p70) and IL-10 was observed in the CSF of COVID-19 and non-COVID-19 stroke subjects compared with controls. CSF-NF-L was elevated in subjects with stroke and in those with critical COVID-19. CSF-hsCRP was present almost exclusively in COVID-19 cases.
CONCLUSIONThe paucity of neuroinflammatory changes in the CSF of COVID-19 subjects and the lack of SARS-CoV-2 RNA do not support the presumed neurovirulence of SARS-CoV-2 or a role for neuroinflammation in the pathogenesis of neurological complications in COVID-19. Elevated CSF-NF-L indicates neuroaxonal injury in COVID-19 cases. The role of CSF SARS-CoV-2 IgG antibodies remains undetermined.
FUNDINGThis work was supported by NIH R01-NS110122 and The Bart McLean Fund for Neuroimmunology Research. | neurology |
10.1101/2021.01.11.20248788 | Explainable AI enables clinical trial patient selection to retrospectively improve treatment effects in schizophrenia | BackgroundHeterogeneity among patients' responses to treatment is prevalent in psychiatric disorders. Personalized medicine approaches - which involve parsing patients into subgroups better indicated for a particular treatment - could therefore improve patient outcomes and serve as a powerful tool in patient selection within clinical trials. Machine learning approaches can identify patient subgroups but are often not "explainable" due to the use of complex algorithms that do not mirror clinicians' natural decision-making processes.
MethodsHere we combine two analytical approaches - Personalized Advantage Index and Bayesian Rule Lists - to identify paliperidone-indicated schizophrenia patients in a way that emphasizes model explainability. We apply these approaches retrospectively to randomized, placebo-controlled clinical trial data to identify a paliperidone-indicated subgroup of schizophrenia patients who demonstrate a larger treatment effect (outcome on treatment superior to that on placebo) than that of the full randomized sample, as assessed with Cohen's d. For this study, the outcome corresponded to a reduction in the Positive and Negative Syndrome Scale (PANSS) total score, which measures positive (e.g., hallucinations, delusions), negative (e.g., blunted affect, emotional withdrawal), and general psychopathological (e.g., disturbance of volition, uncooperativeness) symptoms in schizophrenia.
ResultsUsing our combined explainable AI approach to identify a subgroup more responsive to paliperidone than to placebo, the treatment effect increased significantly over that of the full sample (p<0.0001 for a one-sample t-test comparing the full-sample Cohen's d=0.82 with a generated distribution of subgroup Cohen's d values with mean d=1.22, standard deviation 0.09). In addition, our modeling approach produces simple logical statements (if-then-else), termed a "rule list", to ease interpretability for clinicians. A majority of the rule lists generated from cross-validation found two general psychopathology symptoms, disturbance of volition and uncooperativeness, to predict membership in the paliperidone-indicated subgroup.
ConclusionsThese results help to technically validate our explainable AI approach to patient selection for a clinical trial by identifying a subgroup with an improved treatment effect. The explainable rule lists also suggest that paliperidone may provide an improved therapeutic benefit for schizophrenia patients with high disturbance of volition or high uncooperativeness.
Trial Registrationclinicaltrials.gov identifier: NCT00083668; registered May 28, 2004 | psychiatry and clinical psychology |
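The effect-size comparison in the record above rests on Cohen's d for PANSS change, treatment versus placebo, computed in the full sample and again in a selected subgroup. The sketch below illustrates only that comparison on synthetic data; it is not the Personalized Advantage Index or Bayesian Rule List pipeline, and the response magnitudes are invented for demonstration.

```python
# Minimal sketch (synthetic data): Cohen's d for PANSS reduction, treatment vs
# placebo, in the full sample and in a hypothetical "indicated" subgroup.
import numpy as np

def cohens_d(a, b):
    na, nb = len(a), len(b)
    pooled_sd = np.sqrt(((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1))
                        / (na + nb - 2))
    return (a.mean() - b.mean()) / pooled_sd

rng = np.random.default_rng(1)
# PANSS reduction (positive = improvement); subgroup members respond more strongly
indicated = rng.integers(0, 2, 400).astype(bool)   # hypothetical subgroup labels
treated = rng.normal(8 + 6 * indicated, 10)        # treatment arm outcomes
placebo = rng.normal(4, 10, 400)                   # placebo arm outcomes

print("full sample d:", cohens_d(treated, placebo))
print("subgroup d   :", cohens_d(treated[indicated], placebo[indicated]))
```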