id | title | abstract | category |
---|---|---|---|
10.1101/2022.04.28.22274420 | Differences in epidemiology of enteropathogens in children pre- and post-rotavirus vaccine introduction in Kilifi, coastal Kenya | BackgroundIn July 2014, Kenya introduced the Rotarix® vaccine into its national immunization program. The impact of this vaccination programme on the local epidemiology of enteropathogens is unclear.
MethodsThe TaqMan Array Card (TAC) was used for screening for 28 different enteropathogens in 718 stools from children less than 13 years of age who presented with diarrhea and were admitted to Kilifi County Hospital, coastal Kenya, in 2013 (before vaccine introduction) and in 2016-2018 (after vaccine introduction). The differences between the pre- and post-Rotarix® vaccination periods were examined using univariate and multivariable logistic regression.
ResultsIn 665 specimens (92.6%), one or more enteropathogens were detected, while in 323 specimens (48.6%), three or more enteropathogens were detected. There was a significant increase in the proportion of samples containing enteroaggregative Escherichia coli (35.7% vs 45.3%, p=0.014), cytomegalovirus (4.2% vs 9.9%, p=0.008), Vibrio cholerae (0.0% vs 2.3%, p=0.019), Strongyloides species (0.8% vs 3.6%, p=0.048) and Dientamoeba fragilis (2.1% vs 7.8%, p=0.004) post-vaccine introduction. Sapovirus detection decreased significantly (7.6% vs 4.0%, p=0.030) post-vaccine introduction. The proportion of samples that tested positive for rotavirus group A did not statistically differ between the pre- and post-vaccine periods (27.4% vs. 23.5%, p=0.253).
ConclusionsIn this setting, the burden of childhood enteropathogen infection was high both pre- and post-rotavirus vaccination introduction, with some specific changes in the burden of enteropathogens in hospitalized children after rotavirus vaccination introduction. | infectious diseases |
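The pre/post comparison described above is, in essence, a univariate logistic regression of pathogen detection on study period. A minimal sketch of that calculation in Python, using statsmodels with invented toy data (column names and values are illustrative, not the study's):

```python
# Hypothetical sketch: univariate logistic regression of pathogen detection
# (e.g. enteroaggregative E. coli by TAC) on pre/post-vaccine period.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "eaec_detected": [1, 0, 1, 1, 0, 1, 0, 1],  # 1 = pathogen detected (toy data)
    "post_vaccine":  [0, 0, 0, 0, 1, 1, 1, 1],  # 1 = 2016-2018 period
})

fit = smf.logit("eaec_detected ~ post_vaccine", data=df).fit(disp=0)
odds_ratio = np.exp(fit.params["post_vaccine"])
print(f"OR (post vs pre): {odds_ratio:.2f}, p = {fit.pvalues['post_vaccine']:.3f}")
```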
10.1101/2022.04.29.22274435 | Iron status and risk of sepsis: a Mendelian randomisation analysis. | ObjectiveTo evaluate the association between four iron biomarkers and sepsis.
DesignAn observational cohort and two sample Mendelian randomisation (MR) study.
SettingThe UK Biobank prospective cohort study (for the observational cohort and for MR outcomes), the FinnGen cohort study (replication of MR outcomes) and three large genome-wide association studies (GWAS, for iron exposures).
Participants453,169 participants enrolled in UK Biobank, 356,000 participants enrolled in FinnGen and between 131,471 and 246,139 participants enrolled across the three iron biomarker GWAS.
ExposuresIn the observational cohort, iron status was determined using ferritin levels recorded in general practice data. In the MR analyses, four iron biomarkers were used: serum iron, serum ferritin, the total iron binding capacity (TIBC), and transferrin saturation (TSAT).
Main outcome measuresHospital admission with an ICD-10 coded sepsis diagnosis.
ResultsIn the observational cohort, the odds of sepsis increased as ferritin levels increased, with odds ratios >1 once ferritin exceeded 160 ug/L, a value within the accepted normal reference range for ferritin. Extremely low ferritin (<40 ug/L) was also associated with increased odds of sepsis.
In inverse-variance-weighted Mendelian randomisation analyses, increases in all iron biomarkers were associated with increasing odds of sepsis (OR 1.10 for each SD increase in TSAT; 95% CI 1.03 to 1.17), with similar results for serum iron (OR 1.07; 95% CI 0.98 to 1.16) and ferritin (OR 1.09; 95% CI 0.99 to 1.20), and the opposite result for TIBC (OR 0.92; 95% CI 0.86 to 0.99). Effect estimates were slightly larger in those with iron deficiency or anaemia.
Results were similar in the replication cohort (FinnGen) and were robust to sensitivity analyses of Mendelian randomisation.
ConclusionsIncreasing measures of iron and related biomarkers are associated with increased risk of sepsis in a healthy adult volunteer cohort. MR analyses suggest this association is causal. Clinicians and policymakers should be aware of this potential risk when manipulating iron levels. | infectious diseases |
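For context, the inverse-variance-weighted (IVW) estimate referenced above combines per-variant Wald ratios, weighting each by the inverse of its squared standard error. A self-contained sketch with invented summary statistics (not the study's):

```python
# Hypothetical sketch of the fixed-effect IVW two-sample MR estimator.
import numpy as np

beta_exposure = np.array([0.12, 0.08, 0.15])    # SNP -> iron biomarker (toy values)
beta_outcome  = np.array([0.010, 0.009, 0.014]) # SNP -> sepsis log-odds (toy values)
se_outcome    = np.array([0.004, 0.005, 0.006])

wald = beta_outcome / beta_exposure             # per-SNP causal estimates
se_wald = se_outcome / np.abs(beta_exposure)    # first-order delta approximation
w = 1.0 / se_wald**2                            # inverse-variance weights

beta_ivw = np.sum(w * wald) / np.sum(w)
se_ivw = np.sqrt(1.0 / np.sum(w))
lo, hi = beta_ivw - 1.96 * se_ivw, beta_ivw + 1.96 * se_ivw
print(f"OR per SD: {np.exp(beta_ivw):.2f} (95% CI {np.exp(lo):.2f} to {np.exp(hi):.2f})")
```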
10.1101/2022.04.28.22274441 | Leveraging infectious disease models to interpret randomized controlled trials: controlling enteric pathogen transmission through water, sanitation, and hygiene interventions | Randomized controlled trials (RCTs), which evaluate hypotheses in specific contexts, are often considered the gold standard of evidence for infectious disease interventions, but their results cannot immediately generalize to other contexts. Mechanistic models are one approach to generalizing findings between contexts, but infectious disease transmission models are not immediately suited for analyzing RCTs, since they often rely on time-series surveillance data that are rarely collected by RCTs. We developed a modeling framework to explain the main outcome of an infectious disease RCT--relative risk--and applied it to a water, sanitation, and hygiene (WASH) RCT. This model can generalize the RCT results to other contexts and conditions. We developed this compartmental modeling framework to account for key WASH RCT factors: i) transmission across multiple environmental pathways, ii) multiple interventions applied individually and in combination, iii) adherence to interventions or preexisting conditions, and iv) the impact of individuals not enrolled in the study. We employed a hybrid sampling-importance resampling and estimation framework to obtain posterior estimates of mechanistic parameters and their uncertainties and illustrated our model using WASH Benefits Bangladesh RCT data (n=17,187). Our model reproduced reported diarrheal prevalence in this RCT. The baseline estimate of the basic reproduction number R0 for the control arm (1.15, 95% CI: 1.09, 1.27) corresponded to an endemic prevalence of 13% (95% CI: 9-21%) in the absence of intervention or preexisting WASH conditions. No single pathway was likely able to sustain transmission: pathway-specific R0s for water, fomites, and all other pathways were 0.49 (95% CI: 0.07, 0.99), 0.26 (95% CI: 0.04, 0.57), and 0.40 (95% CI: 0.02, 0.88), respectively. An infectious disease modeling approach to evaluating RCTs can complement RCT analysis by providing a rigorous framework for generating data-driven hypotheses that explain trial findings, particularly unexpected null results, opening up existing data to deeper epidemiological understanding.
Author summaryA randomized controlled trial (RCT) testing an intervention to reduce infectious disease transmission can provide high-quality scientific evidence about the impact of that intervention in a specific context, but the results are often difficult to generalize to other policy-relevant contexts and conditions. Infectious disease transmission models can be used to explore what might happen to disease dynamics under different conditions, but the standard use of these models is to fit to longitudinal, surveillance data, which is rarely collected by RCTs. We developed a framework to fit an infectious disease model to steady-state diarrheal prevalence data in water, sanitation, and hygiene RCTs, explicitly accounting for completeness, coverage, and compliance. Although this framework is developed with water, sanitation, and hygiene interventions for enteropathogens in mind, it could be extended to other disease contexts. By leveraging existing large-scale RCT data sets, it will be possible to better understand the underlying disease epidemiology and investigate the likely outcomes of policy-relevant scenarios. Ultimately, this work can be incorporated into decision making for public health policy and programs. | epidemiology |
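The reported correspondence between the control-arm R0 of 1.15 and an endemic prevalence of roughly 13% is consistent with the steady-state relation of a simple SIS-type model, I* = 1 - 1/R0. A quick check under that simplifying assumption (the paper's actual compartmental model is more elaborate):

```python
# Steady-state prevalence of a simple SIS model: I* = 1 - 1/R0.
R0 = 1.15
endemic_prevalence = 1 - 1 / R0
print(f"endemic prevalence ~ {endemic_prevalence:.0%}")  # ~13%, matching the abstract
```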
10.1101/2022.04.28.22274437 | Genome-wide analysis of binge-eating disorder identifies the first three risk loci and implicates iron metabolism | Binge-eating disorder (BED) is the most common eating disorder yet its genetic architecture remains largely unknown. Studying BED is challenging because it is often comorbid with obesity, a common and highly polygenic trait, and it is underdiagnosed in biobank datasets. To address this limitation, we apply a supervised machine learning approach to estimate the probability of each individual having BED based on electronic medical records from the Million Veteran Program. We perform a genome-wide association study on individuals of African (n = 77,574) and European (n = 285,138) ancestry while controlling for body mass index to identify three independent loci near the HFE, MCHR2 and LRP11 genes, which are reproducible across three independent cohorts. We identify genetic association between BED and several neuropsychiatric traits and implicate iron metabolism in the pathophysiology of BED. Overall, our findings provide insights into the genetics underlying BED and suggest directions for future translational research. | psychiatry and clinical psychology |
10.1101/2022.04.28.22274341 | PCR Testing in the UK During the SARS-CoV-2 Pandemic - Evidence from FOI Requests | Polymerase Chain Reaction ("PCR") tests have been used to identify cases of COVID-19 during the course of the pandemic. Notably, PCR alone cannot differentiate between the presence of whole viruses (which can be transmitted and infect individuals) and small fragments of genetic material that are not infectious. A feature of PCR known as the cycle threshold (Ct) can be used to discriminate between these states, but the relationship between Ct and infectiousness is still poorly understood.
This well-known limitation of the test compromises the identification of cases and their trends, and consequently those measures to interrupt transmission (such as isolation) that are undertaken on the basis of reliably identifying infectious individuals.
Here, we interrogate the public authorities' understanding of PCR testing for SARS-CoV-2 in the UK by accessing Freedom of Information (FOI) requests submitted in 2020-21. We searched WhatDoTheyKnow and found 300 FOI requests from over 150 individuals. We grouped their questions into four themes addressing the number of tests in use, the reporting of cycle thresholds (Ct), the Ct values themselves, and the accuracy of tests.
The number of validated tests in use in the UK is currently not clear: In FOI responses, Public Health England (PHE) report it may be "80" or "85". However, European regulations suggest there could be over 400 different CE marked tests available on the market and available for use. Laboratories have a statutory duty to report positive cases to PHE, but they do not have to advise which tests they are using nor submit Ct values. Only two FOI responses provided answers on Ct values, indicating that in a set time span, 24-38% of the Ct values were over 30. The most common FOI asked if there was a cycle threshold for positivity. In those that responded, the Ct for a positive result varied from 30 to 45. We found limited information on the technical accuracy of the tests. Several responses stated there is no static, specific or standard cycle threshold.
The current system requires significant changes to ensure it offers accurate diagnostic data to enable effective clinical management of SARS-CoV-2. PCR is an important and powerful tool, but its systematic misuse and misreporting risk undermining its usefulness and credibility. | public and global health |
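To make the Ct discussion above concrete: PCR approximately doubles the target sequence each cycle, so a difference in cycle threshold implies an exponential difference in starting genetic material. An idealized illustration, assuming 100% amplification efficiency:

```python
# A Ct difference of dCt implies roughly a 2**dCt fold difference in
# starting material (idealized; real assay efficiency is below 100%).
for ct_low, ct_high in [(30, 40), (30, 45)]:
    fold = 2 ** (ct_high - ct_low)
    print(f"Ct {ct_high} vs Ct {ct_low}: ~{fold:,}x less starting material")
```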
10.1101/2022.04.28.22273177 | Occupational differences in SARS-CoV-2 infection: Analysis of the UK ONS Coronavirus (COVID-19) Infection Survey | BackgroundConsiderable concern remains about how occupational SARS-CoV-2 risk has evolved during the COVID-19 pandemic. We aimed to ascertain which occupations had the greatest risk of SARS-CoV-2 infection and explore how relative differences varied over the pandemic.
MethodsAnalysis of cohort data from the UK Office for National Statistics Coronavirus (COVID-19) Infection Survey from April 2020 to November 2021. This survey is designed to be representative of the UK population and uses regular PCR testing. Cox and multilevel logistic regression were used to compare SARS-CoV-2 infection between occupational/sector groups, overall and by four time periods with interactions, adjusted for age, sex, ethnicity, deprivation, region, household size, urban/rural neighbourhood and current health conditions.
ResultsBased on 3,910,311 observations from 312,304 working-age adults, elevated risks of infection were seen overall for social care (HR 1.14; 95% CI 1.04 to 1.24), education (HR 1.31; 95% CI 1.23 to 1.39), bus and coach drivers (HR 1.43; 95% CI 1.03 to 1.97) and police and protective services (HR 1.45; 95% CI 1.29 to 1.62) when compared to non-essential workers. By time period, relative differences were more pronounced early in the pandemic. For healthcare, elevated odds in the early waves switched to a reduction in the later stages. Education saw rises after the initial lockdown, and these have persisted. Adjustment for covariates made very little difference to effect estimates.
ConclusionsElevated risks among healthcare workers have diminished over time but education workers have had persistently higher risks. Long-term mitigation measures in certain workplaces may be warranted.
What is already known on this topicSome occupational groups have shown increased rates of disease and mortality relating to COVID-19.
What this study addsRelative differences between occupational groups have varied during different stages of the COVID-19 pandemic with risks for healthcare workers diminishing over time and workers in the education sector seeing persistent elevated risks.
How this study might affect research, practice or policyIncreased long term mitigation such as ventilation should be considered in sectors with a persistent elevated risk. It is important for workplace policy to be responsive to evolving pandemic risks. | occupational and environmental health |
10.1101/2022.04.29.22274513 | Medical text prediction and suggestion using generative pre-trained transformer models with dental medical notes | BackgroundGenerative pre-trained transformer (GPT) models are among the latest large pre-trained natural language processing (NLP) models; they enable model training with limited data, reducing dependency on large datasets, which are scarce and costly to establish and maintain. There is rising interest in exploring the use of GPT models in healthcare.
ObjectiveWe investigate the performance of GPT-2 and GPT-Neo models for medical text prediction using 374,787 free-text dental notes.
MethodsWe fine-tune pre-trained GPT-2 and GPT-Neo models for next word prediction on a dataset of over 374,000 manually written sections of dental clinical notes. Each model was trained on 80% of the dataset, validated on 10%, and tested on the remaining 10%. We report model performance in terms of next word prediction accuracy and loss. Additionally, we analyze the performance of the models on different categories of prediction tokens: we annotate each token in 100 randomly sampled notes by category (e.g. names, abbreviations, clinical terms, punctuation) and compare the performance of each model by token category.
ResultsModels present acceptable accuracy scores (GPT-2: 76%, GPT-Neo: 53%), and the GPT-2 model also performs better in manual evaluations, especially for names, abbreviations, and punctuation.
ConclusionThe results suggest that pre-trained models have the potential to assist medical charting in the future. Our study presents one of the first implementations of GPT models with medical notes. We share the lessons learned, insights, and suggestions for future implementations. | health informatics |
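The next-word-prediction evaluation described above can be sketched with the Hugging Face transformers API; the checkpoint name and sample note below are placeholders (the study's fine-tuned dental-note models are not public):

```python
# Hypothetical sketch: top-1 next-token accuracy of a causal LM on one note.
import torch
from transformers import GPT2TokenizerFast, GPT2LMHeadModel

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")   # placeholder checkpoint
model = GPT2LMHeadModel.from_pretrained("gpt2").eval()

note = "Patient presents with pain in tooth 14, percussion positive."  # invented
ids = tokenizer(note, return_tensors="pt").input_ids

with torch.no_grad():
    logits = model(ids).logits

# Predictions at position t are compared with the actual token at t+1.
preds = logits[:, :-1, :].argmax(dim=-1)
accuracy = (preds == ids[:, 1:]).float().mean().item()
print(f"next-token accuracy on this note: {accuracy:.2f}")
```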
10.1101/2022.04.30.22274525 | HematoNet: Expert Level Classification of Bone Marrow Cytology Morphology in Hematological Malignancy with Deep Learning | There have been few efforts to automate the cytomorphological categorization of bone marrow cells. For bone marrow cell categorization, deep-learning algorithms have been limited to a small number of samples or disease classifications. In this paper, we propose a pipeline to classify bone marrow cells despite these limitations. Data augmentation was used throughout the data to resolve class imbalances: random transformations such as rotating between 0° and 90°, zooming in/out, flipping horizontally and/or vertically, and translating were performed. The model used in the pipeline was a CoAtNet, which was compared with two baseline models, EfficientNetV2 and ResNext50. We then analyzed the CoAtNet model using SmoothGrad and Grad-CAM, two recently developed algorithms that have been shown to meet the fundamental requirements for explainability methods. After evaluating all three models' performance for each of the distinct morphological classes, the proposed CoAtNet model outperformed the EfficientNetV2 and ResNext50 models, which we attribute to its attention mechanism; performance was summarized using precision-recall curves. | health informatics |
10.1101/2022.04.29.22274506 | Trends in diabetes incidence and associated risk factors among people living with HIV in the current treatment era, 2008-2018 | People with HIV (PWH) have an increased risk of diabetes mellitus. Our objectives were to characterize the prevalence and incidence of diabetes in a cohort of PWH, and to evaluate both traditional and HIV-specific risk factors contributing to incident diabetes diagnoses. We conducted a retrospective study at a Southeastern US academic HIV clinic. All PWH over 18 years of age who attended at least two clinic visits between 2008 and 2018 were evaluated to assess time to diabetes incidence. Laboratory, demographic, clinical, medication and diagnosis data were extracted from the clinic EMR. Diabetes was defined when at least two of the following three criteria were met: (1) laboratory data consistent with a diagnosis as defined by the ADA standards of care (Hgb A1C ≥ 6.5% and/or two glucose results >200 mg/dl at least 30 days apart), (2) diagnosis of diabetes in the EMR, or (3) exposure to diabetes medication. Time to diabetes incidence was computed from the entire clinic population for each study year. Univariate Cox proportional hazards models were developed to evaluate associations between each baseline factor and time to diabetes. Multivariable Cox proportional hazards regression models with time-dependent covariates were created to evaluate the independent association between significant parameters from univariate analyses and time to incident diabetes. Of the 4113 PWH included in the analysis, we identified 252 incident cases of diabetes. In multivariable analysis, BMI classification, liver disease, steroid exposure, and use of integrase inhibitors were associated with incident diabetes. Additional associated factors included lower CD4 cell counts, duration of HIV infection, exposure to non-statin lipid-lowering therapy, and dyslipidemia. Incident diabetes rates are increasing at an alarming rate among PWH: diabetes prevalence increased from 8.8% in 2008 to 14% in 2018. Both traditional and HIV-related risk factors, particularly body weight, steroid exposure, and use of integrase inhibitors, were associated with incident diabetes. Notably, several of the risk factors identified are modifiable and should be targeted for intervention. | hiv aids |
10.1101/2022.04.29.22274468 | Design and validation of HIV peptide pools for detection of HIV-specific CD4+ and CD8+ T cells | Reagents to monitor T cell responses to the entire HIV genome, based on well-characterized epitopes, are lacking. Evaluation of HIV-specific T cell responses is important for studying natural infection, and therapeutic and vaccine interventions. Experimentally derived CD4+ and CD8+ HIV epitopes from the HIV molecular immunology database were developed into Class I and Class II HIV megapools (MPs). We assessed HIV responses in persons with HIV pre-combined antiretroviral therapy (cART) (n=17) and post-cART (n=18) and compared these responses to 15 controls without HIV (matched by sex, age, and ethnicity). Using the Activation Induced Marker (AIM) assay, we quantified HIV-specific total CD4+, memory CD4+, circulating T follicular helper, total CD8+ and memory CD8+ T cells. We also compared the Class I and Class II HIV MPs to commercially available HIV gag peptide pools. Overall, the HIV Class II MP detected HIV-specific CD4+ T cells in 21/35 (60%) HIV-positive samples and 0/15 HIV-negative samples. The HIV Class I MP detected HIV-specific CD8+ T cells in 17/35 (48.6%) HIV-positive samples and 0/15 HIV-negative samples. Our innovative HIV MPs are reflective of the entire HIV genome, and their performance is comparable to other commercially available peptide pools. Here, we detected HIV-specific CD4+ and CD8+ T cell responses in people on and off cART, but not in people without HIV. | hiv aids |
10.1101/2022.04.29.22274433 | Brief: Implementation of a Novel Clinic/Community Partnership Addressing Food Insecurity Among Adults with HIV in the Southern United States | Food insecurity is highly prevalent among people with HIV. Traditional calorie-rich, nutrient-poor food assistance programs may improve food security but increase risk for other chronic diseases. This case study describes the process evaluation of a novel clinic/community partnership to provide nutritionally adequate, tailored food assistance to adults with HIV in Alabama. Methods used include semi-structured interviews with program staff at Birmingham AIDS Outreach and the University of Alabama at Birmingham's 1917 HIV/AIDS Clinic, and analysis of descriptive characteristics of individuals enrolled in the food program for a minimum of one year between 2017 and 2019. The new program served 1,311 patients and enabled more than 300 previously lost-to-follow-up patients to re-engage in HIV care. The program implementation reviewed here can serve as a roadmap to develop clinic/community partnerships focused on a variety of health outcomes and quality of life among food insecure patients. | hiv aids |
10.1101/2022.04.29.22274267 | Multi-omics identify LRRC15 as a COVID-19 severity predictor and persistent pro-thrombotic signals in convalescence | Patients with end-stage kidney disease (ESKD) are at high risk of severe COVID-19. Here, we performed longitudinal blood sampling of ESKD haemodialysis patients with COVID-19, collecting samples pre-infection, serially during infection, and after clinical recovery. Using plasma proteomics, and RNA-sequencing and flow cytometry of immune cells, we identified transcriptomic and proteomic signatures of COVID-19 severity, and found distinct temporal molecular profiles in patients with severe disease. Supervised learning revealed that the plasma proteome was a better indicator of clinical severity than the peripheral blood mononuclear cell (PBMC) transcriptome. We showed that both the levels and trajectory of plasma LRRC15, a proposed co-receptor for SARS-CoV-2, are the strongest predictors of clinical outcome. Strikingly, we observed that two months after the acute infection, patients still display dysregulated gene expression related to vascular, platelet and coagulation pathways, including PF4 (platelet factor 4), which may explain the prolonged thrombotic risk following COVID-19. | infectious diseases |
10.1101/2022.04.28.22274174 | Promising efficacy of a third dose of mRNA SARS-CoV-2 vaccine in patients treated with anti-CD20 antibody who failed 2-dose vaccination | Anti-CD20 antibodies react with CD20 expressed not only on malignant B cells but also on normal B cells. It has been reported that patients treated with anti-CD20 antibodies have an insufficient response to two-dose mRNA SARS-CoV-2 vaccination. To investigate the efficacy of a third dose in these patients, we measured serum IgG antibody titers against the S1 protein after a third vaccination in 22 patients treated with anti-CD20 antibody who had failed two-dose vaccination. Overall, 50% of patients seroconverted. Although no patient who received the third dose within 1 year of the last anti-CD20 antibody administration showed an increase in S1 antibody titer, 69% of patients who received the third dose more than 1 year after the last anti-CD20 antibody administration seroconverted. Our data show that a third vaccine dose is effective in improving the seroconversion rate in patients treated with anti-CD20 antibody who failed standard two-dose vaccination. | infectious diseases |
10.1101/2022.04.30.22274523 | Characteristics of admissions for Kawasaki Disease from 1997 to 2012: Lessons from a national database | A majority of large epidemiologic studies on Kawasaki Disease have come from Asia. There is a paucity of data assessing Kawasaki Disease on a national level in the U.S., particularly in terms of hospitalization co-morbidities and cost. This study set forth to analyze data from the Kids Inpatient Database from 1997 to 2012. Data were analyzed for age, race, cardiogenic shock, acute kidney injury, liver failure, acute respiratory distress syndrome, arrhythmia, and congenital heart disease. Additionally, multivariate regression analysis was performed to assess the impact of Kawasaki Disease on coronary artery aneurysms, ECMO, length of stay, cost of stay, and mortality. Asian and Pacific Islander children were disproportionately affected by Kawasaki Disease in the U.S. (20.8% of Kawasaki Disease admissions vs 3.3% of all other pediatric hospital admissions, p<0.01). Patients hospitalized for Kawasaki Disease had an increased risk of developing coronary artery aneurysms (OR 2,839, 95% CI 2,2985-3,527) and cardiogenic shock (OR 3.42, 95% CI 2.18-5.37). Patients with Kawasaki disease were less likely to have congenital heart disease (OR 0.62, 95% CI 0.55-0.69), arrhythmia (OR 0.31, 95% CI 0.11-0.84), and acute respiratory distress syndrome (OR 0.29, 95% CI 0.19-0.43). Kawasaki disease patients had a shorter hospitalization length of stay by 2.59 days (p<0.01) and decreased cost of stay by $5,513 (p<0.01). Kawasaki Disease had lower mortality when compared to all other admissions (OR 0.03, 95% CI 0.01-0.09). No significant differences were found for ECMO, acute kidney injury, or liver failure. | cardiovascular medicine |
10.1101/2022.04.29.22274467 | Survival analysis of coronary care unit patients from MIMIC IV database: the role of potassium in cardiac patients | The profile of hospitalizations in coronary care units (CCU) includes patients of different age groups, with multiple comorbidities and causes of hospitalization that may or may not be primarily cardiac. This study aimed to estimate survival time and evaluate the association and impact of different factors on this time in a cohort of patients admitted to the CCU. A cohort of 7120 adult patients admitted to the CCU was analyzed from a subset of data from the MIMIC-IV database (Medical Information Mart for Intensive Care, version 4). A descriptive analysis was performed using Kaplan-Meier survival analysis, with a log-rank test to establish comparisons between groups. Survival regression was modeled using Cox proportional hazards models for the multivariable analysis. A p-value < 0.05 was considered statistically significant. Patients who died during hospitalization had a higher average age, longer hospital stay, and higher heart and respiratory rates, all with p < 0.001. Median overall survival was 28 days (95% CI 26-30 days). The survival probability curve declined most steeply in the first weeks, reaching a stable value close to 20% at 10 weeks after hospitalization. When Cox regression adjusted for age, gender and comorbidities was performed, hyperkalemia was shown to be an independent risk factor for in-hospital mortality (RR = 1.22, 95% CI: 1.14-1.30) in this group of patients. These results reinforce that the electronic health record may contain, already in the first hours of hospitalization, relevant information to understand the progression of diseases and identify future directions for research. This study is expected to clarify important topics related to the MIMIC-IV database and enable further research using this patient database. Knowledge of the characteristics of the CCU population can allow better management of physical and human hospital resources. | cardiovascular medicine |
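The Kaplan-Meier and Cox workflow described above is straightforward to reproduce with the lifelines library; the toy rows below are invented and are not MIMIC-IV fields:

```python
# Hypothetical sketch of the survival analysis pipeline (toy data).
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

df = pd.DataFrame({
    "days": [5, 28, 12, 40, 9, 33, 21, 60],   # follow-up time in days
    "died": [1, 0, 1, 0, 1, 0, 1, 0],         # 1 = in-hospital death
    "age":  [71, 58, 80, 63, 75, 69, 66, 59],
    "hyperkalemia": [1, 0, 1, 1, 0, 0, 1, 0],
})

km = KaplanMeierFitter().fit(df["days"], df["died"])
print("median survival:", km.median_survival_time_)

cph = CoxPHFitter().fit(df, duration_col="days", event_col="died")
cph.print_summary()  # hazard ratios for age and hyperkalemia
```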
10.1101/2022.04.30.22274528 | Alternative Factor Prescribing after Low-Dose Recombinant Factor VIIa Protocol in Cardiac Surgery | BackgroundSafety concerns exist with the off-label use of recombinant factor VIIa (rFVIIa, NovoSeven RT®) for refractory bleeding in cardiac surgery, including an increased risk of thromboembolism. An rFVIIa protocol was implemented in December 2015 to standardize rFVIIa use for cardiac surgery-related hemorrhage.
MethodsWe performed a retrospective, observational review of rFVIIa in adult cardiac surgery patients pre-protocol (January 2015 to November 2015) vs. post-protocol (December 2015 to March 2016). Study outcomes were rate of rFVIIa administration, rFVIIa dosing characteristics, length of stay, mortality, readmission rate, need for re-exploration, and rate of 4-factor Prothrombin Complex Concentrates (PCC; Kcentra(R)) administration.
ResultsThere was a significant reduction in the percentage of cardiac surgery cases receiving rFVIIa pre- vs. post-protocol (14.3% vs. 5.2%, p=0.015). Average total dose per patient decreased between groups (81.4 vs. 56.6 mcg/kg, p=0.059). In-hospital mortality, length of stay, need for re-exploration, readmission rates and 30-day mortality did not differ. Although four-factor PCC use significantly increased post-protocol (2.5% vs. 8%, p=0.02), overall use of factor products, rFVIIa or 4-factor PCC, did not change between study periods (16.8% vs. 13%, p=0.416). Mean cost of either rFVIIa or 4-factor PCC pre-protocol was significantly higher than post-protocol ($8,778 vs. $4,421, p=0.0008).
ConclusionsThe use of rFVIIa decreased after implementation of an rFVIIa protocol targeting 30 mcg/kg/dose without compromising morbidity or mortality outcomes. Four-factor PCC use significantly increased during the study, but overall cost was reduced. Institutions wanting to implement an rFVIIa protocol should take careful measures to concurrently address off-label use of 4-factor PCC. | cardiovascular medicine |
10.1101/2022.04.29.22274483 | Effectiveness of ChAdOx1-S COVID-19 Booster Vaccination against the Omicron and Delta variants in England | BackgroundDespite the potential widespread global use of the ChAdOx1-S booster, to date there are no published data on its real-world effectiveness. Vaccine effectiveness (VE) studies have found one and two doses of the ChAdOx1-S vaccine to be highly effective, and clinical trial data have demonstrated enhanced immunity following a ChAdOx1-S booster. In England, some individuals received a ChAdOx1-S booster where vaccination with mRNA vaccines was clinically contraindicated.
MethodsThe demographic characteristics of those who received a ChAdOx1-S booster were compared to those who received a BNT162b2 booster. A test-negative case control design was used to estimate vaccine effectiveness of the ChAdOx1-S booster against symptomatic disease and hospitalisation in England.
FindingsThose who received a ChAdOx1-S booster were more likely to be female (adjusted odds ratio (OR) 1.67 (1.64-1.71)), in a clinical risk group (adjusted OR 1.58 (1.54-1.63)), in the clinically extremely vulnerable (CEV) group (adjusted OR 1.84 (1.79-1.89)) or severely immunosuppressed (adjusted OR 2.05 (1.96-2.13)).
Protection against symptomatic disease in those aged 65 years and older peaked at 66.1% (16.6 to 86.3%) and 68.5% (65.7 to 71.2%) amongst those who received the ChAdOx1-S and BNT162b2 booster vaccines, respectively. Protection waned to 44.5% (22.4 to 60.2%) and 54.1% (50.5 to 57.5%) after 5-9 weeks. Protection against hospitalisation following Omicron infection peaked at 82.3% (64.2 to 91.3%) after receiving a ChAdOx1-S booster, as compared to 90.9% (88.7 to 92.7%) for those who received a BNT162b2 booster.
InterpretationDifferences in the population boosted with ChAdOx1-S in England renders direct comparison of vaccine effectiveness by manufacturer challenging. Nonetheless, this study supports the use of the ChAdOx1-S booster for protection against severe disease with COVID-19 in settings that have not yet offered booster doses and suggests that those who received ChAdOx1-S as a booster in England do not require re-vaccination ahead of others.
FundingUKHSA | epidemiology |
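As background to the estimates above: in a test-negative design, vaccine effectiveness is typically derived as VE = (1 - OR) x 100, with the odds ratio taken from a logistic regression of test result on vaccination status (plus confounders in practice). A bare-bones sketch with invented data:

```python
# Hypothetical sketch of the test-negative design VE calculation.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "test_positive": [1, 0, 1, 0, 0, 1, 0, 0, 1, 0],  # symptomatic, tested (toy)
    "boosted":       [0, 1, 0, 1, 1, 0, 1, 1, 1, 0],
})

fit = smf.logit("test_positive ~ boosted", data=df).fit(disp=0)
ve = (1 - np.exp(fit.params["boosted"])) * 100
print(f"VE ~ {ve:.0f}%")  # real analyses adjust for age, week, region, etc.
```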
10.1101/2022.04.29.22274494 | Forecast Intervals for Infectious Disease Models | Forecast intervals for infectious disease transmission and mortality have long been overconfident -- i.e., the advertised coverage probabilities of those intervals fell short of their subsequent performances. Further, there was no apparent relation between how good models claimed to be (as measured by their purported forecast uncertainties) and how good the models really were (as measured by their actual forecast errors). The main cause of this problem lies in the misapplication of textbook methods for uncertainty quantification. A solution lies in the creative use of predictive tail probabilities to obtain valid interval coverages. This approach is compatible with all probabilistic predictive models whose forecast error behavior does not change "too quickly" over time. | epidemiology |
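One simple instance of the idea sketched in this abstract is to calibrate interval endpoints empirically from the tails of past forecast errors rather than from textbook formulas; the construction below illustrates that general idea and is not the paper's exact method:

```python
# Hypothetical sketch: empirically calibrated forecast intervals from the
# quantiles of historical forecast errors (heavy-tailed toy errors).
import numpy as np

rng = np.random.default_rng(0)
past_errors = rng.standard_t(df=3, size=500) * 50   # observed - forecast (toy)

def calibrated_interval(point_forecast, errors, level=0.95):
    lo, hi = np.quantile(errors, [(1 - level) / 2, (1 + level) / 2])
    return point_forecast + lo, point_forecast + hi

print(calibrated_interval(1000.0, past_errors))
```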
10.1101/2022.04.29.22274485 | Human behaviour, NPI and mobility reduction effects on COVID-19 transmission in different countries of the world | BackgroundThe outbreak of coronavirus disease 2019 (COVID-19), which originated in Wuhan, China, has affected the lives of billions of people globally. Throughout 2020, the reproduction number of COVID-19 was widely used by decision-makers to explain their strategies to control the pandemic.
MethodsIn this work, we deduce and analyze both initial and effective reproduction numbers for 12 diverse world regions between February and December of 2020. We consider mobility reductions, mask wearing and compliance with masks, and mask efficacy values alongside other non-pharmaceutical interventions (NPIs) in each region to gain further insight into how each of these factored into each region's SARS-CoV-2 transmission dynamics.
ResultsWe quantify in each region the following reductions in the observed effective reproduction numbers of the pandemic: i) reduction due to decrease in mobility (as captured in Google mobility reports); ii) reduction due to mask wearing and mask compliance; iii) reduction due to other NPIs, over and above the ones identified in i) and ii).
ConclusionIn most cases, mobility reduction from nationwide lockdown measures helped stave off the initial wave in countries that took these types of measures. Beyond the first waves, mask mandates and compliance, together with social-distancing measures (which we refer to as other NPIs), have allowed some control of subsequent disease spread. The methodology we propose here is novel and can be applied to other respiratory diseases such as influenza or RSV. | epidemiology |
10.1101/2022.04.28.22274446 | Vaccine Stockpile Sharing For Selfish Objectives | The COVAX program aims to provide global equitable access to life-saving vaccines. However, vaccine protectionism by wealthy nations has limited progress towards vaccine sharing goals. For example, as of April 2022 only ~20% of the population in Africa has received at least one COVID-19 vaccine dose. Here we use a two-nation coupled epidemic model to evaluate optimal vaccine-sharing policies given a selfish objective, in which countries with vaccine stockpiles aim to minimize fatalities in their own populations. Despite the selfish objective, we find it is often optimal for a donor nation to share a significant fraction of its vaccine stockpile. Mechanistically, sharing a vaccine stockpile reduces the intensity of outbreaks in the recipient nation, in turn reducing travel-associated incidence in the donor nation. This effect is intensified as vaccination rates decrease and epidemic coupling increases. Despite acting selfishly, vaccine sharing by a donor nation significantly reduces transmission and fatalities in the recipient nation. Moreover, we find that there are hybrid sharing policies that have a negligible effect on fatalities in the donor nation compared to the optimal policy while significantly reducing fatalities in the recipient nation. Altogether, these findings provide a rationale for nations with extensive vaccine stockpiles to share with other nations. | epidemiology |
10.1101/2022.04.30.22274514 | Health-Related Quality of Life and Coping Strategies adopted by COVID-19 survivors: A nationwide cross-sectional study in Bangladesh | This study aims to investigate the health-related quality of life and coping strategies among COVID-19 survivors in Bangladesh.
MethodsThis is a cross-sectional study of 2198 adult COVID-19 survivors living in Bangladesh. Data were collected from previously diagnosed COVID-19 participants (confirmed by an RT-PCR test) via door-to-door interviews in the eight different divisions of Bangladesh. For data collection, the Bengali-translated Brief COPE inventory and the WHO Brief Quality of Life (WHOQOL-BREF) questionnaire were used. The data collection period was from June 2020 to March 2021.
ResultsMales (72.38%, n=1591) were more affected by COVID-19 than females (27.62%, n=607). Age showed significant correlations (p<0.005) with the physical, psychological and social relationship domains, whereas gender showed a significant correlation only with physical health (p<0.001). Marital status, occupation, living area, and co-morbidities showed significant correlations with all four domains of QoL (p<0.001). Education and affected family members showed significant correlations with the physical and social relationship domains (p<0.001). However, smoking habit showed significant correlations with both the social relationship and environment domains (p<0.001). Age and marital status showed a significant correlation with the avoidant coping strategy (p<0.001), whereas gender and co-morbidities showed significant correlations with problem-focused coping strategies (p<0.001). Educational qualification, occupation and living area showed significant correlations with all three coping strategies (p<0.001).
ConclusionSurvivors of COVID-19 showed mixed types of coping strategies; the predominant strategy was avoidant coping, followed by problem-focused coping, with emotion-focused coping the least prevalent. Marital status, occupation, living area and co-morbidities showed a greater effect on QoL in all participants. This study depicts the nationwide picture of health-related quality of life and coping strategies during and beyond the Delta wave of the pandemic. | epidemiology |
10.1101/2022.04.30.22274526 | Pre-morbid use of proton pump inhibitors has no effect on the risk of death or hospitalization in COVID-19 patients: a matched cohort study | BackgroundSeveral studies assessed the effect of pre-morbid exposure to proton pump inhibitors (PPIs) on disease course in adult COVID-19 patients with somewhat inconsistent results.
MethodsThis population-based matched cohort study included first COVID-19 episodes in adults diagnosed up to August 15, 2021 in Croatia. Considering the over-the-counter (OTC) availability of PPIs, patients were classified based on exposure to PPIs and the burden of PPI-requiring conditions as "non-users" (no issued prescriptions, no recorded treatment-requiring conditions between January 1, 2019 and COVID-19 diagnosis), "possible users" (no issued prescriptions, recorded treatment-requiring conditions; OTC use possible) and "users" (differing intensities of issued prescriptions over the 12 months prior to diagnosis, at least one within 3 months). Subsets were mutually exactly matched with respect to a range of pre-COVID-19 characteristics. The contrast between "users" and "possible users" was considered the most informative for the effect of PPIs that is separate from the effect of PPI-requiring conditions.
ResultsAmong 433609 COVID-19 patients, 332389 were PPI "non-users", 18170 were "possible users", and 55098 were "users". Users and possible users were matched 41195 to 17334 and 33272 to 16434 in the primary and sensitivity analyses. There was no relevant difference between "users" and "possible users" regarding COVID-19-related mortality [RR=0.93 (95% CI 0.85-1.02), RD=-0.34% (-0.73, 0.03) in the primary and RR=0.88 (0.78-0.98), RD=-0.45% (-0.80, -0.11) in the sensitivity analysis] or COVID-19-related hospitalizations [RR=1.04 (0.97-1.13), RD=0.29% (-0.16, 0.73) in the primary and RR=1.05 (0.97-1.15), RD=0.32% (-0.12, 0.75) in the sensitivity analysis].
ConclusionsPre-morbid exposure to PPIs does not affect the risk of death or hospitalization in adult COVID-19 patients. | gastroenterology |
10.1101/2022.04.29.22274511 | Clinicopathological and Imaging Features of Primary Biliary Cholangitis | Background and AimsPrimary biliary cholangitis (PBC) is a chronic inflammatory autoimmune disease of the biliary epithelial cells, causing slow progression of cholestasis and fibrosis. The aim of the study is to summarize the imaging features of PBC and examine the correlation between its clinicopathological and radiologic features, elucidating the specific clinicopathological and radiologic differences between male and female PBC patients, as well as between anti-mitochondrial antibody (AMA)-positive and AMA-negative PBC patients.
MethodsDemographic, laboratory, radiologic and survival data were collected retrospectively for patients diagnosed and treated for PBC at the Mount Sinai Health System between 2016 and 2020 with at least one ultrasound, CT, or MRI of the abdomen available for assessment. Biochemical and radiologic data were compared between male vs. female groups, and AMA-positive vs. AMA-negative groups.
ResultsA total of 273 patients diagnosed with PBC were included. Non-specific hepatic parenchymal disease, cirrhosis, cholelithiasis, splenomegaly, lymphadenopathy (LAD), and benign liver masses/cysts were the most common features on abdominal images. 24% of PBC patients had biliary tree abnormalities on MRI. LAD was reported in 38-47% of PBC patients. There was no significant difference in age, race, or ethnicity between the sexes of patients with PBC; however, imaging characteristics of portal hypertension were more frequently seen in men. AMA-negative PBC patients had distributions of age, race/ethnicity and survival similar to AMA-positive PBC patients.
ConclusionOur study summarized the clinicopathological and imaging features of PBC and its distinct subgroups. This may assist in the diagnosis of PBC and help avoid over-testing in real-world practice. | gastroenterology |
10.1101/2022.04.29.22274464 | Prevalence and Genotypic Characterization of Extended Spectrum Beta-Lactamase Uropathogens Isolated from Refugees with Urinary Tract Infections in Nakivale Refugee Settlement camp, Southwestern Uganda | BackgroundThe World Health Organization estimates that one in four individuals has had at least one urinary tract infection (UTI) episode requiring treatment with an antimicrobial agent by their teenage years. At Nakivale refugee camp, the overwhelming number of refugees, together with poor living conditions such as communal bathrooms and toilets and multiple sex partners, predisposes refugees to UTIs.
AimTo determine the prevalence of bacterial community-onset urinary tract infections among refugees in Nakivale refugee settlement and determine the antimicrobial susceptibility patterns of the isolated pathogens.
MethodsThis was a cross-sectional study that included 216 outpatients attending Nakivale Health Centre III between July and September 2020.
ResultsThe prevalence of UTI was 24.1% (52/216). The largest group of refugees, 86 (39.81%), were from DR Congo, followed by Somalia with 58 (26.85%). The commonest causative agent was Staphylococcus aureus, accounting for 22/52 (42.31%) of isolates, followed by Escherichia coli at 21/52 (40.38%). Multidrug-resistant isolates accounted for 71.15% (37/52) and mono-resistance for 26.92% (14/52). Out of the 52 bacterial isolates, 30 (58%) were extended-spectrum beta-lactamase (ESBL) organisms; of these, 21 (70.0%) were ESBL producers while 9 (30%) were non-ESBL producers. blaTEM and blaCTX-M were each detected at 62.5%, while blaSHV was detected at 37.5%.
ConclusionsThe prevalence of UTI among refugees in the Nakivale settlement is high, with Staphylococcus aureus and Escherichia coli the commonest causes of UTI. There is a high rate of multidrug resistance to common drugs used to treat UTI. The prevalence of ESBL-producing Enterobacteriaceae is high, and the common ESBL genes are blaTEM and blaCTX-M. | pathology |
10.1101/2022.04.29.22274482 | Early life attachment in term and preterm infants. | BackgroundPreterm birth is associated with atypical cognitive and socioemotional outcomes in childhood. Secure infant attachment protects against adverse outcomes, but could be modified by alterations in the early caregiving environment inherent to essential neonatal intensive care or co-morbidities of preterm birth. We aimed to test the hypothesis that preterm birth is associated with differences in infant attachment compared with infants born at term, and to investigate clinical, neurodevelopmental and socioeconomic variables that could contribute to variance in infant attachment.
Methods68 preterm and 68 term infants with mean (range) gestational age at birth 29.7 (22.1 - 32.9) and 39.6 (36.4 - 42.1) weeks, respectively, completed the Still-Face Paradigm (SFP) at nine months of corrected age. Attachment dimensions and categories were obtained from infant responses to the SFP using a published coding scheme, and an alternative principal component and clustering strategy. Neurodevelopment was assessed using the Vineland Adaptive Behaviour Scales, and socioeconomic status was operationalized as neighborhood deprivation.
ResultsPreterm and term infants did not differ in attachment dimensions (distress, fretfulness, attentiveness to caregivers, p-values > .07; principal components, p-values > .07), or the distribution of attachment categories (attachment styles, p-value = .79; attachment clusters, p-value > .78). In the whole sample, fretfulness correlated with socioeconomic deprivation (n = 136, rs = -0.23, p-value < .01), and attentiveness correlated with motor development (n = 120, rs = .24, p-value < .01).
ConclusionsThere were no differences in attachment between preterm and term infants at nine months of age, suggesting that caregiver-infant attachment relationships are resilient to the effects of prematurity on the developing infant. The results highlight links between socioeconomic deprivation and infant attachment, and suggest there is a relationship between infant attentiveness and motor function in infancy. | pediatrics |
10.1101/2022.04.30.22274531 | In schizophrenia, the effects of the IL-6/IL-23/Th17 axis on health-related quality of life and disabilities are partly mediated by generalized cognitive decline and the symptomatome. | Schizophrenia patients show increased disabilities and lower quality of life (DisQoL). Nevertheless, there are no data on whether, in schizophrenia, activation of the interleukin (IL)-6, IL-23, T helper (Th)-17 axis and lowered magnesium and calcium levels impact DisQoL scores. This study recruited 90 patients with schizophrenia (including 40 with deficit schizophrenia) and 40 healthy controls and assessed the World Health Organization QoL instrument-Abbreviated version and Sheehan Disability scale, Brief Assessment of Cognition in Schizophrenia (BACS), IL-6, IL-23, IL-17, IL-21, IL-22, tumor necrosis factor (TNF)-α, magnesium and calcium. Regression analyses showed that a large part of the first factor extracted from the physical, psychological, social and environmental HR-QoL and interference with school/work, social life, and home responsibilities was predicted by a generalized cognitive deterioration (G-CoDe) index (a latent vector extracted from BACS scores), and the first vector extracted from various symptom domains (the "symptomatome"), whereas the biomarkers had no effects. Partial Least Squares analysis showed that the IL-6/IL-23/Th17 axis and magnesium/calcium had highly significant total (indirect + direct) effects on HR-QoL/disabilities, which were mediated by the G-CoDe and the symptomatome (a first factor extracted from negative and positive symptoms). The IL-6/IL-23/Th17 axis explained 63.1% of the variance in a single latent trait extracted from G-CoDe, symptomatome, HR-QoL and disability data. The latter features are manifestations of a common core, namely the behavioral-cognitive-psycho-social worsening index, which is caused by the neuroimmunotoxic effects of the IL-6/IL-23/Th17 axis in subjects with lowered antioxidant defenses (magnesium and calcium), thereby producing damage to neuronal circuits underpinning deficit schizophrenia. | psychiatry and clinical psychology |
10.1101/2022.04.29.22274474 | Categorical and dimensional brain network-based models of trauma-related dissociative subtypes | BackgroundTrauma-related pathological dissociation is a multidimensional and disabling phenomenon that involves disruptions or discontinuities in psychological functioning. Despite its prevalence, personal and societal burden, dissociation remains underappreciated in clinical practice, and it lacks a synthesized neurobiological model that could place it in context with other common psychiatric symptoms. To identify a nuanced neurobiological model of pathological dissociation, we examined the functional connectivity of three core neurocognitive networks as related to the dimensional dissociation subtypes of depersonalization/derealization and partially-dissociated intrusions, and the diagnostic category of a complex dissociation disorder, dissociative identity disorder (DID).
MethodsParticipants were 91 adult women with and without a history of childhood trauma, current posttraumatic stress disorder (PTSD), and varied levels of pathological dissociation. Participants provided interview and self-report data about pathological dissociation, PTSD symptoms, and childhood maltreatment history, and completed a resting-state functional magnetic resonance imaging scan.
ResultsAfter controlling for age, childhood maltreatment and PTSD symptom severity, we found that pathological dissociation was associated with hyperconnectivity within central executive, default, and salience networks, and decreased connectivity of central executive and salience networks with other areas. Moreover, we isolated unique connectivity markers linked to depersonalization/derealization, to partially-dissociated intrusions, and to DID.
ConclusionsOur work suggests subtypes of pathological dissociation have robust, discernable, and unique functional connectivity signatures. The neural correlates of dissociation may serve as potential targets for treatment engagement to facilitate recovery from PTSD and pathological dissociation. These results underscore dissociation assessment as crucial in clinical and medical care settings. | psychiatry and clinical psychology |
10.1101/2022.04.28.22274086 | Quantifying the relationship between sub-population wastewater samples and community-wide SARS-CoV-2 seroprevalence | BackgroundWastewater-based epidemiology is a promising approach but robust epidemiological models to relate wastewater to community prevalence are lacking. Assessments of SARS-CoV-2 infection rates have relied primarily on convenience sampling, which does not provide reliable estimates of community prevalence because of inherent biases.
MethodsFrom August 2020 to February 2021, we conducted serial stratified randomized sampling to estimate the prevalence of anti-SARS-CoV-2 antibodies in 3,717 participants, and weekly sampling of community wastewater for SARS-CoV-2 concentrations in Jefferson County, KY. Using a Susceptible, Infected, Recovered (SIR)-type model, we obtained longitudinal estimates of prevalence and compared these with wastewater concentration, using regression analysis.
FindingsModel analysis revealed significant temporal differences in epidemic peaks; the average incidence rate based on serological sampling in some areas was up to 50% higher than health department rates based on convenience sampling. The model-estimated average prevalence rates correlated well with wastewater concentrations (correlation=0.63). In regression analysis, a weekly unit increase in wastewater concentration of SARS-CoV-2 corresponded to an average increase of between 1 and 1.3 cases of SARS-CoV-2 infection per 100K residents.
InterpretationPublicly available health department incidence rates vastly underestimate true community incidence and wastewater has a high potential to provide robust estimates of community spread of infection.
Research in contextEvidence before this studyAdministratively reported clinical case rates of coronavirus disease 2019 (COVID-19) infected individuals are biased due to a wide range of factors, from testing access to concerns about missing low- and non-symptomatic and self-tested individuals. Wastewater estimates offer an alternative to support community monitoring based on fecal shedding of the virus but are difficult to interpret when compared with the available public health data sets of infection rates. We examined all English literature until February 24, 2022, on Web of Science and PubMed with the terms ["seroprevalence" or "antibody"] AND ["COVID-19" or "SARS-CoV-2"] AND ["wastewater"]. We identified six studies. None of these studies considered randomized COVID-19 community anti-SARS-CoV-2 antibody testing paired with wastewater data.
Added value of this studyThe study demonstrates how results from serial stratified randomized serological sampling of the community can be used to build a longitudinal model that can interpolate and extrapolate community levels of infection beyond specific testing dates. Such a model correlates well with wastewater concentrations indicating its utility as a surrogate for infection prevalence. The testing data used in the study were collected before wide availability of COVID-19 vaccines and are therefore unique as they are unlikely to include a significant number of false positive results.
Implications of all the available evidenceThe study demonstrates that convenience sampling obtained data from health department reporting seriously underestimates community-wide prevalence of infection. In contrast, wastewater-based epidemiology may be a faster, cost-effective, and more robust method of estimating the prevalence of viral infections within specific urban areas. | public and global health |
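A minimal sketch of the regression step described above, relating weekly wastewater SARS-CoV-2 concentration to model-estimated prevalence. All data and the slope below are synthetic assumptions, not the study's estimates.

```python
# Hypothetical sketch: OLS regression of model-estimated prevalence on weekly
# wastewater SARS-CoV-2 concentration (all values simulated).
import numpy as np

rng = np.random.default_rng(0)

wastewater = rng.uniform(1, 10, size=26)                   # arbitrary units/week
prevalence = 1.2 * wastewater + rng.normal(0, 1.0, 26)     # assumed slope ~1.2

# Fit prevalence = a + b * wastewater; b plays the role of the reported
# "1-1.3 cases per 100K residents per unit increase in concentration".
b, a = np.polyfit(wastewater, prevalence, deg=1)
corr = np.corrcoef(wastewater, prevalence)[0, 1]
print(f"slope = {b:.2f} cases/100K per unit; correlation = {corr:.2f}")
```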
10.1101/2022.04.29.22274486 | How did the COVID-19 pandemic affect access to condoms, chlamydia and HIV testing, and cervical cancer screening at a population level in Britain? (Natsal-COVID) | ObjectivesTo investigate how differential access to key interventions to reduce sexually transmitted infections (STI), HIV, and their sequelae changed during the COVID-19 pandemic.
MethodsBritish participants (18-59y) completed a cross-sectional web survey one year (March to April 2021) after the initial lockdown in Britain. Quota-based sampling and weighting resulted in a quasi-representative population sample. We compared Natsal-COVID data with Natsal-3, a household-based probability sample cross-sectional survey (16-74y) conducted in 2010-12. Reported unmet need for condoms because of the pandemic and uptake of chlamydia testing/HIV testing/cervical cancer screening were analysed among sexually-experienced participants (18-44y) (n=2869, Natsal-COVID; n=8551, Natsal-3). Odds ratios adjusted for age (aOR) and other potential confounders (AOR) describe associations with demographic and behavioural factors.
ResultsIn 2021, 6.9% of women and 16.2% of men reported unmet need for condoms because of the pandemic. This was more likely among participants aged 18-24 years, of Black or Black British ethnicity, and those reporting same-sex sex (past five years) or one or more new relationships (past year). Chlamydia and HIV testing were more commonly reported by younger participants, those reporting condomless sex with new sexual partners, and men reporting same-sex partners; a very similar distribution to 10 years previously (Natsal-3). However, there were differences during the pandemic, including stronger associations with chlamydia testing for men reporting same-sex partners; with HIV testing for women reporting new sexual partners; and with cervical screening among smokers.
ConclusionsOur study suggests differential access to key primary and secondary STI/HIV prevention interventions continued during the first year of the COVID-19 pandemic. However, the available evidence does not suggest substantial changes in inequalities since 2010-12. While the pandemic might not have exacerbated inequalities in access to primary and secondary prevention, it is clear that large inequalities persisted, typically among those at greatest STI/HIV risk.
Key Messages:
- Many MSM, people of Black ethnicity and young people (i.e. groups most impacted by STIs) reported unmet need for condoms because of the pandemic.
- We compared inequalities in access to key interventions using Natsal-COVID (2021) and Natsal-3 (2010-12).
- During the pandemic (Natsal-COVID), there were stronger associations with chlamydia testing for MSM and with HIV testing for women reporting new sexual partners.
- There were stronger associations with cervical screening among smokers during the pandemic compared to 2010-12 (Natsal-3).
- However, we did not find strong evidence that vulnerable groups were at additional risk during the pandemic when compared to 2010-12. | sexual and reproductive health
10.1101/2022.04.29.22274463 | Genetic Risk Factors for Postoperative Atrial Fibrillation - a Nationwide Genome-Wide Association Study (GWAS) | BackgroundAtrial fibrillation (AF) is a major cause of morbidity with a high prevalence among the elderly and has an established genetic predisposition. Surgery is a well-known risk factor for AF; however, it is currently unknown how much common genetic variants influence postoperative risk. The purpose of this study was to identify Single Nucleotide Polymorphisms associated with postoperative AF.
MethodsThe UK Biobank was utilized to conduct a Genome-Wide Association Study (GWAS) to identify variants associated with AF after surgery. An initial discovery GWAS was performed in patients that had undergone surgery with subsequent replication in a unique non-surgical cohort. In the surgical cohort, cases were defined as newly diagnosed AF within 30 days after surgery. The threshold for significance was set at 5 x 10^-8.
ResultsAfter quality control, 144,196 surgical patients with 254,068 SNPs were left for analysis. Two variants (rs17042171 (p = 4.86 x 10^-15) and rs17042081 (p = 7.12 x 10^-15)) near the PITX2-gene reached statistical significance. These variants were replicated in the non-surgical cohort (p = 1.39 x 10^-101 and p = 1.27 x 10^-93, respectively). Several other loci were significantly associated with AF in the non-surgical cohort.
ConclusionIn this GWAS analysis of a large national biobank, we identified 2 variants that were significantly associated with postoperative AF. These variants were subsequently replicated in a unique non-surgical cohort. These findings bring new insight into the genetics of postoperative AF and may help identify at-risk patients and guide management. | surgery
10.1101/2022.04.29.22274396 | Immune responses following 3rd and 4th doses of heterologous and homologous COVID-19 vaccines in kidney transplant recipients | BackgroundSolid organ transplant recipients have attenuated immune responses to SARS-CoV-2 vaccines. In this study, we report on immune responses to 3rd- (V3) and 4th- (V4) doses of heterologous and homologous vaccines in a kidney transplant population.
Methods724 kidney transplant recipients were prospectively screened for serological responses following 3 primary doses of a SARS-CoV-2 vaccine. 322 patients were sampled post-V4 for anti-spike (anti-S), with 69 undergoing assessment of SARS-CoV-2 T-cell responses. All vaccine doses were received post-transplant, and only mRNA vaccines were used for V3 and V4 dosing. All participants had serological testing performed post-V2 and at least once prior to their 1st dose of vaccine.
Results586/724 (80.9%) patients were infection-naive post-V3; 141/586 (24.1%) remained seronegative at 31 (21-51) days post-V3. Timing of vaccination in relation to transplantation, OR: 0.28 (0.15-0.54), p=0.0001; immunosuppression burden, OR: 0.22 (0.13-0.37), p<0.0001, and a diagnosis of diabetes, OR: 0.49 (0.32-0.75), p=0.001, remained independent risk factors for non-seroconversion. Seropositive patients post-V3 had greater anti-S if primed with BNT162b2 compared with ChAdOx1, p=0.001.
Post-V4, 45/239 (18.8%) infection-naive patients remained seronegative. De novo seroconversion post-V4 occurred in 15/60 (25.0%) patients who were seronegative post-V3. There was no difference in anti-S post-V4 by vaccine combination, p=0.50. Anti-S titres post-V4 were sequentially greater in those seroconverting post-V2 compared with post-V3, and post-V3 compared with post-V4, at 1561 (567-5211), 379 (101-851) and 19 (9.7-48) BAU/ml, respectively.
T-cell responses were poor, with only 11/54 (20.4%) infection-naive patients having detectable T-cell responses post-V4, with no difference seen by vaccine type.
ConclusionA significant proportion of transplant recipients remain seronegative following 3- and 4- doses of SARS-CoV-2 vaccines, with poor T-cell responses, and are likely to have inadequate protection against infection. | transplantation |
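A minimal sketch of the multivariable logistic regression behind reported odds ratios for non-seroconversion; predictors and outcomes are simulated stand-ins for the study variables (vaccination timing relative to transplant, immunosuppression burden, diabetes).

```python
# Hypothetical sketch: logistic regression yielding ORs with 95% CIs,
# as in "OR: 0.28 (0.15-0.54)". Data are simulated, not the study dataset.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 586
df = pd.DataFrame({
    "nonseroconversion": rng.integers(0, 2, n),   # 1 = remained seronegative
    "early_post_tx_vaccination": rng.integers(0, 2, n),
    "high_immunosuppression": rng.integers(0, 2, n),
    "diabetes": rng.integers(0, 2, n),
})
fit = smf.logit(
    "nonseroconversion ~ early_post_tx_vaccination"
    " + high_immunosuppression + diabetes", data=df
).fit(disp=False)

or_table = np.exp(pd.concat([fit.params, fit.conf_int()], axis=1))
or_table.columns = ["OR", "2.5%", "97.5%"]
print(or_table.round(2))
```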
10.1101/2022.05.01.22274521 | Predicting longitudinal brain atrophy in Parkinson's disease using a Susceptible-Infected-Removed agent-based model | Parkinson's disease (PD) is a progressive neurodegenerative disorder characterized by accumulation of abnormal isoforms of alpha-synuclein. Alpha-synuclein is proposed to act as a prion in PD: in its misfolded pathologic state it favours the misfolding of normal alpha-synuclein molecules, spreads trans-neuronally, and causes neuronal or synaptic damage as it accumulates. This theory remains controversial. We have previously developed a Susceptible-Infected-Removed (SIR) computational model that simulates the templating, propagation and toxicity of alpha-synuclein molecules in the brain. Here we test this model with longitudinal MRI collected over four years from the Parkinson's Progression Markers Initiative (1068 T1 MRI scans, 790 PD, 278 matched controls). We find that brain deformation progresses in subcortical and cortical regions. The SIR model, using structural connectivity from diffusion MRI, recapitulates the spatiotemporal distribution of brain atrophy observed in PD. We show that connectome topology and geometry significantly contribute to model fit. We also show that the spatial expression of two genes implicated in alpha-synuclein synthesis and clearance, SNCA and GBA, also influences the atrophy pattern. We conclude that the progression of atrophy in PD is consistent with the prion-like hypothesis and that the SIR model is a promising tool to investigate multifactorial neurodegenerative diseases over time. | neurology
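A conceptual sketch of an SIR-style spreading process on a structural connectome, in the spirit of the agent-based model above; the connectivity matrix, seed region, and rate constants are hypothetical, and toxic accumulation (the Removed pool) is taken as a crude atrophy proxy.

```python
# Hypothetical sketch: SIR spread of misfolded alpha-synuclein over a
# (random) weighted connectome; the study uses diffusion-MRI connectivity.
import numpy as np

rng = np.random.default_rng(2)
n_regions = 42
W = rng.uniform(0, 1, (n_regions, n_regions))   # stand-in connectivity weights
W = (W + W.T) / 2
np.fill_diagonal(W, 0)
W /= W.sum(axis=1, keepdims=True)               # row-normalise outflow

S = np.ones(n_regions)     # normal (susceptible) protein per region
I = np.zeros(n_regions)    # misfolded (infected) protein
I[0] = 0.01                # seed misfolding in one region
R = np.zeros(n_regions)    # removed/toxic accumulation, atrophy proxy
beta, gamma, dt = 0.5, 0.05, 0.1

for _ in range(2000):
    arriving = W.T @ I                  # misfolded protein arriving via edges
    new_inf = beta * S * arriving * dt  # templated misfolding of local protein
    cleared = gamma * I * dt            # clearance into the removed pool
    S, I, R = S - new_inf, I + new_inf - cleared, R + cleared

print("simulated atrophy proxy, first 5 regions:", R[:5].round(3))
```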
10.1101/2022.04.30.22274520 | Death by Round Numbers and Sharp Thresholds: How to Avoid Dangerous AI EHR Recommendations | Recommendations for clinical practice consider consistency of application and ease of implementation, resulting in treatment decisions that use round numbers and sharp thresholds even though these choices produce statistically sub-optimal decisions. To characterize the impact of these choices, we examine four datasets and the biases underlying patient mortality risk. We document two types of suboptimalities: (1) discontinuities in which treatment decisions produce step-function changes in risk near clinically-important round-number cutoffs, and (2) counter-causal paradoxes in which anti-correlation between risk factors and aggressive treatment produces risk curves that contradict underlying causal risk. We also show that outcomes have improved over decades of refinement of clinical practice, reducing but not eliminating the strength of these biases. Finally, we provide recommendations for clinical practice and data analysis: interpretable "glass-box" models can transform the challenges of statistical confounding into opportunities to improve medical practice. | health informatics |
10.1101/2022.04.29.22274461 | A comprehensive accuracy assessment of Samsung smartwatch heart rate and heart rate variability | BackgroundPhotoplethysmography (PPG) is a low-cost and easy-to-implement method to measure vital signs, including heart rate (HR) and heart rate variability (HRV). The method is widely used in various wearable devices. For example, Samsung smartwatches are PPG-based open-source wristbands used in remote well-being monitoring and fitness applications. However, PPG is highly susceptible to motion artifacts and environmental noise. A validation study is required to investigate the accuracy of PPG-based wearable devices in free-living conditions.
ObjectiveWe evaluate the accuracy of PPG signals - collected by the Samsung Gear Sport smartwatch in free-living conditions - in terms of HR and time-domain and frequency-domain HRV parameters against a medical-grade chest electrocardiogram (ECG) monitor.
MethodsWe conducted 24-hour monitoring using a Samsung Gear Sport smartwatch and a Shimmer3 ECG device. The monitoring included 28 participants (14 male and 14 female) as they engaged in their daily routines. We evaluated HR and HRV parameters during sleep and awake times. The parameters extracted from the smartwatch were compared against the ECG reference. For the comparison, we employed the Pearson correlation coefficient, Bland-Altman plot, and linear regression methods.
ResultsWe found a significantly high positive correlation between the smartwatch's and Shimmer ECG's HR, time-domain HRV, LF, and HF, and a significant moderate positive correlation between the smartwatch's and Shimmer ECG's LF/HF during sleep time. The mean biases of HR, time-domain HRV, and LF/HF were low, while the biases of LF and HF were moderate during sleep. The regression analysis showed low error variances of HR, AVNN, and pNN50, moderate error variances of SDNN, RMSSD, LF, and HF, and high error variances of LF/HF during sleep. During the awake time, there was a significantly high positive correlation of AVNN and a moderate positive correlation of HR, while the other parameters indicated significantly low positive correlations. RMSSD and SDNN showed low mean biases, and the other parameters had moderate mean biases. In addition, AVNN had moderate error variance while the other parameters indicated high error variances.
ConclusionThe Samsung smartwatch provides acceptable HR, time-domain HRV, LF, and HF parameters during sleep time. In contrast, during the awake time, AVNN and HR show satisfactory accuracy, and the other HRV parameters have high errors. | health informatics |
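A minimal sketch of the agreement methods named in the Methods (Pearson correlation plus Bland-Altman bias and 95% limits of agreement), applied to synthetic smartwatch/ECG heart-rate pairs; the bias and noise levels are assumptions.

```python
# Hypothetical sketch: Pearson r and Bland-Altman statistics for paired
# smartwatch-PPG vs reference-ECG heart rates (simulated values).
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
hr_ecg = rng.normal(70, 10, 200)                # reference HR, bpm
hr_ppg = hr_ecg + rng.normal(0.5, 2.0, 200)     # assumed small bias + noise

r, p = stats.pearsonr(hr_ecg, hr_ppg)
diff = hr_ppg - hr_ecg
bias = diff.mean()
half_width = 1.96 * diff.std(ddof=1)            # 95% limits of agreement
print(f"Pearson r = {r:.2f} (p = {p:.1e}); bias = {bias:.2f} bpm; "
      f"LoA = [{bias - half_width:.2f}, {bias + half_width:.2f}] bpm")
```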
10.1101/2022.04.27.22274383 | An intermediate step in bridging the gap between evidence and practice: developing and applying a methodology for "general good practices" | The gap between evidence and clinical practice has been a focus of research for decades. Although successful implementation means the new knowledge must work in particular environments, it does not mean that the entire process should be executed exclusively by individual institutions. This is the point where we assumed that an intermediate step, the "general good practice", could help to ensure that translation is done in a more professional way.
The development of the general good practice methodology was based on our infinitE model, which organized the factors of successful translation into an evidence-editing-embedding-effect on practice framework, using tools from the disciplines of Evidence-Based Medicine, Quality Improvement and Change Management.
The methodology organised the editing and embedding part of the development into a process involving three full-day sessions carried out with different health professionals, experts and moderators. After pilot testing, it was finalized and applied to other topics as well.
The methodology, presented in detail in this paper, is centred on flow charts, process analysis, failure mode identification and Kotter's 8-step model. Besides the pilot topic of the institutional process of resuscitation, the methodology has also proved applicable to more than ten other topics, meaning that at least all the core elements of the proposed bundle of general good practice have been produced in the development process.
Compared to guidelines, general good practices demonstrate the evidence in operation, helping to develop workflows, responsibilities, documentation, training, etc., and can also be a starting point for the digitalisation of care processes.
The next step is to examine how healthcare institutions can build on these in their own editing and embedding activities, and what the results will be. Further studies could explore the applicability of the development methodology in different healthcare systems or at different levels of maturity in terms of quality. | health systems and quality improvement |
10.1101/2022.04.29.22274481 | Healthcare workers' mental health and wellbeing during the COVID-19 pandemic: Longitudinal analysis of interview and social media data | The COVID-19 pandemic has shed light on the fractures of healthcare systems around the world, particularly in relation to the healthcare workforce. Frontline staff, in particular, have been exposed to unprecedented strain, and delivering care during the pandemic has impacted their safety, mental health and wellbeing. The aim of this paper was to explore the experiences of HCWs delivering care in the UK during the COVID-19 pandemic to understand their wellbeing needs, experiences and strategies used to maintain wellbeing (at individual and organizational levels). We analysed 94 telephone interviews with HCWs and 2000 tweets about HCWs' mental health posted during the first year of the COVID-19 pandemic. Results were grouped under six themes: redeployment; wellbeing support and coping strategies; mental health effects; organisational support; social network and public support. These findings demonstrate a need for open conversations, where staff's wellbeing needs and strategies can be shared and encouraged, rather than implementing solely top-down psychological interventions. At the macro level, findings also highlighted the impact on HCWs' wellbeing of public and government support, as well as the need for ensuring protection through PPE, testing, and/or vaccines for frontline workers. | health systems and quality improvement
10.1101/2022.04.29.22274465 | A methodology to generate epidemic scenarios for emerging infectious diseases based on the use of key calendar events | This work presents a methodology to recreate the observed dynamics of emerging infectious diseases and to generate short-term forecasts for their evolution based on superspreading events occurring on key calendar dates. The method is illustrated by the COVID-19 pandemic dynamics in Mexico and Peru up to January 31, 2022. We also produce scenarios obtained through the estimation of a time-dependent contact rate, with the main assumption that the dynamic of the disease is determined by the mobility and social activity of the population during holidays and other important calendar dates. First, historical changes in the effective contact rate on predetermined dates are estimated. Then, this information is used to forecast scenarios under the assumption that the trends of the effective contact rate observed in the past will be similar on the same but future key calendar dates. All other conditions are assumed to remain constant in the time scale of the projections. One of the main features of the methodology is that it avoids the necessity of fixing values of the dynamic parameters for the whole prediction period. Results show that considering the key dates as reference information is useful to recreate the different outbreaks, slow or fast-growing, that an epidemic can present and, in most cases, make good short-term predictions. | infectious diseases |
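A minimal sketch of the paper's central device, a piecewise-constant contact rate keyed to calendar dates that drives an SIR model; the dates, rates, and population values below are hypothetical.

```python
# Hypothetical sketch: SIR dynamics with beta(t) stepping to pre-estimated
# values on key calendar dates (e.g. holidays), all else held constant.
import numpy as np

def beta_of_t(day, schedule, baseline=0.25):
    """Piecewise-constant contact rate keyed to day-of-year (assumed values)."""
    for start, end, value in schedule:
        if start <= day < end:
            return value
    return baseline

holiday_schedule = [(350, 366, 0.40), (100, 110, 0.35)]  # (start, end, beta)

N = 1_000_000
S, I, R = N - 100.0, 100.0, 0.0
gamma, dt = 1 / 7, 1.0
history = []
for day in range(400):
    b = beta_of_t(day % 366, holiday_schedule)
    new_inf = b * S * I / N * dt
    new_rec = gamma * I * dt
    S, I, R = S - new_inf, I + new_inf - new_rec, R + new_rec
    history.append(I)

print(f"peak infecteds: {max(history):,.0f} on day {int(np.argmax(history))}")
```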
10.1101/2022.04.29.22274477 | Omicron sub-lineages BA.4/BA.5 escape BA.1 infection elicited neutralizing immunity | The SARS-CoV-2 Omicron (B.1.1.529) variant first emerged as the BA.1 sub-lineage, with extensive escape from neutralizing immunity elicited by previous infection with other variants, vaccines, or combinations of both1,2. Two new sub-lineages, BA.4 and BA.5, are now emerging in South Africa with changes relative to BA.1, including L452R and F486V mutations in the spike receptor binding domain. We isolated live BA.4 and BA.5 viruses and tested them against neutralizing immunity elicited by BA.1 infection in participants who were Omicron/BA.1 infected but unvaccinated (n=24) and participants vaccinated with Pfizer BNT162b2 or Johnson and Johnson Ad26.CoV.2S with breakthrough Omicron/BA.1 infection (n=15). In unvaccinated individuals, FRNT50, the reciprocal of the dilution achieving 50% neutralization, declined from 275 for BA.1 to 36 for BA.4 and 37 for BA.5, 7.6- and 7.5-fold drops, respectively. In vaccinated BA.1 breakthroughs, FRNT50 declined from 507 for BA.1 to 158 for BA.4 (3.2-fold) and 198 for BA.5 (2.6-fold). Absolute BA.4 and BA.5 neutralization levels were about 5-fold higher in this group versus unvaccinated BA.1 infected participants. The observed escape of BA.4 and BA.5 from BA.1 elicited immunity is more moderate than that of BA.1 against previous immunity1,3. However, the low absolute neutralization levels for BA.4 and BA.5, particularly in the unvaccinated group, are unlikely to protect well against symptomatic infection4. This may indicate that, based on neutralization escape, BA.4 and BA.5 have the potential to result in a new infection wave. | infectious diseases
10.1101/2022.04.29.22274361 | A machine learning based predictive model for the diagnosis of sepsis | The early recognition and treatment of sepsis is essential to increase the probability of survival of the patient. Sepsis is a complex and heterogeneous syndrome influenced by the site of infection, causative microorganisms, acute organ dysfunctions and co-morbidities. Early diagnosis is therefore a challenge in which complex physiologic, metabolic and biochemical markers and clinical signs must be evaluated simultaneously for a reliable and early identification of sepsis. In this paper, a list of relevant variables involved in sepsis diagnosis is provided. Furthermore, to help answer the question of whether a patient is suffering from sepsis when entering the emergency room, a model based on the machine learning gradient boosting algorithm is proposed. Using three histones (H2B, H3, H4) and activated protein C together with other variables, a gradient boosting classifier was trained and evaluated with cross-validation. Results show that the model can achieve up to a 97% mean per-class accuracy. | intensive care and critical care medicine
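A minimal sketch of the modelling approach described above: a gradient boosting classifier evaluated with cross-validation on simulated marker data. Balanced accuracy is used as a stand-in for the reported mean per-class accuracy; the feature names follow the abstract, the values do not.

```python
# Hypothetical sketch: gradient boosting sepsis classifier with 5-fold CV
# on simulated markers (H2B, H3, H4, activated protein C).
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
n = 500
X = rng.normal(0.0, 1.0, size=(n, 4))            # columns: H2B, H3, H4, APC
y = (X.sum(axis=1) + rng.normal(0, 1, n) > 0).astype(int)  # synthetic label
X[y == 1] += 0.8   # loosely emulate marker elevation in sepsis cases

clf = GradientBoostingClassifier(n_estimators=200, max_depth=3, random_state=0)
scores = cross_val_score(clf, X, y, cv=5, scoring="balanced_accuracy")
print(f"mean per-class (balanced) accuracy: {scores.mean():.2f} "
      f"+/- {scores.std():.2f}")
```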
10.1101/2022.04.28.22274443 | Evidence for Aerosol Transfer of SARS-CoV2-specific Humoral Immunity | Despite the obvious knowledge that infectious particles can be shared through respiration, whether other constituents of the nasal/oral fluids can be passed between hosts has surprisingly never even been postulated, let alone investigated. The circumstances of the present pandemic facilitated a unique opportunity to fully examine this provocative idea. The data we show provides evidence for a new mechanism by which herd immunity may be manifested, the aerosol transfer of antibodies between immune and non-immune hosts. | allergy and immunology |
10.1101/2022.04.29.22274460 | Association between Intraoperative End-Tidal Carbon Dioxide and Postoperative Organ Dysfunction in Major Abdominal Surgery: A Retrospective Cohort Study | BackgroundData on the effects of intraoperative end-tidal carbon dioxide (EtCO2) levels on postoperative organ dysfunction are limited. Thus, this study was designed to investigate the relationship between the intraoperative EtCO2 level and postoperative organ dysfunction in patients who underwent major abdominal surgery under general anesthesia.
MethodsWe conducted a retrospective cohort study involving patients who underwent major abdominal surgery under general anesthesia at Kyoto University Hospital. We classified those with a mean EtCO2 of less than 35 mmHg as low EtCO2. The time effect was determined as the minutes when the EtCO2 value was below 35 mmHg, whereas the cumulative effect was evaluated by measuring the area below the 35-mmHg threshold. The outcome was postoperative organ dysfunction, defined as a composite of at least one organ dysfunction among acute renal injury, circulatory dysfunction, respiratory dysfunction, coagulation dysfunction, and liver dysfunction within 7 days after surgery.
ResultsOf the 4,171 patients, 1,195 (28%) had low EtCO2, and 1,428 (34%) had postoperative organ dysfunction. An association was found between low EtCO2 and increased postoperative organ dysfunction (adjusted risk ratio, 1.11; 95% confidence interval [CI], 1.03-1.20; p = 0.006). Additionally, long-term exposure to EtCO2 values of less than 35 mmHg (≥224 min) was associated with postoperative organ dysfunction (adjusted risk ratio, 1.18; 95% CI, 1.06-1.32; p = 0.003), as was greater low-EtCO2 severity (area under the threshold) (adjusted risk ratio, 1.13; 95% CI, 1.02-1.26; p = 0.018).
ConclusionsIntraoperative low EtCO2 of below 35 mmHg was associated with increased postoperative organ dysfunction. | anesthesia |
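A short sketch of the two exposure metrics defined in the Methods, minutes below the 35 mmHg threshold (the time effect) and the area under the threshold (the cumulative effect), computed from a synthetic minute-by-minute EtCO2 series.

```python
# Hypothetical sketch: intraoperative EtCO2 exposure metrics (simulated case).
import numpy as np

rng = np.random.default_rng(5)
etco2 = rng.normal(36, 3, size=300)   # 300-minute case, mmHg (simulated)
threshold = 35.0

minutes_below = int((etco2 < threshold).sum())                 # time effect
area_under = float(np.clip(threshold - etco2, 0, None).sum())  # mmHg*min

print(f"time below {threshold:.0f} mmHg: {minutes_below} min; "
      f"area under threshold: {area_under:.0f} mmHg*min; "
      f"long exposure (>=224 min): {minutes_below >= 224}")
```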
10.1101/2022.04.29.22274489 | PREVALENCE OF PRECONCEPTION CARE PROVISION IN KISUMU COUNTY-KENYA | Background Preconception care (PCC) is the provision of biomedical, behavioral and social health interventions to women and couples before conception occurs. PCC is valuable and key in preventing and controlling non-communicable diseases. Comprehensive preconception care provision is needed in Kisumu County in order to promote healthy reproduction and improve maternal and neonatal health indicators in this and similar settings. Unfortunately, the current rate of provision of this service has not been documented in Kenya, and more specifically in Kisumu County.
MethodsIn a cross-sectional study, data on the provision of various services as per the recommended package for preconception care were collected in health facilities using a checklist. The targeted facilities (n=28) were selected using multistage sampling. The means for all services in the package were determined. The significance of differences in the means was determined by one-sample T-test at P ≤ 0.05.
ResultsThe level of implementation of PCC was low, at 39%. It was lower in primary-level facilities (KEPH level 2 and 3) at 34% and higher in referral facilities (level 4 and above) at 45%. The service with the highest implementation level was HIV prevention and management (84%), followed by sexually transmitted diseases (80%) and vaccination services (75%). The service with the lowest level of implementation was environmental risk exposure reduction, at 13% in level 2 and 3 facilities, followed by management of mental health disorders.
ConclusionProvision of PCC was relatively low (39%) and fragmented, differing across service levels and care packages. | public and global health
10.1101/2022.04.28.22274427 | Effectiveness of the air-filled technique to reduce the dead space in syringes and needles during ChAdox1-n CoV vaccine administration | In the current study, we calculated the vaccine volume and the dead-space volume in syringes and needles during ChAdox1-n CoV vaccine administration using the air-filled technique, which reduces dead space so that up to 12 doses can be drawn per vial. The hypothetical situation uses a vial of the same size as the ChAdox1-n CoV vial. We used distilled water (6.5 ml) to fill the same volume as five vials of ChAdox1-n CoV. When 0.48 ml of distilled water is drawn according to the graduations on the side of the barrel, an additional 0.10 ml of air can be used to recover the dead-space fluid in the syringe and needle, yielding 60 doses of 0.5 ml each on average (see the worked arithmetic below). ChAdox1-n CoV was administered using a 1-mL syringe and 25G needle to divide each vial into 12 doses with this air-filled technique. The number of vaccine recipients increases by 20%, saving the budget otherwise needed for low dead-space (LDS) syringes.
Graphical Abstract
[Figure omitted: graphical abstract] | health policy
10.1101/2022.04.29.22274367 | Implementation strategies for telemental health: a systematic review | BackgroundThe COVID-19 pandemic resulted in a rapid shift from traditional face-to-face care provision towards delivering mental health care remotely through telecommunications, often referred to as telemental health care. However, the manner and extent of telemental health implementation have varied considerably across settings and areas, and substantial barriers are encountered. There is, therefore, now a need to identify what works best for service users and staff and establish the key mechanisms for efficient integration into routine care.
ObjectiveWe aimed to identify investigations of pre-planned strategies intended to achieve or improve effective and sustained implementation of telemental health approaches, and to evaluate how different strategies influence implementation outcomes.
MethodsA systematic review was conducted, with five databases searched for relevant literature using any methodological approach, published between January 2010 and July 2021. Studies were eligible for inclusion if they took place in secondary or tertiary mental health services and focused on pre-planned strategies for achieving or improving delivery of mental health care through remote communication between mental health professionals or between mental health professionals and service users, family members, unpaid carers, or peer supporters. All included studies were assessed for risk of bias. Data were synthesised using the Expert Recommendations for Implementing Change (ERIC) compilation of implementation strategies and the taxonomy of implementation outcomes.
ResultsA total of 14 studies were identified which met the inclusion criteria. A variety of implementation strategies were identified, the most commonly reported being "Train and educate stakeholders". All studies reported using a combination of several implementation strategies.
ConclusionsUsing a combination of implementation strategies appears to be a helpful method of supporting the implementation of telemental health. Further research is needed to test the impact of specific implementation strategies on implementation outcomes. | psychiatry and clinical psychology |
10.1101/2022.04.29.22274376 | High-dimensional deconstruction of pancreatic ductal adenocarcinoma identifies tumor microenvironmental communities associated with survival | Bulk and single-cell analyses of the pancreatic ductal adenocarcinoma (PDAC) tumor microenvironment (TME) have revealed a largely immunosuppressive milieu. Thus far, efforts to utilize insights from TME features to facilitate more effective therapeutics have largely failed. Here, we performed single-cell RNA sequencing (scRNA-seq) on a cohort of treatment-naive PDAC time-of-diagnosis endoscopic ultrasound-guided fine needle biopsy samples (n=22) and surgical samples (n=6), integrated with 3 public datasets (n=49), resulting in ~140,000 individual cells from 77 patients. Based on expression markers assessed by Seurat v3 and differentiation status assessed by CytoTRACE, we divided the resulting tumor cellular clusters into 5 molecular subtypes based on expression of previously reported marker genes: Basal, Mixed Basal/Classical, Classical Low, Classical High, and ADEX. We then queried these 5 tumor cell profiles, along with 15 scRNA-seq-derived tumor microenvironmental cellular profiles, in 391 bulk expression samples from 4 published datasets of localized PDAC with associated clinical metadata using CIBERSORTx. Through unsupervised clustering analysis of these 20 cell state fractions representing tumor, leukocyte and stromal cells, we identified 7 unique clustering patterns representing combinations of tumor cellular and microenvironmental cell states present in PDAC tumors. We termed these cell state patterns communities, and found them to correlate with overall survival, tumor ecotypes, and tumor cellular differentiation status. The community associated with worst overall survival contained basal tumor cells, exhausted CD4 and CD8 T cells, and was enriched for fibroblasts. In contrast, the highest overall survival was associated with a community high in immune cell enrichment. The differentiation state of tumor cells (assessed by CytoTRACE) was also correlated with survival in a dose-dependent fashion. Further, we identified a subset of PDAC samples that were significantly enriched for CD8 T and plasma cells that achieved a 2-year overall survival rate of 71%, suggesting we can identify PDAC patients with significantly improved prognoses and, potentially, higher sensitivity to immunotherapy.
In summary, we identified novel tumor microenvironmental communities from high-dimensional analysis of PDAC RNA sequencing data that reveal new connections between tumor microenvironmental composition and patient survival that could lead to better upfront risk stratification and more personalized clinical decision-making. | oncology |
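A conceptual sketch of the community-finding step: unsupervised clustering of per-sample cell-state fractions (20 states into 7 clusters, as in the abstract). The fractions below are random stand-ins for CIBERSORTx output.

```python
# Hypothetical sketch: k-means clustering of cell-state fraction profiles
# into candidate tumor microenvironmental "communities".
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(6)
n_samples, n_states = 391, 20
fractions = rng.dirichlet(np.ones(n_states), size=n_samples)  # rows sum to 1

km = KMeans(n_clusters=7, n_init=10, random_state=0).fit(fractions)
print("community sizes:", np.bincount(km.labels_))
# Downstream (not shown): relate community labels to overall survival,
# e.g. via a log-rank test or Cox regression.
```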
10.1101/2022.04.29.22274313 | Independent Replication and Drug-specificity of an Antidepressant Response Polygenic Risk Score | Here we examine associations of a recently published polygenic risk score of antidepressant response (PRS-AR) with antidepressant treatment outcomes (remission and depression score change) in an independent clinical trial. We not only replicate the PRS-AR for escitalopram, but also find antidepressant interaction effects, suggesting drug-specificity of PRS-AR. We therefore also tested the utility of this PRS-AR to stratify between antidepressants, and demonstrate a 14% increase in remission rate (from 43.6% to 49.7%), relative to the randomized remission rate. | genetic and genomic medicine
10.1101/2022.04.29.22274379 | Genetically determining individualized clinical reference ranges for the biomarker tryptase can limit unnecessary procedures and unmask myeloid neoplasms | Serum tryptase is a biomarker used to aid in the identification of certain myeloid neoplasms, most notably systemic mastocytosis, where baseline (BST) levels >20 ng/mL are a minor criterion for diagnosis. Whereas clonal myeloid neoplasms are rare, the common cause for elevated BST is the genetic trait hereditary alpha-tryptasemia (HT) caused by increased germline TPSAB1 copy number. To date, the precise structural variation and mechanism(s) underlying elevated BST in HT, and the general clinical utility of tryptase genotyping, remain undefined. Through cloning, long-read sequencing, and assembling of the human tryptase locus from an individual with HT, and validating our findings in vitro and in silico, we demonstrate that BST elevations arise from over-expression of replicated TPSAB1 loci encoding wild-type alpha-tryptase due to co-inheritance of a linked over-active promoter element. Modeling BST levels based upon TPSAB1 replication number, we generate new individualized clinical reference values for the upper limit of normal. Using this personalized laboratory medicine approach, we demonstrate the clinical utility of tryptase genotyping, finding that in the absence of HT, BST levels >11.4 ng/mL frequently identify indolent clonal mast cell disease. Moreover, substantial BST elevations (e.g., >100 ng/mL), which would ordinarily prompt bone marrow biopsy, can result from TPSAB1 replications alone and thus be within normal limits for certain individuals with HT. | genetic and genomic medicine
10.1101/2022.04.28.22274421 | Epidemiological and clinical features of SARS-CoV-2 Infection in children during the outbreak of Omicron Variant in Shanghai, March 7-March 31, 2022 | ObjectivesTo understand the epidemiological and clinical characteristics of pediatric SARS-CoV-2 infection during the early stage of Omicron variant outbreak in Shanghai.
MethodsThis study included local COVID-19 cases <18 years of age in Shanghai who were referred to the exclusively designated hospital between the emergence of the Omicron epidemic and the end of March 2022. Clinical data, epidemiological exposure and COVID-19 vaccination status were collected. Relative risks (RR) were calculated to assess the effect of vaccination on symptomatic infection and febrile disease.
ResultsA total of 376 pediatric cases of COVID-19 (median age: 6.0±4.2 years) were referred to the designated hospital during the period of March 7-31, including 257 (68.4%) symptomatic cases and 119 (31.6%) asymptomatic cases. Of the 307 (81.6%) children ≥3 years of age eligible for COVID-19 vaccination, 110 (40.4%) received 2-dose vaccines and 16 (4.0%) received 1-dose vaccine. The median interval between 2-dose vaccination and infection was 3.5 (IQR: 3, 4.5) months (16 days-7 months). Two-dose COVID-19 vaccination reduced the risks of symptomatic infection and febrile disease by 35% (RR 0.65, 95% CI: 0.53-0.79) and 33% (RR 0.64, 95% CI: 0.51-0.81), respectively. Two hundred and sixteen (83.4%) symptomatic cases had fever (mean duration: 1.7±1.0 days), 104 (40.2%) had cough, and 16.4% had transient leukopenia; 307 (81.6%) had an epidemiological exposure in household (69.1%), school (21.8%) or residential area (8.8%) settings.
ConclusionThe surge of pediatric COVID-19 cases and the multiple transmission settings reflect wide dissemination of the Omicron variant in the community. Asymptomatic infection is common among Omicron-infected children. COVID-19 vaccination can offer protection against symptomatic infection and febrile disease. | infectious diseases
10.1101/2022.04.29.22274387 | Outcomes of convalescent plasma with defined high- versus lower-neutralizing antibody titers against SARS-CoV-2 among hospitalized patients: CoronaVirus Inactivating Plasma (CoVIP), double-blind phase 2 study. | BackgroundCOVID-19 Convalescent Plasma (CCP) was an early and widely adopted putative therapy for severe COVID-19. Results from randomized controlled trials and observational studies have failed to demonstrate a clear therapeutic role for CCP in severe SARS-CoV-2 infection. Underlying these inconclusive findings is a broad heterogeneity in the concentrations of neutralizing antibodies (nAb) between different CCP donors. The present study was designed to evaluate the nAb titer threshold for clinically effective CCP.
MethodsWe conducted a double-blind, phase 2 study to evaluate the safety and effectiveness of nAb titer-defined CCP in adults admitted to an academic referral hospital. Patients positive on a SARS-CoV-2 nucleic acid amplification test and with symptoms for <10 days were eligible. Participants received either CCP with nAb titers ≥1:160-1:640 (standard titer group) or >1:640 (high titer group) in addition to standard of care treatments. Adverse events were contrasted by CCP titer. The primary clinical outcome was time to hospital discharge, with mortality and respiratory support evaluated as secondary outcomes.
FindingsBetween August 28 and December 4, 2020, 316 participants were screened, 55 received CCP, with 41 and 14 receiving standard versus high titer CCP, respectively. Participants were a median of 61 years of age (IQR 52-67), 36% women, 25% Black and 33% Hispanic. Severe adverse events (SAE) (≥ grade 3) occurred in 4 (29%) and 23 (56%) of participants in the high versus standard titer groups, respectively, by day 28 (Risk Difference -0.28 [95% CI -0.56, 0.01]). There were no observed treatment-related AEs. By day 55, time to hospital discharge was shorter among participants receiving high versus standard titer, accounting for death as a competing event (hazard ratio 1.94 [95% CI 1.05, 3.58], Gray's p=0.02).
InterpretationIn this phase 2 trial in a high-risk population of patients admitted for Covid-19, we found earlier time to hospital discharge and lower occurrences of life-threatening SAEs among participants receiving CCP with nAb titers >1:640 compared with participants receiving CCP with lower nAb titers. Though limited by a small study size, these findings support further study of high-nAb-titer CCP, defined as >1:640, in the treatment of COVID-19.
FundingThis clinical study was supported by the UNC Health Foundation and the North Carolina Policy Collaboratory at the University of North Carolina at Chapel Hill with funding from the North Carolina Coronavirus Relief Fund established and appropriated by the North Carolina General Assembly. The laboratory assays for neutralizing antibody titers and SARS-CoV-2 specific antibody-binding assays were partially supported by The NIH NCI/NIAID SeroNet Serocenter of Excellence Award U54 CA260543.
Research in Context (Evidence before this study): COVID-19 Convalescent Plasma (CCP) has emergency use authorization from the FDA for early treatment of COVID-19 in either outpatient or inpatient settings. Evidence supporting the use of CCP for severe COVID-19 is mixed and still emerging. One major limitation in interpreting published clinical trials and the clinical role of CCP is incomplete understanding of the neutralizing antibody (nAb) titer necessary for clinically effective CCP. Observational studies suggest that higher antibody-content CCP is more effective than lower antibody-content CCP, or that very low antibody-content CCP is harmful. We searched PubMed articles published between February 1, 2020, and April 15, 2022, using the terms "COVID-19", "convalescent plasma", "SARS-CoV-2", and "CCP" alone and in combination. Our search yielded 6,468 results which we filtered to 280 and 162 by selecting Clinical Trial and Randomized Controlled Trial article types, respectively. Among these, we identified 25 open-label or blinded efficacy or effectiveness studies in hospitalized patients that were relevant to our study. Preliminary reports show wide variability in the antibody content of CCP used in clinical trials, the assays used to define CCP antibody content, and the estimates of clinical outcomes following CCP therapy for hospitalized patients. Only one study deliberately infused CCP with nAb >1:640. Post-hoc analyses of potent monoclonal antibody therapy in hospitalized patients in the UK showed survival benefit when monoclonal antibody was infused in patients who had not yet seroconverted by spike antibody ELISA, suggesting that if dosed appropriately, antibody-based therapies may have a role in improving outcomes of severe COVID-19.
Added value of this studyThis phase 2 study showed that CCP with high nAb titer (>1:640) provided more rapid recovery to hospital discharge and fewer COVID-19 attributable AEs than CCP with nAb titer between the FDA-recommended minimum standard and 4-fold higher (≥1:160-1:640). The hazard ratio of time to hospital discharge from baseline through day 55, accounting for death as a competing event, contrasting patients receiving high versus standard CCP titer was 1.94 (95% CI 1.05-3.58). Adjusted hazard ratios of high versus standard titer CCP receipt for time to hospital discharge were consistent with the primary unadjusted findings. Mortality through 55 days was lower in the high titer group, but with a wide confidence interval that did not reach statistical significance.
Implications of all available evidenceOur finding that CCP with nAb >1:640 expedites recovery of patients admitted with COVID-19 compared with CCP with nAb ≥1:160-1:640 suggests that a threshold of nAb ≥1:160 may be too low to define CCP as high titer. Analyses in larger CCP trials should consider full reporting of nAb in CCP units administered at the individual study participant level, and specifically whether CCP contained nAb >1:640. Further investigation of CCP with nAb >1:640 is warranted, given that raising the threshold of nAb, or a correlative specific anti-spike antibody assay, used to qualify high titer CCP in clinical trials could inform policy guidance and clinical use of CCP. | infectious diseases
10.1101/2022.04.28.22274402 | Immunogenicity and safety of the CoronaVac inactivated SARS-CoV-2 vaccine in people with underlying medical conditions: a retrospective study | BackgroundPeople living with chronic disease, particularly seniors older than 60 years, are lagging behind in the national vaccination campaign in China due to uncertainty about safety and effectiveness. However, this special population accounts for most of the severe cases and deaths among infected patients and should be prioritized in the vaccination program. In this retrospective study, we assessed the safety and immunogenicity of the CoronaVac inactivated vaccine in people with underlying medical conditions to address vaccine hesitancy in this special population.
MethodsIn this cohort study, volunteers aged 40 years and older who had received two doses of the CoronaVac inactivated vaccine (3-5 week interval) and were either healthy or had at least one of six diseases (coronary heart disease (CAD), hypertension, diabetes mellitus (DM), chronic respiratory disease (CRD), obesity or cancer) were recruited from 4 study sites in China. The primary safety outcome was the incidence of adverse events within 14 days after each dose of vaccination. The primary immunogenic outcome was the geometric mean titer (GMT) of neutralizing antibodies to live SARS-CoV-2 virus at 14-28 days, 3 months, and 6 months after the full two-dose vaccination. This study is registered with ChiCTR.org.cn (ChiCTR2200058281) and is active but no longer recruiting.
FindingsAmong 1,302 volunteers screened between Jul 5 and Dec 30, 2021, 969 were eligible and enrolled in our cohort, including 740 living with underlying medical conditions and 229 healthy controls. All of them formed the safety cohort. The overall incidence of adverse reactions was 150 (20.27%) of 740 in the comorbidities group versus 32 (13.97%) of 229 in the healthy group, a significant difference (P=0.0334). The difference was mainly driven by fatigue and injection-site pain in some groups. Most adverse reactions were mild (Grade 1). We did not observe any serious adverse events related to vaccination. By day 14-28 post vaccination, the seroconversion rates and GMT of neutralizing antibody showed no significant difference between the disease and healthy groups, except that the CAD group (P=0.03) and CRD group (P=0.04) showed slight reductions. By day 90, the neutralizing antibody GMTs were significantly reduced in each group, with no significant difference between the disease and healthy groups. By day 180, the neutralizing antibody titers continued to decrease in each group, but the decline was slower.
InterpretationFor people living with chronic disease, especially seniors older than 60 years, the CoronaVac vaccine is as safe as in healthy people. Although immunogenicity differed slightly in some disease subgroups compared with the healthy population, the overall trend was consistent. Our findings provide evidence to address vaccine hesitancy among seniors and people living with chronic diseases.
FundingYunnan Provincial Science and Technology Department (202102AA100051 and 202003AC100010, China), Sinovac Biotech Ltd (PRO-nCOV-4004). | infectious diseases |
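A small sketch of the geometric mean titer (GMT) computation used throughout the Results, applied to hypothetical neutralizing antibody titers.

```python
# Hypothetical sketch: GMT = exponential of the mean log titer.
import numpy as np

titers = np.array([16, 32, 32, 64, 128, 8, 256])   # assumed nAb titers
gmt = np.exp(np.log(titers).mean())
print(f"GMT = {gmt:.1f}")   # appropriate for right-skewed titer data
```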
10.1101/2022.05.01.22274432 | Passive surveillance of human-biting ticks correlates with town-level disease rates in Massachusetts | We assessed the temporal and spatial distribution of Borrelia burgdorferi, Borrelia miyamotoi, Babesia microti, and Anaplasma phagocytophilum among human-biting Ixodes scapularis ticks in Massachusetts using ticks submitted to the TickReport pathogen passive surveillance program. From January 2015 to December 2017, Ixodes scapularis was the most frequently submitted tick species (n=7462). B. burgdorferi prevalence increased in ticks during the study period in both adults and nymphs (37.1-39.1% in adults, 19.0-23.9% in nymphs). The proportion of B. microti-infected ticks increased from 5.7% to 8.1% in adult ticks but remained constant in nymphs (5.4-5.6%). Stable or decreasing annual prevalence of B. miyamotoi (2.2-2.2% in adults, 1.0-1.9% in nymphs) and A. phagocytophilum (7.6-7.2% in adults, 5.0-4.0% in nymphs) was detected. Coinfections were observed and included all pathogen combinations.
Ticks were submitted year-round and had stable infection rates. The temporal pattern of B. burgdorferi-positive nymphs aligned with reported cases of Lyme disease, as did that of B. microti-positive nymphs with babesiosis. A similar situation was seen with B. miyamotoi, with a non-significant fall peak in cases. Anaplasmosis demonstrated a significant bimodal distribution, with reported cases peaking in the spring and fall. This pattern is similar to that of A. phagocytophilum-infected adult ticks.
B. microti infected nymphs were significantly predictive of town-level babesiosis incidence and A. phagocytophilum infected adults were significantly predictive of town-level anaplasmosis incidence in a spatially adjusted negative binomial model. Unlike field collection studies, the high number of ticks submitted provides a high-resolution picture of pathogen prevalence and provides data relevant to human health at the town level. Through temporal and geographic analyses we demonstrate concordance between our passive surveillance tick pathogen data and state reports of tickborne disease. | epidemiology |
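A minimal sketch of a negative binomial regression of town-level case counts on tick infection prevalence with a population offset, loosely following the model named above; the spatial adjustment is omitted and all data are simulated.

```python
# Hypothetical sketch: NB regression of babesiosis counts on B. microti
# nymph prevalence, offset by log population (simulated towns).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(7)
n_towns = 150
df = pd.DataFrame({
    "nymph_prevalence": rng.uniform(0, 0.15, n_towns),
    "population": rng.integers(2_000, 80_000, n_towns),
})
mu = np.exp(np.log(df["population"]) - 9 + 8 * df["nymph_prevalence"])
df["cases"] = rng.poisson(mu)          # simulated town-level counts

X = sm.add_constant(df[["nymph_prevalence"]])
res = sm.GLM(df["cases"], X, family=sm.families.NegativeBinomial(),
             offset=np.log(df["population"])).fit()
print(res.summary().tables[1])         # coefficient on prevalence
```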
10.1101/2022.04.29.22274325 | Reduced motor planning underlying inhibition of prepotent responses in children with ADHD | To flexibly regulate their behavior, children's ability to inhibit prepotent responses arises from cognitive and motor mechanisms that have an intertwined developmental trajectory. Subtle differences in planning and control can contribute to impulsive behaviors, which are common in Attention Deficit and Hyperactivity Disorder (ADHD) and difficult to assess and train. We adapted a Go/No-Go task and employed a portable, low-cost kinematic sensor to explore the different strategies used by children with ADHD or typical development to provide a prepotent response (dominant condition) or inhibit the prepotent response and select an alternative one (non-dominant condition). Although no group difference emerged in accuracy levels, the kinematic analysis of correct responses revealed that, unlike neurotypical children, those with ADHD did not show increased motor planning in non-dominant compared to dominant trials. Although motor control may have compensated and led to good accuracy in our simple task, this strategy might make inhibition harder in more naturalistic situations that involve complex actions. Combining cognitive and kinematic measures is a potentially innovative method for the assessment and treatment of subtle differences in executive processes such as inhibition, going deeper than is possible based on behavioral outcomes alone.
Significance StatementThis study proposes an innovative method that integrates kinematic measurement with neuropsychological evaluation, thus providing information on the planning and control mechanisms underlying behavioral outcomes. It is applicable to the study not only of inhibition but more generally of executive functions, which are the basis of the ability of children and adults to achieve goal-directed actions. The use of a wearable motion sensor ensures good applicability in research, clinical evaluation, and intervention. | psychiatry and clinical psychology |
10.1101/2022.05.01.22274242 | Insecticide Resistance Surveillance of Malaria and Arbovirus Vectors in Papua New Guinea 2017-2022. | BackgroundInsecticide resistance monitoring is key for evidence-based control of Anopheles and Aedes disease vectors in particular, since the vast majority of insecticide-based public health adult vector control tools rely on pyrethroids. While widespread pyrethroid resistance in Anopheles species and Aedes aegypti has been described in many countries, data for Papua New Guinea are scarce. Available data indicate the local Anopheles populations remain pyrethroid-susceptible, making regular insecticide resistance monitoring even more important. Knowledge of Aedes insecticide resistance in PNG is very limited; however, high levels of Aedes aegypti resistance have been described. Here we present insecticide resistance monitoring data from across PNG generated between 2017 and 2022.
MethodsMosquito larvae were collected in larval habitat surveys and through ovitraps. Mosquitoes were reared to adults and subjected to insecticide treated filter papers in WHO insecticide susceptibility bioassays. Subsets of Aedes mosquitoes were subjected to sequencing of the voltage-sensitive sodium channel (Vssc) region to identify resistance mutations.
ResultsOverall, nearly 20,000 adult female mosquitoes from nine PNG provinces were used in the tests. We show that in general, Anopheline mosquitoes in PNG remain susceptible to pyrethroids but with worrying signs of reduced 24 h mortality in some areas. In addition, some Anopheles populations were indicated to be resistant against DDT. We show that Ae. aegypti in PNG are pyrethroid, DDT and likely bendiocarb resistant with a range of Vssc resistance mutations identified. We demonstrate that Ae. albopictus is DDT resistant and is likely developing pyrethroid resistance based on finding a low frequency of Vssc mutations.
ConclusionThis study represents the largest overview of insecticide resistance in PNG. While Ae. aegypti is highly pyrethroid resistant, the Anopheline and Ae. albopictus populations exhibit low levels of resistance in some areas. It is important to continue to monitor insecticide resistance in PNG and prepare for the widespread emergence of pyrethroid resistance in major disease vectors. | public and global health |
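A small sketch of how WHO tube-bioassay mortality is commonly scored; thresholds follow the WHO 2016 criteria as widely applied (verify against the current WHO manual), and the assay counts are hypothetical.

```python
# Hypothetical sketch: classify 24 h bioassay mortality per common WHO cutoffs
# (>=98% susceptible, 90-97% possible resistance, <90% confirmed resistance).
def who_resistance_status(dead: int, exposed: int) -> str:
    mortality = 100 * dead / exposed
    if mortality >= 98:
        return f"susceptible ({mortality:.1f}% mortality)"
    if mortality >= 90:
        return f"possible resistance ({mortality:.1f}% mortality)"
    return f"confirmed resistance ({mortality:.1f}% mortality)"

print(who_resistance_status(dead=99, exposed=100))   # e.g. Anopheles assay
print(who_resistance_status(dead=41, exposed=100))   # e.g. Ae. aegypti assay
```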
10.1101/2022.04.29.22274496 | Vestibular contribution to path integration deficits in individuals at genetic risk for Alzheimer's disease | Path integration changes may precede a clinical presentation of Alzheimer's disease by several years. Studies to date have focused on how grid cell changes affect path integration in preclinical AD. However, vestibular input is also critical for intact path integration. Here, we developed a naturalistic vestibular task that requires individuals to manually point an iPad device in the direction of their starting point following rotational movement, without any visual cues. Vestibular features were derived from the sensor data using feature selection. Machine learning models illustrate that the vestibular features accurately classified Apolipoprotein E ε3ε4 carriers and ε3ε3 carrier controls (mean age 62.7 years), with 65% to 79% accuracy depending on task trial and algorithm. Our results demonstrate the cross-sectional role of the vestibular system in Alzheimer's disease risk carriers and may explain individual phenotypic heterogeneity in path integration within this population. | neurology
10.1101/2022.05.01.22274445 | Children with Critical Illness Carry Risk Variants Despite Non-Diagnostic Whole Exome Sequencing | Rapid genetic sequencing is an established and important clinical tool for management of pediatric critical illness. The burden of risk variants in children with critical illness but a non-diagnostic exome has not been explored. This was a retrospective case-control analysis of research whole exome sequencing data, first run through a diagnostic pipeline, to assess the association of rare loss-of-function variants with critical illness in children with diagnostic and non-diagnostic whole exome sequencing, including those with virally mediated respiratory failure. Using a gene-based collapsing approach, the odds of a child with critical illness carrying rare loss-of-function variants were compared to controls. A subset of children with virally mediated respiratory failure was also compared to controls. Cases were drawn from the general pediatric ward and pediatric intensive care unit (PICU) at Morgan Stanley Children's Hospital of NewYork-Presbyterian (MSCH) - Columbia University Irving Medical Center (CUIMC) and from the Office of the Chief Medical Examiner (OCME) of New York City. Of the 285 enrolled patients, 228 (80.0%) did not receive a diagnosis from WES. After quality control filtering and geographic ancestry matching, an analysis of 232 children with critical illness compared to 5,322 healthy and unrelated controls determined cases to harbor more predicted loss-of-function (pLOF) variants in genes with a LOEUF score ≤0.680 (p-value = 1.0 x 10^-5). After quality control and geographic ancestry matching, a subset of 176 children without a genetic diagnosis compared to 5,180 controls harbored pLOFs in genes without a disease association (OR 1.7, CI [1.2-2.3], FDR-adjusted p-value = 4.4 x 10^-3) but not in genes with a disease association (OR 1.2, CI [0.8-1.7], FDR-adjusted p-value = 0.40). This enrichment primarily existed among ultra-rare variants not found in public data sets. Among a subset of 25 previously healthy children with virally mediated respiratory failure compared to 2,973 controls after quality control and geographic ancestry matching, cases harbored more variants than controls in genes without a disease association at the same LOEUF threshold ≤0.680 (OR 2.8, CI [1.2-7.2], FDR-adjusted p-value = 0.026) but not in genes with a disease association (OR 0.7, CI [0.2-2.2], FDR-adjusted p-value = 0.84). Finally, among children with critical illness for whom whole exome sequencing data from both biological parents were available, we found an enrichment of de novo pLOF variants in genes without a disease association among 114 children without a genetic diagnosis (unadjusted p-value < 0.05) but not among 46 children with a genetic diagnosis. Children with critical illness and non-diagnostic whole exome sequencing may still carry risk variants for their clinical presentation in genes not previously associated with disease. | genetic and genomic medicine
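A minimal sketch of the gene-based collapsing comparison described above: Fisher's exact test on counts of cases versus controls carrying at least one rare pLOF in a qualifying gene set (e.g. LOEUF <= 0.680). The carrier counts are hypothetical.

```python
# Hypothetical sketch: collapsing-test comparison of pLOF carrier rates.
from scipy.stats import fisher_exact

cases_carriers, cases_total = 60, 232           # assumed carrier counts
controls_carriers, controls_total = 800, 5322

table = [
    [cases_carriers, cases_total - cases_carriers],
    [controls_carriers, controls_total - controls_carriers],
]
odds_ratio, p_value = fisher_exact(table)
print(f"OR = {odds_ratio:.2f}, p = {p_value:.1e}")
```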
10.1101/2022.04.30.22273566 | Questioning the association of the STMN2 dinucleotide repeat with ALS | Lowered expression of STMN2 is associated with TDP-43 pathology in amyotrophic lateral sclerosis (ALS). Recently, the number of dinucleotide CA repeats in an intron of the STMN2 gene was reported to be associated with increased risk for ALS. Here, we used a case-control cohort of whole genome sequencing (WGS) as well as WGS from populations in the gnomAD cohort to attempt to replicate this proposed association. We find that repeats well above the previously reported pathogenic threshold of 19 are commonly observed in unaffected individuals across different populations. Further, we did not observe an association between longer STMN2 CA repeats and ALS phenotype. In summary, our results do not support a role of STMN2 CA repeats towards ALS risk.
DisclosuresNone to disclose | genetic and genomic medicine |
10.1101/2022.05.01.22274546 | Adherence to the HIV early infant diagnosis testing protocol among HIV exposed infants in a hard-to-reach fishing community in Uganda | BackgroundInfants born to HIV-infected mothers are at a high risk of acquiring the infection. The World Health Organization (WHO) recommends early diagnosis of HIV-exposed infants (HEIs) through deoxyribonucleic acid polymerase chain reaction (DNA PCR) and rapid HIV testing. Early detection of paediatric HIV is critical for access to antiretroviral therapy (ART) and child survival. There is, however, limited evidence on adherence to the early infant diagnosis (EID) of HIV testing protocol among HEIs in fishing communities in Uganda. This study assessed adherence to the EID of HIV testing protocol among HIV-exposed infants in a hard-to-reach fishing community in Uganda.
MethodsWe conducted a cross-sectional study employing quantitative data collection methods among HEIs in selected healthcare facilities in the Buvuma Islands, Buvuma district. We obtained secondary data from mother-infant pair files enrolled in the EID program using a data extraction tool. Data were analysed using STATA Version 14. Modified Poisson regression analysis was used to determine the factors associated with non-adherence to the 1st DNA PCR test among HIV-exposed infants enrolled into care.
ResultsNone of the HIV-exposed infants had completed all the EID tests prescribed by the HIV testing protocol within the recommended time frame for the period January 2014-December 2016. Adherence to the 1st DNA PCR, 2nd DNA PCR, and rapid HIV tests was 39.5%, 6.1%, and 81.0%, respectively. Being under the care of a single mother (PR=1.11, 95% CI: 1.01-1.23, p=0.023) and cessation of breastfeeding (PR=0.90, 95% CI: 0.83-0.98, p=0.025) were significantly associated with non-adherence to the 1st DNA PCR.
ConclusionNone of the HIV-exposed infants adhered to all the EID tests of the HIV testing protocol. Adherence to the 1st DNA PCR was positively associated with being a single mother and exclusive breastfeeding. Therefore, single mothers and those who stop breastfeeding should be supported to ensure timely EID. | hiv aids
10.1101/2022.04.30.22274189 | Then and NOW: A Prospective Population-Level Validation of the Abbott ID NOW SARS-CoV-2 Device Implemented in Multiple Settings for Testing Asymptomatic and Symptomatic Individuals | IMPORTANCESince December 2020, the ID NOW has been implemented for use in 4 different populations across Alberta: in mobile units as part of community outbreak response, at COVID-19 community collection sites, in emergency shelters and addiction treatment facilities (ES), and in hospitals.
OBJECTIVEDiagnostic evaluation of the ID NOW in various real-world settings among symptomatic and asymptomatic individuals.
DESIGNDepending on the implementation site, the ID NOW was tested on patients with symptoms suggestive of COVID-19, asymptomatic close contacts, or asymptomatic individuals as part of outbreak point prevalence screening. From January to April, a select number of sites also switched from using oropharyngeal swabs to combined oropharyngeal + nasal (O+N) swabs. For every individual tested, two swabs were collected: one for ID NOW testing and the other for either reverse-transcriptase polymerase chain reaction (RT-PCR) confirmation of negative ID NOW results or for variant testing of positive ID NOW results.
RESULTSA total of 129,112 paired samples were analyzed (16,061 RT-PCR positive). 81,697 samples were from 42 COVID-19 community collection sites, 16,924 from 69 rural hospitals, 1,927 from 9 ES, 23,802 from 6 mobile units that responded to 356 community outbreaks, and 4,762 from 3 community collection sites and 1 ES using O+N swabs for ID NOW testing. ID NOW sensitivity was highest among symptomatic individuals presenting to community collection sites [92.5%, 95% confidence interval (CI) 92.0-93.0%, n=10,633 RT-PCR positive] and lowest for asymptomatic individuals associated with community outbreaks (73.9%, 95% CI 69.8-77.7%, n=494 RT-PCR positive). Specificity was greater than 99% in all populations tested, but positive predictive value (PPV) was lower among asymptomatic populations (82.4-91.3%) compared to symptomatic populations (96.0-96.9%). There were no statistically significant differences in sensitivity with respect to age, gender, NP vs OP swab for RT-PCR confirmation, variants of concern, or the use of combined oropharyngeal and nasal swabs for ID NOW testing.
CONCLUSIONSSensitivity of ID NOW SARS-CoV-2 testing is highest when used on symptomatic community populations not seeking medical care. Sensitivity and PPV drop by approximately 10% when testing asymptomatic populations. Using combined oropharyngeal and nasal swabs did not improve ID NOW performance. | infectious diseases
10.1101/2022.04.29.22274454 | Effectiveness of Covid-19 vaccines against SARS-CoV-2 Omicron variant (B.1.1.529): A systematic review with meta-analysis and meta-regression | BackgroundThere is a need to evaluate vaccine effectiveness (VE) and the urgency of booster vaccination against the Covid-19 B.1.1.529 (Omicron) variant.
MethodsA systematic search was conducted on April 6, 2022, in five databases (PubMed, ScienceDirect, CENTRAL, Web of Science, Scopus). VE difference (VED) estimates were assessed using a random-effects model and DerSimonian-Laird tau estimators. Results from two models, i.e., within 3 months and within 3 months or more, were compared. VE-versus-time meta-regression analysis was evaluated using a mixed-effects model with restricted maximum likelihood tau estimators and Hartung-Knapp adjustments.
FindingsAd26.COV2.S, BNT162b2, ChAdOx1 nCov-19, and mRNA-1273 vaccines were included in the analyses. Compared to the full dose, a booster dose of the overall vaccines provided better protection against any (VED=22% (95%CI 15%-29%), p<0.001), severe (VED=20% (95%CI 8%-32%), p=0.001) and symptomatic (VED=22% (95%CI 11%-34%), p<0.001) Omicron infections within 3 months, as well as within 3 months or more (VED=30% (95%CI 24%-37%), p<0.001 for any, VED=18% (95%CI 13%-23%), p<0.001 for severe and VED=37% (95%CI 29%-46%), p<0.001 for symptomatic infections). The meta-regression analysis of the overall vaccines revealed that full dose VE against any and symptomatic Omicron infections was significantly reduced each month by 3.0% (95%CI 0.9%-4.8%, p=0.004) and 5.2% (95%CI 3.3%-7.1%, p=0.006), respectively, whereas booster dose effectiveness against severe and symptomatic Omicron infections decreased by 3.7% (95%CI 5.1%-12.6%, p=0.030) and 3.9% (95%CI 1.2%-6.5%, p=0.006), respectively.
InterpretationCompared to the full dose only, the addition of a booster dose provides better protection against B.1.1.529 infection. Although the VE estimates of Ad26.COV2.S, BNT162b2, ChAdOx1 nCov-19, and mRNA-1273 vaccines against B.1.1.529 infection after both full and booster doses are generally moderate, and the booster dose provides excellent protection against severe infection, the VE estimates decline over time, suggesting the need for a regular Covid-19 booster injection after a certain period of time to maintain VE. | allergy and immunology
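A minimal sketch, in Python, of the DerSimonian-Laird random-effects pooling named in the methods above; the per-study effect estimates and variances are invented placeholders, not values from the review.

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Pool per-study effects with a DL tau^2 estimate; returns (pooled, se, tau2)."""
    effects = np.asarray(effects, dtype=float)
    v = np.asarray(variances, dtype=float)
    w = 1.0 / v                                   # fixed-effect weights
    fixed = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - fixed) ** 2)        # Cochran's Q
    df = len(effects) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                 # DL between-study variance
    w_star = 1.0 / (v + tau2)                     # random-effects weights
    pooled = np.sum(w_star * effects) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return pooled, se, tau2

# Example: three hypothetical VED estimates (booster minus full dose) with variances.
pooled, se, tau2 = dersimonian_laird([0.18, 0.25, 0.22], [0.002, 0.003, 0.0025])
print(f"pooled VED = {pooled:.3f} +/- {1.96 * se:.3f} (tau^2 = {tau2:.4f})")
```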
10.1101/2022.04.29.22273784 | Comparative Analysis Of Bacteriological And Physicochemical Characteristics Of Water Samples From Different Sources Used By Students Of Federal University Dutsinma, Katsina State, Nigeria. | The aim of this study was to comparatively analyze the bacterial load and physicochemical parameters of water samples from various sources used by the students of Federal University, Dutsinma, Katsina State. Samples from tap, well, dam, rain, sachet and borehole sources were collected at different locations where students reside. There were 6 sources of water, namely tap water, dam water, well water, borehole water, sachet water and rain water, from which 10 samples each were obtained, making a total of 60 samples for analysis. The physicochemical parameters of each water sample were determined. Following the technique adopted by Cheesbrough (2000), the samples were serially diluted: 3 test tubes were sterilized, 9 ml of distilled water was pipetted into each, and 1 ml of the water sample was pipetted into the first test tube and shaken vigorously to obtain a homogeneous mixture (stock). A bacterial count of each water sample was carried out, and the presence of Escherichia coli, P.aeruginosa, S.aureus, S.typhi, K.pneumoniae, B.subtilis, Proteus sp, Shigella sp, and E.aerogenes was identified. Biochemical tests were carried out for accurate characterization of the isolates. The studied physicochemical parameters (except pH) of borehole, sachet, dam, rain, well and tap water were within the permissible limits set by the World Health Organization. The pH of all samples of sachet water was within the permissible limit set by the World Health Organization; however, only 7 out of 10 samples of borehole water and 8 out of 10 samples of tap water had pH within the permissible limit. The prevalence of indicator organisms in water samples is as follows: Klebsiella pneumoniae (dam water=100%, sachet water=0, tap water=80%, borehole=70%, rain=20%, well=100%), Escherichia coli (dam water=100%, sachet water=0, tap water=20%, borehole=10%, rain=0, well=100%), Pseudomonas aeruginosa (dam water=100%, sachet water=60%, tap water=100%, borehole=100%, rain=100%, well=100%), Staphylococcus aureus (dam water=100%, sachet water=40%, tap water=50%, borehole=50%, rain=20%, well=100%), Salmonella typhi (dam water=100%, sachet water=0, tap water=30%, borehole=10%, rain=10%, well=100%), Bacillus subtilis (dam water=100%, sachet water=60%, tap water=70%, borehole=50%, rain=30%, well=100%), Proteus sp (dam water=100%, sachet water=0, tap water=50%, borehole=30%, rain=10%, well=100%), Shigella sp (dam water=100%, sachet water=0, tap water=30%, borehole=10%, rain=10%, well=100%), Enterobacter aerogenes (dam water=100%, sachet water=30%, tap water=60%, borehole=50%, rain=30%, well=100%). The research indicates the polluted condition of water in Dutsin-ma: only sachet water is fit for consumption without further treatment. Tap and borehole water should be treated before consumption. Dam water and well water should be used for other domestic purposes; however, they should be treated by sedimentation followed by boiling. Rain water has a lower bacterial load but an acidic pH and is therefore unfit for consumption. | public and global health
10.1101/2022.05.01.22274540 | Keeping it close: The role of a Campus COVID Support Team (CCST) in sustaining a safe and healthy university campus during COVID-19 | It has been over 24 months since the COVID-19 pandemic forced university campuses to shut down and then reopen under new safety guidelines. As we move into the subsequent years of the pandemic, we can look back and evaluate what has worked, what can be improved, and how to provide a sustained response for a campus community. In this article we detail one campus response to the COVID-19 pandemic and the directions being taken to ensure a sustained campus COVID support team (CCST) is in place to protect the health and safety of the university community. The CCST was created to serve as a one-stop shop to help the university community navigate COVID-19 policies and procedures. The responsibilities of the CCST include conducting case investigations for any positive COVID-19 tests within the university community, contact tracing for authorized university affiliates, epidemiological surveillance and mitigation efforts, and communication through real-time analysis and dashboards. Continuous monitoring procedures demonstrated that the CCST conducted all case investigations within the 24-hour post-testing window, keeping the university test-positivity rate below 3%. Quality improvement surveys demonstrated a high level of satisfaction with the CCST's efforts and identified areas for improvement and sustainability. Having a public health faculty-led CCST enabled the university to act swiftly when COVID-19 positive cases emerged and to deter widespread campus COVID-19 outbreaks. The CCST's timeliness and connectivity to the campus have demonstrated benefits to the health and safety of the campus.
HighlightsUniversities are their own communities, and having on-campus COVID support teams can mitigate potential COVID-19 outbreaks.
Having a public health-driven Campus COVID Support Team that can conduct case investigations within 24 hours of a positive test result has demonstrated benefits for taking responsive measures.
Continuous quality improvement efforts, including surveys of the Campus COVID Support Team, should be implemented for any COVID service efforts. | public and global health
10.1101/2022.04.28.22274307 | Acute exercise induces natural killer cell mobilization but not infiltration into prostate cancer tissue in humans | The mobilization and activation of natural killer (NK) cells have been proposed as key mechanisms promoting the anti-oncogenic effects of physical exercise. Although mouse models have proven that physical exercise recruits NK cells to tumor tissue and inhibits tumor growth, this preclinical finding has not yet been translated to the clinical setting. In this first-in-human study, we found that physical exercise mobilizes and redistributes NK cells, especially those with a cytotoxic phenotype, in line with preclinical models. However, physical exercise did not increase NK cell tumor infiltrates. Future studies should carefully distinguish between acute and chronic exercise modalities and are encouraged to investigate more immune-responsive tumor entities. | sports medicine
10.1101/2022.04.28.22274398 | A role for super-spreaders in carrying malaria parasites across the months-long dry season | In malaria endemic regions, transmission of Plasmodium falciparum parasites is often seasonal with very low transmission during the dry season and high transmission in the wet season. Parasites survive the dry season within some individuals who experience prolonged carriage of parasites and are thought to seed infection in the next transmission season. We use a combination of mathematical simulations and data analysis to characterise dry season carriers and their role in the subsequent transmission season. Simulating the life-history of individuals experiencing repeated exposure to infection predicts that dry season carriage is more likely in the oldest, most exposed and most immune individuals. This hypothesis is supported by data from a longitudinal study in Mali that shows that carriers are significantly older, experience a higher biting rate at the beginning of the transmission season and develop clinical malaria later than non-carriers. Further, since the most exposed individuals in a community are most likely to be dry season carriers, we show that this is predicted to enable a more than 2-fold faster spread of parasites into the mosquito population at the start of the subsequent wet season. | infectious diseases |
10.1101/2022.05.01.22274551 | A 5-year review of incidence, presentation and management of Bartholin gland cysts and abscesses in a tertiary hospital, Yenagoa, South-South Nigeria | BackgroundBartholin gland cysts and abscesses are common in women of reproductive age and decline after menopause. Organisms implicated in Bartholin abscess include Neisseria gonorrhoeae, Chlamydia trachomatis, Escherichia coli, Staphylococcus aureus and Bacterioides spp.
AimTo determine the incidence, presentation and management of Bartholin gland cysts and abscesses at the Federal Medical Centre, Yenagoa, Bayelsa State, South-South Nigeria, over a five-year period.
Settings and DesignThis retrospective study was conducted in the Gynaecological Unit of the Federal Medical Centre, Yenagoa, Bayelsa State, South-South, Nigeria, between January 1, 2016 and December 31, 2020.
Materials and MethodsRelevant data were retrieved, entered into a pre-designed proforma, and analysed.
Statistical AnalysisStatistical Package for the Social Sciences version 25.0 was used for data analysis. Results were presented in frequencies and percentages for categorical variables.
ResultsThere were 2,478 gynaecological cases managed in our Centre, of which 26 were cases of Bartholin cyst or abscess, giving an incidence of 1.05%. Most of the women were ≤30 years old (14, 53.8%), single (17, 65.4%), nulliparous (13, 50.0%), traders (11, 42.3%), with only primary/secondary education (18, 69.2%). The left Bartholin gland was the most frequently affected (17, 65.4%). A positive microbial culture was obtained in 84% of cases, with Staphylococcus aureus and Escherichia coli being the isolated organisms. Marsupialisation was the treatment modality in all the patients.
ConclusionWomen of the reproductive age group should be counselled on this condition and encouraged to maintain good perineal hygiene and safe sexual practices so as to reduce the risk of Bartholin cysts and abscesses. | obstetrics and gynecology
10.1101/2022.04.28.22274306 | Ultrasonic Artificial Intelligence Shows Statistically Equivalent Performance for Thyroid Nodule Diagnosis to Fine Needle Aspiration Cytopathology and BRAFV600E Mutation Analysis Combined | ObjectiveTo investigate the differences among an artificial intelligence (AI) system, fine-needle aspiration (FNA) cytopathology, BRAFV600E mutation analysis, and the combined method of the latter two in thyroid nodule diagnosis.
MethodsUltrasound images of 490 thyroid nodules (378 patients) with postsurgical pathology, or with two consistent combined FNA examination outcomes at a half-year interval, considered as the gold standard, were collected and analyzed. The diagnostic efficacies of an AI diagnostic system and of FNA-based methods were evaluated in terms of sensitivity, specificity, accuracy, and κ coefficient compared to the gold standard.
ResultsThe malignancy threshold of 0.53 for the AI system was selected by optimizing the Youden index on a retrospective cohort of 346 nodules and then applied to a prospective cohort of 144 nodules. The combined method of FNA cytopathology according to the Bethesda risk stratification system and BRAFV600E mutation analysis showed no significant difference in accuracy compared with the AI diagnostic system for both the retrospective and prospective cohorts in our single-center study. In addition, for the 33 indeterminate Bethesda system category III and IV nodules included in our study, the AI system showed no significant difference in comparison with BRAFV600E mutation analysis.
ConclusionThe evaluated AI diagnostic system showed diagnostic performance similar to that of FNA cytopathology combined with BRAFV600E mutation analysis. Given its advantages in ease of operation, time efficiency, and noninvasiveness for thyroid nodule screening, as well as the wide availability of ultrasonography, it can be widely applied at all levels of hospitals and clinics to assist radiologists in thyroid nodule diagnosis, and it is expected to reduce the need for relatively invasive FNA biopsies, thereby reducing the associated risks and side effects, as well as to shorten the diagnostic time. | oncology
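A minimal sketch of the Youden-index threshold selection described above, assuming a standard ROC-based optimisation; the labels and AI malignancy scores are simulated stand-ins, not study data.

```python
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(0)
y_true = rng.integers(0, 2, size=346)                        # 1 = malignant (simulated)
scores = np.clip(rng.normal(0.3 + 0.4 * y_true, 0.2), 0, 1)  # AI malignancy scores (simulated)

fpr, tpr, thresholds = roc_curve(y_true, scores)
j = tpr - fpr                                                # Youden's J statistic
best = thresholds[np.argmax(j)]
print(f"threshold maximising J: {best:.2f}")
```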
10.1101/2022.05.02.22274436 | Omicron infection induces low-level, narrow-range SARS-CoV-2 neutralizing activity | BackgroundThe rapid worldwide spread of the mildly pathogenic SARS-CoV-2 Omicron variant has led to the suggestion that it will induce levels of collective immunity that will help put an end to the COVID-19 pandemic.
MethodsConvalescent sera from non-hospitalized individuals previously infected with the Alpha, Delta or Omicron BA.1 SARS-CoV-2 variants, or from individuals who had received a full mRNA vaccine regimen, were evaluated for their ability to neutralize a broad panel of SARS-CoV-2 variants.
FindingsPrior vaccination or infection with the Alpha or, to a lesser extent, Delta strains conferred robust neutralizing titers against most variants, albeit more weakly against Beta and, even more so, Omicron. In contrast, Omicron convalescent sera displayed only a low level of neutralizing activity against the cognate virus and were unable to neutralize other SARS-CoV-2 variants.
InterpretationModerately symptomatic Omicron infection is only poorly immunogenic and does not represent a substitute for vaccination.
FundingEPFL COVID Fund; a private foundation advised by CARIGEST SA; the Private Foundation of the Geneva University Hospitals; the General Directorate of Health of the canton of Geneva; and the Swiss Federal Office of Public Health. | infectious diseases
10.1101/2022.05.02.22274368 | Automatized and combined HIV, HBV, HCV, and syphilis testing among illegal gold miners in French Guiana using a standardized dried blood device | BackgroundBlood spotted onto filter paper can be easily collected outside healthcare facilities and shipped to a central laboratory for serological testing. However, dried blood testing generally requires manual processing for elution. In this study, we used a standardized blood collection device combined with an automatized elution system to test illegal gold miners living in French Guiana for HIV, HBV, HCV and syphilis.
Material and MethodsWe included 378 participants at five resting sites of illegal gold mining. Serological status was determined by testing serum samples. Blood collected on the Ser-Col device (Labonovum) was eluted using an automated system (SCAUT Ser-Col automation, Blok System Supply) and an automatized analyzer (Alinity i, Abbott). Ser-Col results were compared to serum results, considered the reference, and to dried blood spot (DBS) samples processed manually.
ResultsTwo participants tested positive for HIV (0.5%), eight tested positive for hepatitis B surface antigen (HBsAg), four were weakly positive for anti-HCV antibodies (1%) but negative for HCV RNA, and 47 tested positive for syphilis (12.4%). We observed full concordance of Ser-Col and DBS results for HIV diagnosis compared to serum results. Ser-Col and DBS samples tested positive in seven HBsAg carriers and negative for one participant with a low HBsAg level in serum (0.5 IU/mL). Three hundred and sixty-eight participants uninfected by HBV tested negative for HBsAg (99.5%). Two Ser-Col samples and two other DBS samples tested positive in HBV-uninfected participants (false-positive results with low S/CO indexes). All participants tested negative for HCV in Ser-Col and DBS samples, including the four participants who tested weakly positive for HCV antibodies and HCV RNA-negative in serum. The threshold of the treponemal assay was optimized for dried blood sample testing. Among syphilis-seropositive participants, 35 (74.61%) tested positive for treponemal antibodies in Ser-Col and DBS samples. Among participants seronegative for syphilis, 326 (98.4%) and 325 (98.1%) tested negative in Ser-Col and DBS samples, respectively.
ConclusionThe Ser-Col method allows automatized dried blood testing for HIV, HBV, HCV and syphilis with performance comparable to DBS. Automated approaches to testing capillary blood transported on dried blood devices may facilitate large-scale surveys and improve infectious disease testing of populations living in remote areas. | infectious diseases
10.1101/2022.05.01.22274406 | Continued Emergence and Evolution of Omicron in South Africa: New BA.4 and BA.5 lineages | South Africa's fourth COVID-19 wave was driven predominantly by three lineages (BA.1, BA.2 and BA.3) of the SARS-CoV-2 Omicron variant of concern. We have now identified two new lineages, BA.4 and BA.5. The spike proteins of BA.4 and BA.5 are identical, and comparable to BA.2 except for the addition of 69-70del, L452R, F486V and the wild-type amino acid at Q493. The 69-70 deletion in spike allows these lineages to be identified by the proxy marker of S-gene target failure with the TaqPath COVID-19 qPCR assay. BA.4 and BA.5 have rapidly replaced BA.2, reaching more than 50% of sequenced cases in South Africa from the first week of April 2022 onwards. Using a multinomial logistic regression model, we estimate growth advantages for BA.4 and BA.5 of 0.08 (95% CI: 0.07-0.09) and 0.12 (95% CI: 0.09-0.15) per day, respectively, over BA.2 in South Africa. | infectious diseases
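A minimal sketch of the multinomial logistic growth model named above: lineage membership is regressed on collection day, and each lineage's per-day log-odds slope relative to BA.2 is its estimated growth advantage. The sequence data below are simulated, not the South African surveillance data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
days = np.repeat(np.arange(90), 20)                    # 90 days, 20 sequences per day
logits = np.stack([np.zeros_like(days, float),         # BA.2 (reference lineage)
                   -4 + 0.08 * days,                   # BA.4 (assumed slope)
                   -5 + 0.12 * days])                  # BA.5 (assumed slope)
p = np.exp(logits) / np.exp(logits).sum(axis=0)        # softmax over lineages
lineage = np.array([rng.choice(3, p=p[:, i]) for i in range(len(days))])

X = sm.add_constant(pd.DataFrame({"day": days}))
fit = sm.MNLogit(lineage, X).fit(disp=False)
# The 'day' coefficients are the per-day growth advantages of BA.4 and BA.5 vs BA.2.
print(fit.params)
```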
10.1101/2022.05.02.22274571 | Real-World Progression-Free Survival as an Endpoint in Advanced Non-Small-Cell Lung Cancer: Replicating Atezolizumab and Docetaxel Arms of the OAK Trial Using Data Derived From Electronic Health Records | BackgroundEvaluating cancer treatments in real-world data (RWD) requires informative endpoints. Due to non-standardized data collection in RWD, it is unclear if and when common oncology endpoints are approximately equivalent to their clinical trial analogues. This study used RWD to replicate both the atezolizumab and docetaxel arms of the OAK trial. Outcomes using progression-free survival (PFS) derived from abstracted physicians notes in RWD (rwPFS) were then compared against PFS outcomes derived according to Response Evaluation Criteria in Solid Tumors (RECIST) from the clinical trial (ctPFS).
MethodsAtezolizumab and docetaxel arms of the phase III OAK RCT (NCT02008227) were replicated in a US nationwide real-world database by applying selected OAK inclusion/exclusion criteria, followed by adjustment for baseline prognostic variables using propensity score-based methods. Multiple rwPFS definitions were characterized and a definition was chosen that was acceptable from both clinical and data analysis perspectives. Concordance of outcomes was assessed using Kaplan-Meier (KM) medians and hazard ratios (HRs).
ResultsOverall, 133 patients receiving atezolizumab and 479 patients receiving docetaxel were selected for the RWD cohort. After adjustment, prognostic variables were balanced between RCT arms and the corresponding RWD cohorts. Comparing rwPFS against ctPFS outcomes in terms of KM median and HR showed better concordance for docetaxel (2.99 vs 3.52 months; HR, 0.99; 95% CI, 0.85-1.15) than for atezolizumab (3.71 vs 2.76 months; HR, 0.80; 95% CI, 0.61-1.02). The latter improved when events labelled "pseudo-progression" were excluded from the RWD (im-rwPFS) and immune-modified RECIST PFS (im-ctPFS) was used in the RCT atezolizumab arm (4.24 vs 4.14 months; HR, 0.95; 95% CI, 0.70-1.25). These findings were robust across several sensitivity analyses.
ConclusionsWhile rwPFS and ctPFS were similar under docetaxel treatment, this was only the case for atezolizumab when immune-modified progression criteria were used, suggesting that similarity of RWD endpoints to their clinical trial analogues depends on drug category and possibly other factors. Replication of RCTs using RWD and comparison of outcomes can be used as a tool for characterizing RWD endpoints. Additional studies are needed to verify these findings and to better understand the conditions for approximate numerical equivalence of rwPFS and ctPFS endpoints. | epidemiology |
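One plausible reading of the propensity score-based adjustment above is inverse-probability-of-treatment weighting (IPTW) followed by a weighted Cox model; the sketch below illustrates that approach on simulated data. The covariates, weighting scheme, and column names are assumptions, not the authors' exact specification.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 600
df = pd.DataFrame({
    "rwd": rng.integers(0, 2, n),                 # 1 = real-world cohort (simulated)
    "age": rng.normal(64, 9, n),                  # baseline prognostic variables
    "ecog": rng.integers(0, 3, n),
    "time": rng.exponential(6, n),                # months to progression/censoring
    "event": rng.integers(0, 2, n),
})

# Propensity of being in the RWD cohort given baseline prognostic variables.
ps = LogisticRegression().fit(df[["age", "ecog"]], df["rwd"]).predict_proba(
    df[["age", "ecog"]])[:, 1]
df["w"] = np.where(df["rwd"] == 1, 1 / ps, 1 / (1 - ps))   # IPTW weights

cph = CoxPHFitter()
cph.fit(df[["rwd", "time", "event", "w"]], duration_col="time",
        event_col="event", weights_col="w", robust=True)
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```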
10.1101/2022.04.30.22274239 | GALC variants affect galactosylceramidase enzymatic activity and risk of Parkinson's disease | The association between glucocerebrosidase (GCase), encoded by GBA, and Parkinson's disease highlights the role of the lysosome in Parkinson's disease pathogenesis. Genome-wide association studies (GWAS) in Parkinson's disease have revealed multiple associated loci, including the GALC locus on chromosome 14. GALC encodes the lysosomal enzyme galactosylceramidase (GalCase), which plays a pivotal role in the glycosphingolipid metabolism pathway. It is still unclear whether GALC is the gene driving the association in the chromosome 14 locus, and if so, by which mechanism.
We first aimed to examine whether variants in the GALC locus and across the genome are associated with GalCase activity. We performed a GWAS in two independent cohorts from a) Columbia University and b) the Parkinson's Progression Markers Initiative study, followed by a meta-analysis with a total of 1,123 Parkinson's disease patients and 576 controls with available data on GalCase activity. We further analyzed the effects of common GALC variants on expression and GalCase activity using colocalization methods. Mendelian randomization was used to study whether GalCase activity may be causal in Parkinson's disease. To study the role of rare GALC variants, we analyzed sequencing data from a total of 5,028 Parkinson's disease patients and 5,422 controls. Additionally, we studied the functional impact of GALC knock-out on alpha-synuclein accumulation and on GCase activity in neuronal cell models and performed in silico structural analysis of common GALC variants associated with altered GalCase activity.
The top hit in the Parkinson's disease GWAS in the GALC locus, rs979812, is associated with increased GalCase activity (b=1.2; se=0.06; p=5.10E-95). No other variants outside the GALC locus were associated with GalCase activity. Colocalization analysis demonstrated that rs979812 was also associated with increased GalCase expression. Mendelian randomization suggested that increased GalCase activity may be causally associated with Parkinson's disease (b=0.025, se=0.007, p=0.0008). We did not find an association between rare GALC variants and Parkinson's disease. GALC knockout using CRISPR-Cas9 did not lead to alpha-synuclein accumulation, further supporting that increased rather than reduced GalCase levels may be associated with Parkinson's disease. The structural analysis demonstrated that the common variant p.I562T may lead to improper maturation of GalCase, affecting its activity.
Our results nominate GALC as the gene associated with Parkinson's disease in this locus and suggest that the association of variants in the GALC locus may be driven by their effect of increasing GalCase expression and activity. Whether altering GalCase activity could be considered a therapeutic target should be further studied. | neurology
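The causal estimate above comes from Mendelian randomization; the sketch below shows the standard inverse-variance-weighted (IVW) estimator as a stand-in, since the abstract does not name the specific MR method. The instrument effect sizes are invented for illustration.

```python
import numpy as np

def ivw_mr(beta_exp, beta_out, se_out):
    """Fixed-effect IVW: weighted regression of outcome betas on exposure betas through the origin."""
    beta_exp, beta_out, se_out = map(np.asarray, (beta_exp, beta_out, se_out))
    w = 1.0 / se_out ** 2
    est = np.sum(w * beta_exp * beta_out) / np.sum(w * beta_exp ** 2)
    se = np.sqrt(1.0 / np.sum(w * beta_exp ** 2))
    return est, se

# Hypothetical instruments: SNP effects on GalCase activity and on PD risk.
b, s = ivw_mr(beta_exp=[1.2, 0.8, 0.5], beta_out=[0.030, 0.020, 0.013],
              se_out=[0.01, 0.012, 0.015])
print(f"IVW causal estimate = {b:.3f} (se {s:.3f})")
```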
10.1101/2022.04.30.22270885 | Dopamine D1 agonist effects in late-stage Parkinson's disease | BackgroundCurrent pharmacotherapy has limited efficacy and/or intolerable side effects in late-stage Parkinson's disease (LsPD) patients, whose daily life depends primarily on caregivers and palliative care. Clinical metrics inadequately gauge efficacy in LsPD patients.
ObjectiveTo explore, in a phase I study, whether a D1/5 dopamine agonist has efficacy in LsPD that is detected most sensitively by caregivers.
MethodsA double-blind controlled phase Ia/b study compared the D1/5 agonist PF-06412562 to levodopa/carbidopa in six LsPD patients. Throughout the study, caregivers were with the patients. Assessments included standard quantitative scales of motor function (MDS-UPDRS-III), alertness (Glasgow Coma and Stanford Sleepiness Scales), and cognition (Severe Impairment and Frontal Assessment Batteries) at baseline (Day 1) and thrice daily during drug testing (Days 2 and 3). Clinicians and caregivers completed clinical impression of change questionnaires, and caregivers participated in a qualitative exit interview. Blinded triangulation of quantitative and qualitative data was used to integrate findings.
ResultsNeither traditional scales nor clinician impression of change detected consistent differences between treatments in the five participants who completed the study. Conversely, the overall caregiver data strongly favored PF-06412562 over levodopa in four of five patients. The most meaningful improvements converged on motor function, alertness, and engagement.
ConclusionD1/5 agonists may offer potential benefit for LsPD patients. Caregiver perspectives with mixed method analyses may overcome limitations in standard rater/clinician-based evaluations. Further studies are warranted and need to integrate caregiver input as an essential component of outcome evaluations.
Trial Registration#ClinicalTrials.gov: NCT03665454 | neurology |
10.1101/2022.05.02.22274580 | Genome-wide investigation of maximum habitual alcohol intake (MaxAlc) in 247,755 European and African Ancestry U.S. Veterans informs the relationship between habitual alcohol consumption and alcohol use disorder. | Importance: Alcohol genome-wide association studies (GWAS) have generally focused on alcohol consumption and alcohol use disorder (AUD); few have examined habitual drinking behaviors like maximum habitual alcohol intake (MaxAlc). Objective: Identify MaxAlc loci and elucidate the genetic architecture across alcohol traits. Design: The MaxAlc GWAS was performed in Million Veteran Program (MVP) participants enrolled from January 10, 2011 to September 30, 2020. Ancestry-specific GWAS were conducted in European (EUR) (n=218,623) and African (AFR) (n=29,132) ancestry subjects, then meta-analyzed (N=247,755). Linkage-disequilibrium score regression (LDSC) was used to estimate SNP-heritability and genetic correlations (rg) with other alcohol and psychiatric traits. Genomic structural equation modeling (gSEM) was used to evaluate genetic relationships between MaxAlc and other alcohol traits. Mendelian randomization (MR) was used to examine causal relationships. MTAG (multi-trait analysis of GWAS) was used to analyze MaxAlc and problematic alcohol use (PAU) jointly. Setting: The study was performed in a sample of U.S. military Veterans. Participants: Participants were 92.68% male with mean age 65.92 (SD=11.70); 36.92% reported MaxAlc at or above the binge-drinking threshold. Main Outcome(s) and Measure(s): MaxAlc was defined from the survey item 'in a typical month, what is/was the largest number of drinks of alcohol you may have had in one day?' with ordinal responses from 0 to ≥15 drinks. Results: The MaxAlc GWAS resulted in 15 genome-wide significant (GWS) loci. Top associations in EUR and AFR were with the known functional variants ADH1B*rs1229984 (p=3.12x10-104) and rs2066702 (p=6.30x10-17), respectively. Multiple novel associations were found. The SNP-heritability was 6.65% (s.e.=0.41%) in EUR and 3.42% (s.e.=1.46%) in AFR. MaxAlc was positively correlated with PAU (rg=0.79; p=3.95x10-149) and AUD (rg=0.76; p=1.26x10-127), and had negative rg with 'alcohol usually taken with meals' (rg=-0.53; p=1.40x10-50). Among psychiatric traits, MaxAlc had the strongest rg with suicide attempt (rg=0.40; p=3.02x10-21). gSEM supported a two-factor model with MaxAlc loading on a factor with PAU and AUD, and other alcohol consumption measures loading on a separate factor. MR supported a small causal effect of MaxAlc on the liver enzyme gamma-glutamyltransferase (β=0.012; p=2.66x10-10). MaxAlc MTAG resulted in 31 GWS loci. Conclusions and Relevance: MaxAlc closely aligns genetically with the etiology of problematic alcohol use traits. | genetic and genomic medicine
10.1101/2022.05.01.22274549 | SENSITIVITY ANALYSIS OF A TUBERCULOSIS (TB) MATHEMATICAL MODEL | In this work, we have constructed a mathematical model for the transmission dynamics of tuberculosis (TB). Feasibility and positivity of solutions of the model are determined, and it is established that the model is well posed and that the solutions are all positive. The disease-free equilibrium (DFE) is also determined, and the basic reproduction number R0 is computed. Sensitivity analysis of the basic reproduction number is conducted to find the parameters of the model that are most sensitive and should be targeted by intervention strategies. Through this sensitivity analysis, it was observed that TB-induced death (d) has a high impact on R0 and varies inversely with it. Graphical simulation of the model parameters was performed. It was observed that model parameters such as the infection rate (β), the recruitment rate (π) and the rate of movement from latent to active TB are directly proportional to the basic reproduction number R0. Finally, it is observed that the basic reproduction number R0 is a decreasing function of the recovery rate (γ) and the natural death rate. | epidemiology
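A minimal sketch of the normalised forward sensitivity index commonly used for such analyses, Upsilon_p = (dR0/dp)(p/R0); the R0 expression below is a generic SEIR-type form assumed for illustration, not the paper's model, and the parameter values are arbitrary. A positive index means the parameter is directly proportional to R0; a negative index (as for d here) means it varies inversely.

```python
import sympy as sp

# Symbols: transmission rate, recruitment, progression, recovery, natural and TB-induced death.
beta, pi, alpha, gamma, mu, d = sp.symbols("beta pi alpha gamma mu d", positive=True)

# Assumed generic SEIR-type basic reproduction number (not the paper's exact R0).
R0 = beta * pi * alpha / (mu * (alpha + mu) * (gamma + mu + d))

params = {beta: 0.4, pi: 50, alpha: 0.1, gamma: 0.05, mu: 0.02, d: 0.1}
for p in (beta, pi, alpha, gamma, mu, d):
    idx = sp.diff(R0, p) * p / R0          # normalised forward sensitivity index
    print(p, float(idx.subs(params)))
```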
10.1101/2022.05.02.22274579 | Medical journeys of victims of exploitation and trafficking: a retrospective cohort study using a large multi-state dataset and new ICD-10 codes | BackgroundTrafficking and exploitation for sex or labor affect millions of persons worldwide, but their use of the healthcare system has not been characterized systematically. In 2018, new ICD-10 codes were implemented in the US for the diagnosis of patients who were victims of trafficking or exploitation, but to date no comprehensive study has reported on their medical journeys. Our objective was to examine the extent to which the newly implemented ICD-10 codes pertaining to human trafficking were being used in the healthcare system, and to characterize the care received by the victims.
Methods and FindingsWe searched the database of a large US health insurer containing approximately 43 million patients holding commercial or government-provided health insurance across the United States (US). For the dates 2018-09-01 to 2022-03-01, the dataset contained 3,967 instances of the ICD-10 codes. In 31% of the cases, the codes were used as the principal or admitting diagnosis. Regression analysis of code use found a 2% yearly increase in the adoption of these codes. The codes were used by 1,532 distinct providers, the majority in non-hospital settings.
The data contained 2,301 patients, of which 1,025 were treated for recent exploitation or trafficking, corresponding to the identification of 13 new patients each week. Of the latter patients, 90.1% were victims of sexual exploitation and 9.9% were victims of labor exploitation. The patients were predominantly female (81%) and insured by Medicaid (60%), and the median age was 20 (interquartile range: 15-35). Healthcare utilization over the 12 months after a diagnosis of trafficking was $32,192 (standard deviation: $160,730), much greater than the average for Medicaid patients ($6,556 per year in 2019). The victims were characterized by a high prevalence of mental health conditions and high utilization of the emergency department, and these persisted after the diagnosis of exploitation. Surprisingly, 57% of the victims were first diagnosed outside hospitals, primarily during office and psychiatric visits, and only 25% were first diagnosed during an emergency department visit.
ConclusionsThe new ICD-10 codes provide valuable insights into the profiles and medical journeys of victims of trafficking. However, adoption of the codes is not widespread, increasing by only 2% per year, and the majority of victims are likely still not diagnosed using the new codes. Greater adoption, when consistent with patient safety, may improve care coordination and outcomes. | pediatrics
10.1101/2022.05.02.22274560 | The impact of post-traumatic stress disorder in pharmacological intervention outcomes for adults with bipolar disorder: a protocol for a systematic review and meta-analysis. | BackgroundRecent data indicate a high prevalence of post-traumatic stress disorder (PTSD) in bipolar disorder (BD). PTSD may play a role in poor treatment outcomes and quality of life for people with BD. Despite this, few studies have examined pharmacological treatment interventions and outcomes for this comorbidity. This systematic review will bring together the currently available evidence regarding the impact of comorbid PTSD on pharmacological treatment outcomes in adults with BD.
MethodsA systematic search of Embase, MEDLINE Complete, PsycINFO, and the Cochrane Central Register of Controlled Trials (CENTRAL) will be conducted to identify randomised and non-randomised studies of pharmacological interventions for adults with diagnosed bipolar disorder and PTSD. Data will be screened and extracted by two independent reviewers. Literature will be searched from the creation of the databases until April 1, 2021. Risk of bias will be assessed using the Newcastle-Ottawa Scale and the Cochrane Collaboration's Risk of Bias tool. A meta-analysis will be conducted if sufficient evidence is identified in the systematic review. The meta-analysis will employ a random-effects model, and heterogeneity will be evaluated using the I2 statistic.
DiscussionThis review and meta-analysis will be the first to systematically explore and integrate the available evidence on the impact of PTSD on pharmacological treatments and outcomes in those with BD. The results of this systematic review will provide directions for future research and will be published in relevant scientific journals and presented at research conferences.
Systematic review registrationThe protocol has been registered at the International Prospective Register of Systematic Reviews (PROSPERO; registration number: CRD42020182540). | psychiatry and clinical psychology |
10.1101/2022.05.02.22274572 | Bridging Structural MRI with Cognitive Function for Individual Level Classification of Early Psychosis via Deep Learning | Recent efforts have been made to apply machine learning and deep learning approaches to the automated classification of schizophrenia using structural magnetic resonance imaging (sMRI) at the individual level. However, these approaches are less accurate for early psychosis (EP), since structural brain changes are mild at the early stage. As cognitive impairment is a core feature of psychosis, in this study we applied a multi-task deep learning framework using sMRI, with the inclusion of cognitive assessment, to facilitate the classification of EP patients versus healthy individuals. Unlike previous studies, we used sMRI as the direct input to perform EP classification and cognitive estimation. The proposed model does not require time-consuming volumetric or surface-based analysis and can additionally provide cognition predictions. Extensive experiments were conducted on an sMRI data set with a total of 77 subjects (38 EP patients and 39 healthy controls), and we achieved 74.9±4.3% five-fold cross-validated accuracy and an area under the curve of 71.1±4.1% for EP classification with the inclusion of cognitive estimations. We reveal the feasibility of automated cognitive estimation from sMRI using deep learning models, and also demonstrate the implicit adoption of cognitive measures as additional information to facilitate EP classification from healthy controls. | psychiatry and clinical psychology
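A minimal sketch of the multi-task idea described above: a shared CNN backbone over sMRI volumes feeding a diagnosis classifier and a cognitive-score regressor trained with a combined loss. The architecture, input size, and loss weighting are assumptions, not the authors' model.

```python
import torch
import torch.nn as nn

class MultiTaskEP(nn.Module):
    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(                # shared 3D feature extractor
            nn.Conv3d(1, 8, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv3d(8, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d(1), nn.Flatten())
        self.cls_head = nn.Linear(16, 2)              # EP vs healthy control
        self.cog_head = nn.Linear(16, 1)              # estimated cognitive score

    def forward(self, x):
        z = self.backbone(x)
        return self.cls_head(z), self.cog_head(z)

model = MultiTaskEP()
x = torch.randn(4, 1, 64, 64, 64)                     # batch of 4 sMRI volumes (assumed size)
logits, cog = model(x)
y_cls, y_cog = torch.randint(0, 2, (4,)), torch.randn(4, 1)
# Combined loss: classification plus weighted cognitive regression (weight assumed).
loss = nn.CrossEntropyLoss()(logits, y_cls) + 0.5 * nn.MSELoss()(cog, y_cog)
loss.backward()
```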
10.1101/2022.04.30.22274532 | Comparative analysis of the outcomes of COVID-19 between patients infected with SARS-CoV-2 Omicron and Delta variants: a retrospective cohort study | BackgroundThe SARS-CoV-2 Omicron variant has replaced the previously dominant Delta variant because of its high transmissibility. It is responsible for the current increase in COVID-19 infectivity worldwide. However, studies on the impact of the Omicron variant on the severity of COVID-19 are still limited in developing countries. Here, we compared the outcomes of patients infected with SARS-CoV-2 Omicron and Delta variants and their association with prognostic factors, including age, sex, comorbidities, and smoking.
MethodsWe involved 352 patients: 139 with the Omicron variant and 213 with the Delta variant. Whole-genome sequencing of SARS-CoV-2 was conducted using the Illumina MiSeq next-generation sequencer.
ResultsThe Ct value and mean age of COVID-19 patients were not significantly different between the two groups (Delta: 20.35 ± 4.07 vs. Omicron: 20.62 ± 3.75; p=0.540; and Delta: 36.52 ± 21.24 vs. Omicron: 39.10 ± 21.24; p=0.266, respectively). Patients infected with Omicron and Delta variants showed similar hospitalization (p=0.433) and mortality rates (p=0.565). Multivariate analysis showed that older age (≥65 years) carried a higher risk for hospitalization (OR=3.67 [95% CI=1.22-10.94]; p=0.019) and fatality (OR=3.93 [95% CI=1.35-11.42]; p=0.012). In addition, patients with cardiovascular disease had a higher risk for hospitalization (OR=5.27 [95% CI=1.07-25.97]; p=0.041), whereas patients with diabetes had a higher risk for fatality (OR=9.39 [95% CI=3.30-26.72]; p<0.001).
ConclusionsOur study shows that patients infected with Omicron and Delta variants have similar clinical outcomes, including hospitalization and mortality. In addition, our findings further confirm that older age, cardiovascular disease, and diabetes are strong prognostic factors for the outcomes of COVID-19 patients. | public and global health
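A minimal sketch of the multivariable logistic regression behind the adjusted odds ratios above; the covariates mirror the named prognostic factors, but the data are simulated and the variable names are illustrative.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 352
df = pd.DataFrame({
    "hosp": rng.integers(0, 2, n),       # hospitalized (simulated outcome)
    "age65": rng.integers(0, 2, n),      # 1 = age >= 65 years
    "cvd": rng.integers(0, 2, n),        # cardiovascular disease
    "diabetes": rng.integers(0, 2, n),
    "omicron": rng.integers(0, 2, n),    # variant indicator
})

fit = smf.logit("hosp ~ age65 + cvd + diabetes + omicron", data=df).fit(disp=False)
print(np.exp(fit.params))                # adjusted odds ratios
print(np.exp(fit.conf_int()))            # 95% CIs on the OR scale
```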
10.1101/2022.05.02.22274382 | The efficacy of soap against schistosome cercariae: A systematic review | BackgroundSchistosomiasis is a parasitic disease that is endemic in 78 countries and affects almost 240 million people worldwide. It has been acknowledged that an integrated approach going beyond drug treatment is needed to achieve control and eventual elimination of the disease. Improving hygiene has been encouraged by the World Health Assembly, and one aspect of good hygiene is using soap during water-contact activities such as bathing and doing laundry. This hygiene practice might directly reduce skin exposure to cercariae at transmission sites. A qualitative systematic review was carried out to investigate the efficacy of soap against schistosome cercariae and to identify the knowledge gaps surrounding this topic.
MethodologySix online databases were searched between the 5th and 8th of July 2021. Records returned from these databases were screened to remove duplicates, and the remaining records were classified by reading titles, abstracts, and full texts to identify the included studies. The results were categorised into two groups based on the two different protective mechanisms of soap (namely, damage to cercariae and protection of skin).
ConclusionsLimited research has been conducted on the efficacy of soap against schistosome cercariae, and only 11 studies met the criteria to be included in this review. The review demonstrates that soap has the potential to protect people against schistosome cercariae through two protective aspects: (1) soap affects cercariae adversely; (2) soap on the skin prevents cercariae from penetrating the skin. Both aspects of protection were influenced by many factors, but differences in the reported experimental conditions, such as the cercarial endpoint measurement used and the number of cercariae used per water sample, lead to low comparability between previous studies. This review demonstrates that more evidence is needed to inform hygiene advice for people living in schistosomiasis-endemic areas.
Author summarySchistosomiasis affects millions of people living in low- and middle-income countries lacking safe access to water, sanitation and hygiene (WASH). Schistosomiasis is mainly controlled by drug treatment with praziquantel, but people become infected again if they continue to come into contact with cercaria-contaminated water. Despite all the efforts that have been made, this disease remains a major public health problem in disease-endemic regions. The use of soap during water contact, as part of good hygiene practice, may be an effective complementary control method as it might reduce the penetration of schistosome cercariae into the skin, thereby reducing the likelihood of reinfection. We conducted a systematic review to summarise previous research into the efficacy of soap against schistosome cercariae, and to identify current knowledge gaps to inform future studies. The information gathered from this review can guide hygiene advice regarding soap use in endemic areas to reduce the spread of schistosomiasis. | public and global health |
10.1101/2022.04.29.22273770 | The impact of relationship to cancer patient and extent of caregiving on adherence to cancer screening in a diverse inner-city community | IntroductionThe number of individuals called upon to provide support for cancer patients within their personal networks is steadily increasing. Prior studies show that caregiver screening rates either decrease, due to caregiving demands and associated fatalism, or increase, due to risk perception and healthcare involvement. However, there remains a gap in research in understanding how cancer screening relates to the extent of caregiving and the relationship to the cancer patient, particularly when the caregiver is not a family member or spouse. This study aims to assess the impact of degree of relationship to the cancer patient(s) and extent of caregiving on adherence to breast, cervical, prostate, and colorectal cancer screening guidelines.
MethodsParticipants from the Bronx, New York, were recruited online or through community events to complete a set of core items adapted from the NCI Health Information National Trends Survey and other sources. Logistic regression analyses identified factors associated with variation in screening.
ResultsAnalyses were based on 1,430 participants (73% female, mean age 50 years, 43% Hispanic). An unexpectedly large proportion had cancer within their families (72%) and/or provided some support to a cancer patient (79%). Four support patterns were found: none, emotional support only, less intensive support, and more intensive support. Women who provided emotional support only were less likely to be screened for breast and cervical cancer. Among more active caregivers, cervical and prostate cancer screening rates were greater among those who provided more intensive support.
ConclusionsHaving family or friends with cancer is a normative experience that affects one's own preventive care. Those with a family member or spouse with cancer have increased personal screening. As involvement in cancer caregiving deepens, one's own screening rises. Caregivers who provided only emotional support are the exception, potentially because they have greater distance from the cancer experience. Findings suggest the potential value of tailoring interventions to address personal experiences with cancer caregiving. | public and global health
10.1101/2022.05.02.22274561 | Artificial Intelligence for context-aware surgical guidance in complex robot-assisted oncological procedures: an exploratory feasibility study | BackgroundComplex oncological procedures pose various surgical challenges, including dissection in distinct tissue planes and preservation of vulnerable anatomical structures throughout different surgical phases. In rectal surgery, violation of dissection planes increases the risk of local recurrence and autonomic nerve damage resulting in incontinence and sexual dysfunction. While deep learning-based identification of target structures has been described in basic laparoscopic procedures, the feasibility of artificial intelligence-based guidance has not yet been investigated in complex abdominal surgery.
MethodsA dataset of 57 robot-assisted rectal resection (RARR) videos was split into a pre-training dataset of 24 temporally non-annotated videos and a training dataset of 33 temporally annotated videos. Based on phase annotations and pixel-wise annotations of randomly selected image frames, convolutional neural networks were trained to distinguish surgical phases and to phase-specifically segment anatomical structures and tissue planes. To evaluate model performance, the F1 score, Intersection-over-Union (IoU), precision, recall, and specificity were determined.
ResultsWe demonstrate that both temporal (average F1 score for surgical phase recognition: 0.78) and spatial features of complex surgeries can be identified using machine learning-based image analysis. Based on an analysis of a total of 8,797 images with pixel-wise target structure segmentations, mean IoUs for segmentation of anatomical target structures ranged from 0.09 to 0.82, and from 0.05 to 0.32 for dissection planes and dissection lines, throughout the different phases of RARR.
ConclusionsImage-based recognition is a promising technique for surgical guidance in complex surgical procedures. Future research should investigate the clinical applicability, usability, and therapeutic impact of such a guidance system. | surgery
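The segmentation metrics reported above (IoU, F1) can be computed directly from binary masks; below is a minimal sketch using random stand-in masks rather than annotated surgical frames.

```python
import numpy as np

def iou_f1(pred, truth):
    """Intersection-over-Union and F1 (Dice) for a pair of binary masks."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    inter = np.logical_and(pred, truth).sum()
    union = np.logical_or(pred, truth).sum()
    iou = inter / union if union else 1.0
    denom = pred.sum() + truth.sum()
    f1 = 2 * inter / denom if denom else 1.0
    return iou, f1

rng = np.random.default_rng(4)
pred = rng.random((512, 512)) > 0.6        # predicted dissection-plane mask (random)
truth = rng.random((512, 512)) > 0.6       # annotated ground-truth mask (random)
print("IoU = %.3f, F1 = %.3f" % iou_f1(pred, truth))
```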
10.1101/2022.05.02.22274438 | Decoding natural gait cycle in Parkinson's disease from cortico-subthalamic field potentials | Human bipedal walking is a complex motor behavior that requires precisely timed alternating activity across multiple nodes of the supraspinal network. However, understanding of the neural dynamics that underlie walking is limited. We investigated the cortico-subthalamic circuit dynamics of overground walking in three patients with Parkinson's disease without major gait impairments. All patients were implanted with chronic bilateral deep brain stimulation leads in the subthalamic nucleus (STN) and electrocorticography paddles overlying the primary motor (M1) and sensory (S1) cortices. Local field potentials were wirelessly streamed through implanted bidirectional pulse generators during overground walking and synchronized to external gait kinematics sensors. We found that the STN displays increased low-frequency (4-12 Hz) spectral power between ipsilateral heel strike and contralateral leg swing. Furthermore, the STN shows increased theta-frequency (4-8 Hz) coherence with M1 through the initiation and early phase of contralateral leg swing. Our findings support the hypothesis that oscillations from the basal ganglia and cortex alternate out of phase between brain hemispheres in accordance with the gait cycle. In addition, we identified patient-specific, gait-related biomarkers in both STN and cortical areas at discrete frequency bands. These field potentials support classification of left and right gait events. These putative biomarkers of the gait cycle may eventually be used as control signals to drive adaptive DBS to further improve gait dysfunction in patients with Parkinson's disease. | neurology
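The theta-band STN-M1 coupling reported above is typically quantified as magnitude-squared coherence; below is a minimal sketch on synthetic signals sharing a 6 Hz component. The sampling rate and signal model are assumptions, not the study's recording parameters.

```python
import numpy as np
from scipy.signal import coherence

fs = 250                                    # Hz, assumed LFP sampling rate
t = np.arange(0, 30, 1 / fs)                # 30 s of walking (synthetic)
rng = np.random.default_rng(5)
shared = np.sin(2 * np.pi * 6 * t)          # common 6 Hz theta drive
m1 = shared + rng.normal(0, 1, t.size)      # synthetic M1 field potential
stn = 0.8 * shared + rng.normal(0, 1, t.size)  # synthetic STN field potential

f, cxy = coherence(m1, stn, fs=fs, nperseg=fs * 2)
theta = (f >= 4) & (f <= 8)                 # theta band of interest
print(f"mean theta coherence: {cxy[theta].mean():.2f}")
```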
10.1101/2022.05.01.22274504 | Relationship between common polymorphisms of NRAMP1 gene and pulmonary tuberculosis in Lorestan LUR population | IntroductionTuberculosis (TB) is caused by Mycobacterium tuberculosis. In humans, a number of genes have been associated with susceptibility to pulmonary tuberculosis. The relationship between NRAMP1 polymorphisms and pulmonary tuberculosis has been studied in different populations, with contradictory results. The aim of this study was to investigate the relationship between common polymorphisms of the NRAMP1 gene and susceptibility to pulmonary tuberculosis in the LUR population of Lorestan province of Iran.
Materials and MethodsIn this case-control study, three common polymorphisms of the NRAMP1 gene (3'UTR, INT4 and D543N) were genotyped using the PCR-RFLP technique in the LUR population of Lorestan province. In this study, 100 patients with pulmonary tuberculosis (PTB) served as the case group and 100 healthy controls, matched for age and sex with the patient group, served as the control group. Statistical analysis was performed using SPSS 18 software.
ResultsIn the present study, we observed that the GG genotype of the D543N polymorphism was statistically significantly associated with increased susceptibility to TB (84% in the case group vs. 72% in the control group, 95% CI=1.024-4.071, OR=2.042, P=0.0405). The G allele of the D543N polymorphism was also statistically significantly associated with increased susceptibility to pulmonary tuberculosis (90% in the case group vs. 81.5% in the control group, 95% CI=1.140-3.663, OR=2.043, P=0.015). On the other hand, the frequency of the A allele of the D543N polymorphism was significantly lower in patients than in the control group (10% in the case group vs. 18.5% in the control group, 95% CI=0.273-0.878, OR=0.489, P=0.015). In contrast, the genotypic and allelic frequencies of the 3'UTR and INT4 polymorphisms showed no significant differences between patients and controls in the study population.
Discussion and conclusionOur observations showed that the GG genotype and G allele of the D543N polymorphism play a significant role in increasing susceptibility to pulmonary tuberculosis in the LUR population of Lorestan province, while the A allele of the D543N polymorphism has a significant association with resistance to pulmonary tuberculosis in this population. There was no significant correlation between the genotypes and alleles of the 3'UTR and INT4 polymorphisms and susceptibility or resistance to pulmonary TB in this population. We suggest that a larger sample size be used in future studies, and that this type of study also be conducted in other ethnicities. | genetic and genomic medicine
10.1101/2022.05.02.22274586 | Describing COVID-19 Patients During The First Two Months of Paxlovid (Nirmatrelvir/Ritonavir) Initiation in a Large HMO in Israel | ObjectiveThe objective of this feasibility study was to assess the number of patients that could be included in a future Real World Evidence study, which would be designed to explore the impact of Paxlovid (nirmatrelvir/ritonavir) on patient outcomes and healthcare resource utilization (HCRU). We also intend to assess the comparability of the patients who were treated with Paxlovid versus patients who did not receive the treatment, either because they declined any COVID-19 treatment or were diagnosed with COVID-19 prior to Paxlovid availability.
MethodsThis retrospective observational secondary data study used data from the Maccabi Healthcare Services database during the identification period of June 1, 2021, to February 28, 2022. The study population included patients with at least one positive SARS-CoV-2 RT-PCR test, or a formal rapid antigen test for SARS-CoV-2, during the identification period, the date of which also served as the COVID-19 diagnosis date. We then divided the study population into the following cohorts: Pre-Paxlovid Time Period and Paxlovid Time Period, which was further split into Paxlovid Treated and Paxlovid Untreated.
ResultsApplication of the inclusion and exclusion criteria to the study population yielded 20,284 patients in the Pre-Paxlovid Time Period cohort and 5,542 in the Paxlovid Time Period cohort who were eligible to receive Paxlovid. This resulted in 3,714 patients in the Paxlovid Treated and 1,810 in the Paxlovid Untreated cohorts.
ConclusionsThis RWE feasibility study of patients with a positive test for COVID-19 between June 1, 2021 to February 28, 2022 illustrates potential comparability between cohorts, as described by their demographics and characteristics. | infectious diseases |
10.1101/2022.05.02.22273921 | Clinical implementation of routine whole-genome sequencing for hospital infection control of multi-drug resistant pathogens | BackgroundProspective whole-genome sequencing (WGS)-based surveillance may be the optimal approach to rapidly identify transmission of multi-drug resistant (MDR) bacteria in the healthcare setting.
Materials/methodsWe prospectively collected methicillin-resistant Staphylococcus aureus (MRSA), vancomycin-resistant enterococci (VRE), carbapenem-resistant Acinetobacter baumannii (CRAB), and extended-spectrum beta-lactamase-producing (ESBL-E) and carbapenemase-producing Enterobacterales (CPE) isolated from blood cultures, sterile sites or screening specimens across three large tertiary referral hospitals (2 adult, 1 paediatric) in Brisbane, Australia. WGS was used to determine in silico multi-locus sequence typing (MLST) and resistance gene profiling via a bespoke genomic analysis pipeline. Putative transmission events were identified by comparison of core genome single nucleotide polymorphisms (SNPs). Relevant clinical meta-data were combined with genomic analyses via customised automation and collated into hospital-specific reports regularly distributed to infection control teams.
ResultsOver four years (April 2017 to July 2021), 2,660 isolates were sequenced. These included MDR gram-negative bacilli (n=293 CPE, n=1,309 ESBL), MRSA (n=620) and VRE (n=433). A total of 379 clinical reports were issued. Core genome SNP data identified that 33% of isolates formed 76 distinct clusters. Of the 76 clusters, 43 were contained within the three target hospitals, suggesting ongoing transmission within the clinical environment. The remaining 33 clusters represented possible inter-hospital transmission events or strains circulating in the community. In one hospital, demonstration of negligible transmission of non-multi-resistant MRSA enabled changes to infection control policy.
ConclusionsImplementation of routine WGS for MDR pathogens in clinical laboratories is feasible and can enable targeted infection prevention and control interventions.
SummaryWe initiated a program of routine sequencing of multi-drug resistant organisms. A custom analysis pipeline was used to automate reporting by incorporating clinical meta-data with genomics to define clusters and support infection control interventions. | infectious diseases |
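As a rough illustration of the cluster-detection step described in the record above, core-genome SNP distances can be grouped by single-linkage clustering under a distance cutoff. The sketch below is an assumption-laden stand-in for the study's bespoke pipeline: the 20-SNP threshold and the use of SciPy single-linkage clustering are illustrative choices, not the study's parameters.

```python
# Hypothetical sketch: putative transmission clusters from a pairwise
# core-genome SNP distance matrix (threshold and method are assumptions).
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def snp_clusters(dist: np.ndarray, threshold: int = 20) -> np.ndarray:
    """Group isolates joined by chains of SNP distances <= threshold."""
    condensed = squareform(dist, checks=False)  # condensed pairwise distances
    tree = linkage(condensed, method="single")  # single linkage: any link joins
    return fcluster(tree, t=threshold, criterion="distance")

# Example: isolates 0 and 1 (5 SNPs apart) cluster; isolate 2 is unrelated.
d = np.array([[0, 5, 400], [5, 0, 395], [400, 395, 0]])
print(snp_clusters(d))  # e.g. [1 1 2]
```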
10.1101/2022.05.02.22274575 | SOFA score performs worse than age for predicting mortality in patients with COVID-19 | The Sequential Organ Failure Assessment (SOFA) score, originally developed to describe disease morbidity, is commonly used to predict in-hospital mortality. During the COVID-19 pandemic, many protocols for crisis standards of care used the SOFA score to select patients to be deprioritized due to a low likelihood of survival. A prior study found that age outperformed the SOFA score for mortality prediction in patients with COVID-19, but it was limited to a small cohort of intensive care unit (ICU) patients and did not address whether its findings were unique to patients with COVID-19. Moreover, it is not known how well these measures perform across races.
In this retrospective study, we compare the performance of age and SOFA scores in predicting in-hospital mortality across two cohorts: a cohort of 2,648 consecutive adult patients diagnosed with COVID-19 who were admitted to a large academic health system in the northeastern United States over a 4-month period in 2020 and a cohort of 75,601 patients admitted to one of 335 ICUs in the eICU database between 2014 and 2015.
Among the COVID-19 cohort, age (area under the receiver-operating characteristic curve (AU-ROC) 0.795, 95% CI 0.762, 0.828) had significantly better discrimination than the SOFA score (AU-ROC 0.679, 95% CI 0.638, 0.721) for mortality prediction. Conversely, age (AU-ROC 0.628, 95% CI 0.608, 0.628) underperformed compared to the SOFA score (AU-ROC 0.735, 95% CI 0.726, 0.745) in non-COVID-19 ICU patients in the eICU database. There was no difference between Black and White COVID-19 patients in the performance of either age or SOFA score. Our findings bring into question the utility of SOFA score-based resource allocation in COVID-19 crisis standards of care.
Author SummaryThe COVID-19 pandemic has prompted hospitals to develop protocols for allocating resources if the number of patients exceeds their capacity, in order to save as many lives as possible. Many of these protocols use the Sequential Organ Failure Assessment (SOFA) score to identify patients who are unlikely to survive and thus should be deprioritized for care. There are concerns that the SOFA score may not accurately predict mortality in patients with COVID-19 or may perform better in one racial group than another. We asked whether a simple measure, patient age, could better predict mortality than the SOFA score in a group of adult patients admitted to a large academic health system in 2020. To see if any findings are unique to patients with COVID-19, we performed the same analysis in a group of adult patients taken from the eICU database, a large publicly available dataset that was collected prior to the COVID-19 pandemic. We found that age was better than the SOFA score at predicting patient mortality in patients with COVID-19, but not in patients without COVID-19. For COVID-19, neither age nor SOFA score performed better in one racial group than another. Caution is needed when applying an established disease severity index model to a new illness. | intensive care and critical care medicine
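The discrimination comparison reported in the record above rests on the area under the receiver-operating characteristic curve for each univariate predictor. A minimal sketch of such a comparison follows, with a bootstrap confidence interval; the variable names (`age`, `sofa`, `mortality`), the bootstrap settings, and the CI method are illustrative assumptions, not the study's code.

```python
# AUROC with a percentile bootstrap CI for a single continuous predictor.
import numpy as np
from sklearn.metrics import roc_auc_score

def auc_with_ci(score, died, n_boot=2000, seed=0):
    rng = np.random.default_rng(seed)
    score, died = np.asarray(score), np.asarray(died)
    point = roc_auc_score(died, score)
    boots = []
    for _ in range(n_boot):
        idx = rng.integers(0, len(died), len(died))
        if died[idx].min() == died[idx].max():
            continue  # resample lacked both outcome classes; skip it
        boots.append(roc_auc_score(died[idx], score[idx]))
    lo, hi = np.percentile(boots, [2.5, 97.5])
    return point, lo, hi

# Usage: compare auc_with_ci(age, mortality) against auc_with_ci(sofa, mortality).
```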
10.1101/2022.05.01.22274547 | Raising the bar for patient experience during care transitions in Canada: A repeated cross-sectional survey evaluating a patient-oriented discharge summary at Ontario hospitals | BackgroundPatient experience when transitioning home from hospital is an important quality metric linked to improved patient outcomes. We evaluated the impact of a hospital-based care transition intervention, patient-oriented discharge summary (PODS), on patient experience across Ontario acute care hospitals.
MethodsWe used a repeated cross-sectional study design to compare yearly positive responses to four questions centered on discharge communication from the Canadian Patient Experience Survey (2016-2020) among three hospital cohorts with various levels of PODS implementation. Logistic regression using a binomial likelihood accounting for site-level clustering was used to assess continuous linear time trends among cohorts and cohort differences during the post-implementation period. This research had oversight from a public advisory group of patient and caregiver partners from across the province.
Results512,288 responses were included, with mean age 69 ± 14 years (females) and 61 ± 20 years (males). Compared to non-implementation hospitals, hospitals with full implementation (>50% of discharges) reported higher odds of having discussed the help needed when leaving hospital (OR=1.18, 95% CI=1.02-1.37) and of having received information in writing about what symptoms to look out for (OR=1.44, 95% CI=1.17-1.78) post-implementation. The linear time trend was also significant when comparing hospitals with full versus no implementation for having received information in writing about what symptoms to look out for (OR=1.05, 95% CI=1.01-1.09).
InterpretationPODS implementation was associated with higher odds of positive patient experience, particularly for discharge planning. Further efforts should center on discharge management, specifically patients' understanding of their medications and of what to do if worried once home. | health systems and quality improvement
10.1101/2022.05.01.22274548 | Immunogenicity and reactogenicity of a third dose of BNT162b2 vaccine for COVID-19 after a primary regimen with BBIBP-CorV or BNT162b2 vaccines in Lima, Peru. | BackgroundThe administration of a third (booster) dose of COVID-19 vaccines in Peru initially employed the BNT162b2 (Pfizer) mRNA vaccine. The national vaccination program started with healthcare workers (HCW) who received BBIBP-CorV (Sinopharm) vaccine as primary regimen and elderly people previously immunized with BNT162b2. This study evaluated the reactogenicity and immunogenicity of the "booster" dose in these two groups in Lima, Peru.
MethodsWe conducted a prospective cohort study, recruiting participants from November to December 2021 in Lima, Peru. We evaluated immunogenicity and reactogenicity in HCW and elderly patients previously vaccinated with either two doses of BBIBP-CorV (heterologous regimen) or BNT162b2 (homologous regimen). Immunogenicity was measured by anti-SARS-CoV-2 IgG antibody levels immediately before the boosting dose and 14 days later. IgG geometric means (GM) and medians were obtained and modeled using ANCOVA and quantile regressions.
ResultsThe GM of IgG levels increased significantly after boosting: from 28.5 ± 5.0 AU/mL up to 486.6 ± 1.2 AU/mL (p<0.001), which corresponds to a 17-fold increase. The heterologous vaccine regimen produced a higher GM of post-booster anti-SARS-CoV-2 IgG levels, eliciting a 13% increase in the geometric mean ratio (95% CI: 1.02-1.27) and a median difference of 92.3 AU/mL (95% CI: 24.9-159.7). Both regimens were safe and well tolerated. Previous COVID-19 infection was also associated with higher pre- and post-booster IgG GM levels.
ConclusionAlthough both boosting regimens were highly immunogenic, two doses of BBIBP-CorV boosted with BNT162b2 produced a stronger IgG antibody response than the homologous BNT162b2 regimen in the Peruvian population. Additionally, both regimens were mildly reactogenic and well tolerated. | infectious diseases
10.1101/2022.05.02.22274007 | Impact of shaggy aorta on intraoperative cerebral embolism during carotid artery stenting | BackgroundCareful selection of patients for carotid stenting is necessary. We suggest that patients with shaggy aorta syndrome may have a higher risk of perioperative embolic complications.
MethodsThe study is a subanalysis of the SIBERIA Trial. We included 72 patients undergoing transfemoral carotid artery stenting. All patients underwent diffusion-weighted MRI (DWI) and a clinical neurological examination two days before and on the second and 30th days after the intervention.
Results46 patients had shaggy aorta syndrome. Intraoperative embolisms were recorded in 82.6% and 46.1% of patients with and without shaggy aorta syndrome, respectively (p=0.001). New asymptomatic ischemic brain lesions in the postoperative period occurred in 78.3% and 26.9% of patients with and without shaggy aorta syndrome, respectively (p<0.001). Three (6.5%) cases of stroke within 30 days after the procedure were observed, all in patients with shaggy aorta syndrome. Shaggy aorta syndrome (OR 5.54 [1.83-16.7], p=0.001) and aortic arch ulceration (OR 6.67 [1.19-37.3], p=0.02) were independently associated with cerebral embolism. Shaggy aorta syndrome (OR 9.77 [3.14-30.37], p<0.001) and aortic arch ulceration (OR 12.9 [2.3-72.8], p=0.003) were independently associated with ipsilateral new asymptomatic ischemic brain lesions.
ConclusionsShaggy aorta syndrome and aortic arch ulceration significantly increase the odds of intraoperative embolism and new asymptomatic ischemic brain lesions. Carotid endarterectomy or transcervical carotid stenting should be selected in the presence of aortic ulceration. | cardiovascular medicine
10.1101/2022.05.02.22274478 | An Early Return-to-Work Program for COVID-19 Close Contacts in Healthcare During the Omicron Wave in Japan | During the coronavirus disease 2019 (COVID-19) pandemic, maintaining adequate staffing in healthcare facilities is important to provide a safe work environment for healthcare workers (HCWs). Japan's early return-to-work (RTW) program may be a rational strategy at a time when there is an increased demand for the services of HCWs. We assessed whether the early RTW program for HCWs who had been in close contact with a COVID-19 case in our hospital was justified. Close contacts were identified according to the guidance of the World Health Organization. Between January and March 2022, 256 HCWs were identified as close contacts (median age, 35 years; 192 female). Thirty-seven (14%) secondary attack cases of COVID-19 were detected. Among 141 HCWs who applied to the early RTW program, nurses and doctors comprised about three-quarters of participants, with a higher participation rate among doctors (78%) than nurses (59%). Eighteen HCWs tested positive for COVID-19 by the sixth day after starting the early RTW program. No COVID-19 infection clusters were reported during the observation period. These findings suggest that the early RTW program for COVID-19 close contacts was a reasonable strategy for HCWs during the Omicron wave. | infectious diseases
10.1101/2022.05.03.22274459 | The global randomization test: A Mendelian randomization falsification test for the exclusion restriction assumption | Mendelian randomization may give biased causal estimates if the instrument affects the outcome not solely via the exposure of interest (violating the exclusion restriction assumption). We demonstrate use of a global randomization test as a falsification test for the exclusion restriction assumption. Using simulations, we explored the statistical power of the randomization test to detect an association between a genetic instrument and a covariate set due to a) selection bias or b) horizontal pleiotropy, compared to three approaches examining associations with individual covariates: i) Bonferroni correction for the number of covariates, ii) correction for the effective number of independent covariates, and iii) an r2 permutation-based approach. We conducted proof-of-principle analyses in UK Biobank, using CRP as the exposure and coronary heart disease (CHD) as the outcome. In simulations, the power of the randomization test was higher than that of the other approaches for detecting selection bias when the correlation between the covariates was low (R2 < 0.1), and at least as high across all simulated horizontal pleiotropy scenarios. In our applied example, we found strong evidence of selection bias using all approaches (e.g., global randomization test p<0.002). We identified 51 of the 58 CRP genetic variants as horizontally pleiotropic, and estimated effects of CRP on CHD attenuated somewhat towards the null when excluding these from the genetic risk score (OR=0.956 [95% CI: 0.918, 0.996] versus 0.970 [95% CI: 0.900, 1.046] per 1-unit higher log CRP levels). The global randomization test can be a useful addition to the MR researcher's toolkit. | epidemiology
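The underlying idea of the global randomization test described above can be illustrated with a simple permutation scheme: the joint association between a genetic instrument and a covariate set is summarized by a single statistic, whose null distribution is obtained by permuting the instrument across individuals. The sketch below uses the R2 from a multivariable regression as that statistic; this is a conceptual illustration under stated assumptions, not the authors' exact procedure, and the function name is hypothetical.

```python
# Permutation-based global test of association between a genetic risk
# score (GRS) and a covariate matrix.
import numpy as np

def global_randomization_test(grs, covars, n_perm=5000, seed=0):
    """P-value for the joint GRS~covariates association via permutation."""
    rng = np.random.default_rng(seed)
    X = np.column_stack([np.ones(len(grs)), covars])  # intercept + covariates

    def r2(y):
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        return 1 - resid.var() / y.var()

    observed = r2(np.asarray(grs, dtype=float))
    # Permuting the GRS breaks any real link to the covariates,
    # giving the null distribution of the joint-association statistic.
    null = [r2(rng.permutation(np.asarray(grs, dtype=float)))
            for _ in range(n_perm)]
    return (1 + sum(s >= observed for s in null)) / (n_perm + 1)
```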
10.1101/2022.05.03.22274616 | Psychological interventions for adult PTSD: A network and pairwise meta-analysis of short and long-term efficacy, acceptability and trial quality | While dozens of randomized controlled trials (RCTs) have examined psychological interventions for adult posttraumatic stress disorder (PTSD), no network meta-analysis has comprehensively integrated their results for all interventions and both short- and long-term efficacy. We conducted systematic searches in bibliographical databases to identify RCTs comparing the efficacy (standardized mean differences in PTSD severity, SMDs) and acceptability (relative risk of all-cause dropout, RR) of trauma-focused cognitive behaviour therapy (TF-CBT), Eye Movement Desensitization and Reprocessing (EMDR), other trauma-focused psychological interventions (other-TF-PIs) and non-trauma-focused psychological interventions (non-TF-PIs) compared to each other or to passive or active control conditions. One hundred and fifty RCTs met inclusion criteria, comprising 11,282, 4,443 and 3,167 patients at post-treatment assessment, ≤5 months follow-up and >5 months follow-up, respectively. By far the most data exist for TF-CBT. We performed random effects network meta-analyses (efficacy) and pairwise meta-analyses (acceptability). All therapies produced large effects compared to passive control conditions (SMDs ≥ 0.80) at post-treatment. Compared to active control conditions, TF-CBT and EMDR were moderately more effective (SMDs ≥ 0.50 and < 0.80), and other-TF-PIs and non-TF-PIs were slightly more effective (SMDs ≥ 0.20 and < 0.50). Interventions did not differ in their short-term efficacy, except that TF-CBT was more effective than non-TF-PIs (SMD = 0.14). Results remained robust in sensitivity and outlier-adjusted analyses. Similar results were found for long-term efficacy. Interventions also did not differ in terms of their acceptability, except for TF-CBT being associated with a slightly increased risk of dropout compared to non-TF-PIs (RR=1.34; 95% CI: 1.05-1.70). Interventions with and without trauma focus appear effective and acceptable in the treatment of adult PTSD, with most certainty for TF-CBT, which, however, appears somewhat less acceptable than non-TF-PIs. | psychiatry and clinical psychology
10.1101/2022.05.03.22274534 | Characteristics of patients with positional OSA according to ethnicity and the identification of a novel phenotype - Lateral Positional Patients (Lateral PP): A MESA study | Study ObjectivesTo investigate the characteristics of Obstructive Sleep Apnea (OSA) positional patient (PP) phenotypes among different ethnic groups in the Multi-Ethnic Study of Atherosclerosis (MESA) dataset. Moreover, we hypothesized the existence of a new OSA PP phenotype we coined "Lateral PP", in whom the lateral apnea-hypopnea index (AHI) is at least double the supine AHI.
MethodsFrom 2,273 adults with sleep information, we analyzed data from 1,323 subjects who slept more than 4 hours and had at least 30 minutes of sleep in both the supine and the non-supine positions. Demographics and clinical information were compared across the different PP phenotypes and ethnic groups.
Results861 (65.1%) patients had OSA and 35 (4.1%) were Lateral PP. Lateral PP patients were mainly female (62.9%), obese (median body mass index 31.4), had mild to moderate OSA (94.3%), and were mostly non-Chinese American (97.1%). Among all OSA patients, 550 (63.9%) were Supine PP, and 17.7% had supine-isolated OSA (siOSA). Supine PP and Lateral PP were present in 73.1% and 1.0% of Chinese American, 61.0% and 3.4% of Hispanic, 68.3% and 4.7% of White-Caucasian, and 56.2% and 5.2% of Black-African American OSA patients, respectively.
ConclusionChinese Americans have the highest prevalence of Supine PP, whereas Black-African American patients tend towards less Supine PP and more Lateral PP. Lateral PP appears to be a novel OSA phenotype. However, it was found in only a small group of OSA patients and thus its presence should be further validated.
Brief SummaryWe studied the prevalence of obstructive sleep apnea (OSA) positional phenotypes among Black-African American, Caucasian, Hispanic, and Chinese American patients and described their demographic and polysomnographic characteristics. Despite similar levels of OSA severity in the four ethnic groups, Chinese Americans had a higher prevalence of Supine PP, whereas Black-African American patients were significantly less likely to be Supine PP, as compared to the other ethnic groups.
In addition, we identified a novel OSA PP phenotype we named "Lateral positional patients (Lateral PP)". These OSA patients had apnea and hypopnea events mainly in the lateral position. This was a small group of OSA patients (4.1%) who were mainly obese and female, had mild-moderate OSA, and were more prevalent among Black-African Americans. | respiratory medicine
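The phenotype definitions in the record above reduce to simple ratio rules on position-specific AHI values. The sketch below encodes them as a hypothetical classifier: the Lateral PP rule follows the definition given in the abstract (lateral AHI at least double the supine AHI), while the OSA cutoff (AHI ≥ 5) and the Supine PP rule (supine AHI at least double the non-supine AHI, the conventional Cartwright-style criterion) are assumptions not stated in the record.

```python
# Hypothetical rule-based classifier for positional OSA phenotypes.
def positional_phenotype(supine_ahi: float, lateral_ahi: float,
                         overall_ahi: float) -> str:
    if overall_ahi < 5:                # assumed OSA diagnostic cutoff
        return "no OSA"
    if supine_ahi >= 2 * lateral_ahi and supine_ahi > 0:
        return "Supine PP"             # assumed Cartwright-style rule
    if lateral_ahi >= 2 * supine_ahi and lateral_ahi > 0:
        return "Lateral PP"            # rule stated in the abstract
    return "non-positional OSA"
```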
10.1101/2022.05.03.22274524 | Kidney Transplant Recipients and Omicron: Outcomes, effect of vaccines and the efficacy and safety of novel treatments | We aimed to describe the outcomes of Omicron infection in kidney transplant recipients (KTR), compare the efficacy of therapeutic interventions administered in the community, and report the safety profile of molnupiravir.
From 142 KTRs diagnosed with COVID-19 infection after Omicron had become the dominant variant in the UK, 116 (78.9%) cases were diagnosed in the community: 47 received sotrovimab, 21 molnupiravir and 48 no treatment. Ten (20.8%) non-treated patients were hospitalised following infection, which was significantly higher than in the sotrovimab group (2.1%), p=0.0048, but not the molnupiravir-treated group (14.3%), p=0.47. The only admission following sotrovimab occurred in a patient infected with BA.2. One patient from each of the molnupiravir and no-treatment groups required ICU support, and both subsequently died, with one other death in the no-treatment group. No patient receiving sotrovimab died. 6/116 (5.2%) patients required dialysis following their diagnosis: 2 (9.5%) patients receiving molnupiravir and 4 (8.3%) receiving no treatment. This requirement was significantly higher in the molnupiravir group than among the sotrovimab-treated patients, none of whom required dialysis, p=0.035. Both molnupiravir-treated patients requiring dialysis had features of systemic thrombotic microangiopathy.
Post-vaccination serostatus was available in 110 patients. Seropositive patients were less likely to require hospital admission compared with seronegative patients, 6 (7.0%) and 6 (25.0%) respectively, p=0.023. Seropositive patients were also less likely to require dialysis therapy, p=0.016.
In conclusion, sotrovimab treatment in the community was associated with superior patient and transplant outcomes; its clinical efficacy against the BA.2 variant requires a rapid review. The treatment benefit of molnupiravir was not evident, and wider safety reporting in transplant patients is needed. | nephrology |
10.1101/2022.05.02.22274247 | Magnetic Resonance Imaging Radiomics-Based Model to Identify the Pathological Features and Lymph Node Metastasis in Rectal Cancer | BackgroundPathological features and lymph node staging play an important role in treatment decision-making. Yet, accurate preoperative prediction of pathological features and lymph node metastasis (LNM) is challenging.
ObjectiveWe aimed to investigate the value of MRI-based radiomics in predicting the pathological features and lymph node metastasis in rectal cancer.
MethodsIn this prospective study, a total of 37 patients with histologically confirmed rectal cancer who underwent pelvic 3.0T magnetic resonance imaging (MRI) were enrolled. MRI images of the primary tumor and the lymph nodes were matched node-to-node with the resected specimens and labeled accordingly. Correlation analysis, least absolute shrinkage and selection operator (LASSO) logistic regression (LR), and backward stepwise LR were mainly used for radiomics feature selection and modeling. Univariate and multivariate backward stepwise LR were used for preoperative clinical predictor selection and modeling.
ResultsA total of 487 lymph nodes, including 39 metastatic lymph nodes and 11 tumor deposits, were harvested from 37 patients. The texture features of the primary tumors successfully predicted tumor differentiation in the established model (area under the curve (AUC) = 0.798). Sixty-nine matched lymph nodes were randomly divided into a training cohort (n = 39) and a validation cohort (n = 30).
Three independent risk factors were obtained from 56 texture parameters closely related to LNM. A prediction model was then successfully developed, which provided AUC values of 0.846 and 0.733 in the training and test cohort, respectively. Further, tumor deposits produced a higher radiomics score (Rad-score) compared with LNM (P = 0.042).
ConclusionThe study provides two non-invasive and quantitative methods that preoperatively predict tumor differentiation and regional LNM, respectively, for rectal cancer. Ultimately, these may be valuable when developing treatment protocols for rectal cancer patients. | oncology
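The LASSO step named in the methods above shrinks uninformative texture coefficients to exactly zero, which is what makes it usable for feature selection. A minimal scikit-learn sketch of such a step follows; the inputs `X_train`/`y_train`, the regularization grid, and the cross-validation settings are illustrative assumptions, not the authors' configuration.

```python
# L1-penalized (LASSO) logistic regression for radiomics feature selection.
import numpy as np
from sklearn.linear_model import LogisticRegressionCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

lasso = make_pipeline(
    StandardScaler(),  # texture features are on very different scales
    LogisticRegressionCV(penalty="l1", solver="liblinear",
                         Cs=np.logspace(-3, 2, 30), cv=5,
                         scoring="roc_auc"),
)
# lasso.fit(X_train, y_train)            # X: texture parameters, y: LNM status
# coef = lasso[-1].coef_.ravel()
# selected = np.flatnonzero(coef)        # features surviving the L1 penalty
```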
10.1101/2022.05.03.22274595 | Mendelian Randomization Analysis of the Relationship Between Native American Ancestry and Gallbladder Cancer Risk | BackgroundA strong association between the proportion of Native American ancestry and the risk of gallbladder cancer (GBC) has been reported in observational studies. Chileans show the highest incidence of GBC worldwide, and the Mapuche are the largest Native American people in Chile. We set out to investigate the causal association between Native American Mapuche ancestry and GBC risk, and the possible mediating effects of gallstone disease and body mass index (BMI) on this association.
MethodsMarkers of Mapuche ancestry were selected based on the informativeness for assignment measure and then used as instrumental variables in two-sample Mendelian randomization (MR) analyses and complementary sensitivity analyses.
ResultsWe found evidence of a causal effect of Mapuche ancestry on GBC risk (inverse variance-weighted (IVW) risk increase of 0.8% for every 1% increase in Mapuche ancestry proportion, 95% CI 0.4% to 1.2%, p = 6.6x10^-5). Mapuche ancestry was also causally linked to gallstone disease (IVW risk increase of 3.6% per 1% increase in Mapuche proportion, 95% CI 3.1% to 4.0%, p = 1.0x10^-59), suggesting a mediating effect of gallstones in the relationship between Mapuche ancestry and GBC. In contrast, the proportion of Mapuche ancestry showed a negative causal effect on BMI (IVW estimate -0.006 kg/m2 per 1% increase in Mapuche proportion, 95% CI -0.009 to -0.003, p = 4.4x10^-5).
ConclusionsThe results presented here may have significant implications for GBC prevention and are important for future admixture mapping studies. Given that the association between Mapuche ancestry and GBC risk previously noted in observational studies appears to be causal, primary and secondary prevention strategies that take into account the individual proportion of Mapuche ancestry could be particularly efficient. | genetic and genomic medicine |
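The IVW estimates quoted in the record above combine per-variant Wald ratios weighted by their inverse variance. A minimal fixed-effect version of that estimator is sketched below; the input names are illustrative, and real analyses typically add random-effects and pleiotropy-robust sensitivity estimators alongside it.

```python
# Fixed-effect inverse-variance-weighted (IVW) two-sample MR estimator.
import numpy as np

def ivw_estimate(beta_exp, beta_out, se_out):
    """Combine per-variant Wald ratios with weights 1/se^2."""
    beta_exp, beta_out, se_out = map(np.asarray, (beta_exp, beta_out, se_out))
    ratio = beta_out / beta_exp            # per-variant Wald ratio
    se_ratio = np.abs(se_out / beta_exp)   # first-order delta-method SE
    w = 1.0 / se_ratio**2
    est = np.sum(w * ratio) / np.sum(w)    # inverse-variance-weighted mean
    se = np.sqrt(1.0 / np.sum(w))
    return est, se
```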
10.1101/2022.05.02.22274487 | Genotyping and population structure of the China Kadoorie Biobank | China Kadoorie Biobank is a population-based prospective cohort of >512,000 adults recruited in 2004-2008 from 10 geographically diverse regions across China. Detailed data from questionnaires and physical measurements were collected at baseline, with additional measurements at three resurveys involving approximately 5% of surviving participants. Incident disease events are captured through electronic linkage to death and disease registries and to the national health insurance system. Genome-wide genotyping has been conducted for >100,000 participants using custom-designed Axiom(R) arrays. Analysis of these data reveals extensive relatedness within the CKB cohort, signatures of recent consanguinity, and principal component signatures reflecting large-scale population movements in recent Chinese history. In addition to numerous CKB studies of candidate drug targets and disease risk factors, CKB has made substantial contributions to many international genetics consortia. Collected biosamples are now being used for high-throughput omics assays which, together with planned whole-genome sequencing, will continue to enhance the scientific value of this biobank. | genetic and genomic medicine
10.1101/2022.05.02.22273861 | SARS-CoV-2 outbreaks in secondary school settings in the Netherlands during fall 2020; silent circulation | BackgroundIn fall 2020 when schools in the Netherlands operated under a limited set of COVID-19 measures, we conducted outbreaks studies in four secondary schools to gain insight in the level of school transmission and the role of SARS-CoV-2 transmission via air and surfaces.
MethodsOutbreak studies were performed between 11 November and 15 December 2020, when the wild-type variant of SARS-CoV-2 was dominant. Clusters of SARS-CoV-2 infections within schools were identified through a prospective school surveillance study. All school contacts of cluster cases, irrespective of symptoms, were invited for PCR testing twice: within 48 hours and again 4-7 days later. Combined nose-and-throat swab (NTS) and saliva samples were collected at each time point, along with data on recent exposure and symptoms. Surface and active air samples were collected in the school environment. All samples were PCR-tested and sequenced when possible.
ResultsOut of 263 sampled school contacts, 24 tested SARS-CoV-2 positive (secondary attack rate 9.1%), of whom 62% remained asymptomatic and 42% had a weakly positive test result. Phylogenetic analysis of 12 subjects from 2 schools indicated clusters of 8 and 2 secondary cases, respectively, but also other distinct strains within outbreaks. Of 51 collected air samples and 53 surface samples, none were SARS-CoV-2 positive.
ConclusionOur study confirmed within-school SARS-CoV-2 transmission and substantial silent circulation, but also multiple introductions in some cases. The absence of air or surface contamination suggests environmental contamination is not widespread during school outbreaks. | infectious diseases