id | title | abstract | category
---|---|---|---|
10.1101/2022.03.04.22271940 | A new compartment model of COVID-19 transmission: The broken-link model | We propose a new compartment model of COVID-19 spread, the broken-link model, which includes the effect of unconnected infectious links on transmission. The traditional SIR-type epidemic models are widely used to analyze the spread status, and these models show exponential growth in the number of infected people. However, actual data from around the world show that exponential growth did not occur, even in the early stage of the spread. We consider this to be caused by the suppression of secondary and higher-order transmissions of COVID-19. We find that the proposed broken-link model quantitatively describes the mechanism of this suppression and is consistent with the actual data. | public and global health |
10.1101/2022.03.07.22271522 | Executable models of pathways built using single-cell RNA seq data reveal immune signaling dysregulations in people living with HIV and atherosclerosis | Background: Atherosclerosis (AS)-associated cardiovascular disease is an important cause of mortality in an aging population of people living with HIV (PLWH). This elevated risk of atherosclerosis has been attributed to viral infection, prolonged usage of anti-retroviral therapy, and subsequent chronic inflammation.
Methods: To investigate dysregulated immune signaling in PLWH with and without AS, we sequenced 9368 peripheral blood mononuclear cells (PBMCs) from 8 PLWH, 4 of whom also had atherosclerosis (AS+). To develop executable models of signaling pathways that drive cellular states in HIV-associated atherosclerosis, we developed the single-cell Boolean Omics Network Invariant Time Analysis (scBONITA) algorithm. ScBONITA (a) uses single-cell RNA sequencing data to infer Boolean rules for topologically characterized networks, (b) prioritizes genes based on their impact on signaling, (c) performs pathway analysis, and (d) maps sequenced cells to characteristic signaling states. We used scBONITA to identify dysregulated pathways in different cell types from AS+ and AS- PLWH. To compare our findings with pathways associated with HIV infection, we used scBONITA to analyze a publicly available dataset of PBMCs from subjects before and after HIV infection. Additionally, the executable Boolean models characterized by scBONITA were used to analyze observed cellular states corresponding to the steady states of signaling pathways.
Results: We identified an increased subpopulation of CD8+ T cells and a decreased subpopulation of monocytes in AS+ PLWH. Dynamic modeling of signaling pathways and pathway analysis with scBONITA provided a new perspective on the mechanisms of HIV-associated atherosclerosis. Lipid metabolism and cell migration pathways are induced by AS rather than by HIV infection. These pathways included AGE-RAGE and PI3K-AKT signaling in CD8+ T cells, and glucagon and cAMP signaling pathways in monocytes. Further analysis of other cell subpopulations suggests that the highly interconnected PI3K-AKT signaling pathway drives cell migratory state in response to dyslipidemia. scBONITA attractor analysis mapped cells to pathway-specific signaling states that correspond to distinct cellular states.
Conclusions: Dynamic network modeling and pathway analysis with scBONITA indicate that dysregulated lipid signaling regulates cell migration into the vascular endothelium in AS+ PLWH. Attractor analysis with scBONITA facilitated pathway-based characterization of cellular states that are not apparent in gene expression analyses. | hiv aids |
10.1101/2022.03.05.22271947 | Efficacy and safety of intensified versus standard prophylactic anticoagulation therapy in patients with Covid-19: a systematic review and meta-analysis | Background: Randomised controlled trials (RCTs) have reported inconsistent effects of intensified anticoagulation on clinical outcomes in Covid-19. We performed an aggregate data meta-analysis of available trials to quantify the effect on non-fatal and fatal outcomes and identify subgroups who may benefit.
Methods: We searched multiple databases for RCTs comparing intensified (intermediate or therapeutic dose) versus standard prophylactic dose anticoagulation in adults with laboratory-confirmed Covid-19 through 19 January 2022. The primary efficacy outcome was all-cause mortality at end of follow-up or discharge. We used random effects meta-analysis to estimate pooled risk ratios for mortality, thrombotic, and bleeding events, and performed subgroup analysis for clinical setting and dose of intensified anticoagulation.
Results: Eleven RCTs were included (n = 5873). Intensified anticoagulation was not associated with a reduction in mortality for up to 45 days compared with prophylactic anticoagulation: 17.5% (501/2861) died in the intensified anticoagulation group and 18.8% (513/2734) died in the prophylactic anticoagulation group, relative risk (RR) 0.93; 95% CI, 0.79-1.10. On subgroup analysis, there was a possible signal of mortality reduction for inpatients admitted to general wards, although with low precision and high heterogeneity (5 studies; RR 0.84; 95% CI, 0.49-1.44; I2 = 75%), and this was not significantly different from studies performed in the ICU (interaction P = 0.51). Risk of venous thromboembolism was reduced with intensified anticoagulation compared with prophylaxis (8 studies; RR 0.53, 95% CI 0.41-0.69; I2 = 0%). This effect was driven by therapeutic rather than intermediate dosing on subgroup analysis (interaction P = 0.04). Major bleeding was increased with use of intensified anticoagulation (RR 1.73, 95% CI 1.17-2.56), with no interaction for dosing and clinical setting.
Conclusion: Intensified anticoagulation has no effect on short-term mortality among hospitalised adults with Covid-19 and is associated with increased risk of bleeding. The observed reduction in venous thromboembolism risk and the trend towards reduced mortality in non-ICU hospitalised patients require exploration in additional RCTs.
Summary: In this aggregate data meta-analysis, use of intensified anticoagulation had no effect on short-term mortality among hospitalised adults with Covid-19 and was associated with increased risk of bleeding. | infectious diseases |
10.1101/2022.03.04.22271911 | Impact of the COVID-19 pandemic on breast cancer screening indicators in a Spanish population-based program: a cohort study | Background: To assess the effect of the COVID-19 pandemic on performance indicators in the population-based breast cancer screening program of Parc de Salut Mar (PSMAR), Barcelona, Spain.
Methods: We conducted a before-and-after, quasi-experimental study to evaluate participation, recall, false-positives, cancer detection rate, and cancer characteristics in our screening population from March 2020 to March 2021 compared with the four previous rounds (2012-2019). Using independent logistic regression models, we estimated the adjusted odds ratios (aOR) of each of the performance indicators for the COVID-19 period, controlling for type of screening (prevalent or incident), socioeconomic index, family history of breast cancer, and menopausal status. We analyzed 144,779 observations from 47,571 women.
Results: During the COVID-19 period, the odds of participation were 11% lower in first-time invitees (aOR=0.89 [95%CI=0.84-0.96]) and in those who had previously participated regularly or irregularly (aOR=0.65 [95%CI=0.61-0.69] and aOR=0.93 [95%CI=0.85-1.03], respectively). Participation showed a modest increase in women not attending any of the previous rounds (aOR=1.07 [95%CI=0.99-1.17]). The recall rate slightly decreased in both prevalent and incident screening (aOR=0.89 [95%CI=0.78-1.01] and aOR=0.89 [95%CI=0.79-1.00], respectively). No significant differences were observed in false-positives (prevalent screening: aOR=1.07 [95%CI=0.92-1.24]; incident screening: aOR=0.94 [95%CI=0.82-1.08]), cancer detection rate (aOR=0.91 [95%CI=0.69-1.18]), or cancer stages.
Conclusions: The COVID-19 pandemic negatively affected screening attendance, especially in previous participants and newcomers. We found no marked differences in recall, false-positives, or cancer detection, indicating the program's resilience. There is a need for further evaluations of interval cancers and potential diagnostic delays. | epidemiology |
10.1101/2022.03.05.22271966 | Topical anticholinergic medications for primary hyperhidrosis: A protocol for systematic review and meta-analysis | Introduction: Primary hyperhidrosis (PHH) is a chronic condition characterized by excessive sweating. Several topical anticholinergic agents have been developed, but the evidence for the efficacy and tolerability of changing medication pathways of anticholinergics for PHH is limited.
Methods and analysis: PubMed, the Cochrane Library, Embase, the Web of Science, and the Cochrane Central Register of Controlled Trials will be searched from inception to March 2022 for studies that may be eligible for inclusion. Randomized controlled trials (RCTs) of PHH treated by topical anticholinergic drugs will be included. The primary outcomes include severity of hyperhidrosis measured quantitatively, the Hyperhidrosis Disease Severity Scale (HDSS) score or the proportion of subjects with a minimum 2-grade improvement from baseline in HDSS, the Hyperhidrosis Disease Severity Measure-Axillary (HDSM-Ax) score, and the Dermatology Life Quality Index (DLQI). The secondary outcomes focus on safety and tolerability. Study selection, data extraction and assessment of risk of bias will be performed by two investigators independently. Data synthesis will be performed with RevMan 5.4 software.
Ethics and dissemination: Ethical approval will not be needed for this review because it involves no patient data or privacy information. The results will be published and disseminated in a peer-reviewed journal or relevant conferences.
Strengths and limitations of this study: This systematic review will be the first to evaluate the efficacy and safety of topical anticholinergic agents in the treatment of PHH.
Different inclusion criteria between studies and different skin sites where drugs are applied may lead to clinical heterogeneity, which will be explored in the subgroup analysis.
This study will also focus on evaluating systemic and topical safety and tolerability of topical application of anticholinergics. | dermatology |
10.1101/2022.03.06.22271974 | Competitive fitness of emerging SARS-CoV-2 variants is linked to their Distinctiveness relative to preceding lineages from that region | The COVID-19 pandemic has seen the persistent emergence of fitter Variants of Concern (VOCs) that have successfully out-competed circulating strains, but the determinants of viral fitness remain unknown. Here we define the Distinctiveness of SARS-CoV-2 sequences based on a proteome-wide comparison with all prior sequences from the same geographical region. From the perspective of viral evolution, Distinctiveness captures "regional herd exposure" and has the advantage over the canonical concept of mutation, which relies foremost on the reference ancestral sequence that is invariant over time. By assessing the correlation between Distinctiveness and change in prevalence for all circulating lineages in each region when a new lineage is introduced, we find that the relative Distinctiveness of emergent SARS-CoV-2 lineages is associated with their competitive fitness (Pearson r = 0.67). Further, by assessing the Delta variant in India versus Brazil, we show that the same lineage can have different Distinctiveness-contributing positions in different geographical regions, depending on the other variants that previously circulated in those regions. Finally, analysis of Omicron lineages in India and the USA shows the BA.1 and BA.2 sub-lineages have comparable Distinctiveness, suggesting that they may have similar levels of competitive fitness. Overall, our study proposes that augmenting the ongoing surveillance of highly mutated variants with real-time assessment of Distinctiveness can aid in achieving robust pandemic preparedness. | public and global health |
10.1101/2022.03.05.22271967 | Intentions and willingness to engage in risky driving behaviour among high school adolescents: evaluating the bstreetsmart road safety program | Objective: To investigate the impact of a road-safety program on adolescents' willingness to engage in risky behaviour as probationary drivers, adjusted for covariates of interest.
Method: bstreetsmart is a road-safety program delivered to around 25,000 adolescent students annually in New South Wales. Using a smartphone-based app and student and teacher participation incentives, students were surveyed before and after program attendance. Mixed methods linear regression analysed pre-post modified Behaviour of Young Novice Driver (BYNDS_M) scores.
Results: 2360 and 1260 students completed pre- and post-event surveys, respectively. Post-event BYNDS_M scores were around 3 points lower than pre-event scores (-2.99, 95% CI -3.418 to -2.466), indicating reduced intention to engage in risky driving behaviours. Covariates associated with higher stated intentions of risky driving were exposure to risky driving as a passenger (1.21, 95% CI 0.622-2.011) and identifying as non-binary gender (20.8, 95% CI 8.795-40.852), adjusting for other predictors.
Conclusions: Trauma-informed, reality-based injury prevention programs are effective in changing short-term stated intentions to engage in risky driving among a pre-independent-driving student population. The adolescent novice driver age group is historically challenging to engage, and injury prevention action must be multi-pronged to address the many factors influencing their behaviour.
What is already known on this topic: Road traffic injuries are the leading cause of death for adolescents in most developed countries. Injury prevention action must be multi-pronged to address the many factors influencing their behaviour.
What this study adds: The bstreetsmart injury prevention intervention, which delivers trauma-informed, CBT-influenced, reality-based road safety information to around 25,000 students annually, showed a significant short-term impact on the stated willingness of the study population to engage in risky driving behaviour when obtaining their probationary licence. Adolescents are strongly influenced by examples of risky road behaviours among their closest adult drivers.
How this study might affect research, practice, or policy: Interventions such as bstreetsmart hold a positive place in the multi-pronged approach needed to address the difficult issue of novice drivers. | public and global health |
10.1101/2022.03.05.22271084 | Changing dynamics of SARS-CoV2 B.1.617.2 (Delta variant) outbreak in the United Kingdom: Shifting of SARS-CoV2 infections from younger to elderly populations with increasing hospitalizations and mortality among elderly. | Background: To assess the comprehensive dynamics of outcomes during the SARS-CoV2 B.1.617.2 (Delta variant) outbreak compared to the Alpha variant outbreak in the United Kingdom.
Methods: In this observational study, confirmed (by sequencing and genotyping) SARS-CoV2 cases reported by Public Health England for the Delta variant (n=592,692) and the Alpha variant (n=150,934) were used. Outcomes were analyzed by age group and compared with all reported weekly cases in the UK.
Results: The Delta variant surge was associated with a significantly lower case fatality rate (0.43% vs 1.07%; RR 0.39; 95% CI 0.37-0.42; P<0.0001) and lower odds of hospitalization (2.1% vs 3.0%; RR 0.70; 95% CI 0.68-0.73; P<0.0001) than the Alpha variant. During the Delta variant surge, there was a significant increase in cases (11.3% to 21.1%; RR 1.87; 95% CI 1.84-1.89; P<0.0001) and hospitalizations (40.2% to 56.5%; RR 1.40; 95% CI 1.3-1.46; P<0.0001) among confirmed Delta variant cases in the ≥50 years age group during the August 3-September 12, 2021 period compared to earlier reported cases. There was also a significant increase in total weekly COVID-19 deaths among the ≥70 years age group (71.4% to 75.1%; RR 1.05; 95% CI 1.01-1.08; P=0.0028) during August 6-October 8, 2021 compared to the June 4-July 30, 2021 period.
Conclusions: The Delta variant surge was associated with significantly lower mortality and hospitalization than the Alpha variant. As the Delta variant surge progressed, the ≥50 years age group had a significantly increased percentage of cases and hospitalizations, and a significant increase in COVID-19 deaths occurred among the ≥70 years age group. | infectious diseases |
10.1101/2022.03.04.22271706 | Remdesivir for the treatment of patients hospitalized with COVID-19 receiving supplemental oxygen: a targeted literature review and meta-analysis | This network meta-analysis (NMA) assessed the efficacy of remdesivir in hospitalized patients with COVID-19 requiring supplemental oxygen. Randomized controlled trials of hospitalized patients with COVID-19, where patients were receiving supplemental oxygen at baseline and at least one arm received treatment with remdesivir, were identified. Outcomes included mortality, recovery, and no longer requiring supplemental oxygen. NMAs were performed for low-flow oxygen (LFO2); high-flow oxygen (HFO2), including NIV; or oxygen at any flow (AnyO2) at early (day 14/15) and late (day 28/29) time points. Six studies were included (N=5,245 patients) in the NMA. Remdesivir lowered early and late mortality among AnyO2 patients (risk ratio (RR) 0.52, 95% credible interval (CrI) 0.34-0.79; RR 0.81, 95%CrI 0.69-0.95) and LFO2 patients (RR 0.21, 95%CrI 0.09-0.46; RR 0.24, 95%CrI 0.11-0.48); no improvement was observed among HFO2 patients. Improved early and late recovery was observed among LFO2 patients (RR 1.22, 95%CrI 1.09-1.38; RR 1.17, 95%CrI 1.09-1.28). Remdesivir also lowered the requirement for oxygen support among all patient subgroups. Among hospitalized patients with COVID-19 requiring supplemental oxygen at baseline, use of remdesivir compared to best supportive care is likely to reduce mortality and the need for oxygen support, and to improve recovery, in AnyO2 and LFO2 patients. | infectious diseases |
10.1101/2022.03.07.22271833 | GWAS and meta-analysis identifies multiple new genetic mechanisms underlying severe Covid-19. | Pulmonary inflammation drives critical illness in Covid-19 [1,2], creating a clinically homogeneous extreme phenotype, which we have previously shown to be highly efficient for discovery of genetic associations [3,4]. Despite the advanced stage of illness, we have found that immunomodulatory therapies have strong beneficial effects in this group [1,5]. Further genetic discoveries may identify additional therapeutic targets to modulate severe disease [6]. In this new data release from the GenOMICC (Genetics Of Mortality in Critical Care) study we include new microarray genotyping data from additional critically ill cases in the UK and Brazil, together with cohorts of severe Covid-19 from the ISARIC4C [7] and SCOURGE [8] studies, and meta-analysis with previously reported data. We find an additional 14 new genetic associations. Many are in potentially druggable targets, in inflammatory signalling (JAK1, PDE4A), monocyte-macrophage differentiation (CSF2), immunometabolism (SLC2A5, AK5), and host factors required for viral entry and replication (TMPRSS2, RAB2A). As with our previous work, these results provide tractable therapeutic targets for modulation of harmful host-mediated inflammation in Covid-19. | intensive care and critical care medicine |
10.1101/2022.03.03.22271812 | Culture and identification of a Deltamicron SARS-CoV-2 in a three-case cluster in southern France | Multiple SARS-CoV-2 variants have successively, or concomitantly, spread worldwide since summer 2020. A few co-infections with different variants were reported, and genetic recombinations, common among coronaviruses, were reported or suspected based on co-detection of signature mutations of different variants in a given genome. Here we report three infections in southern France with a Delta 21J/AY.4-Omicron 21K/BA.1 "Deltamicron" recombinant. The hybrid genome harbors signature mutations of the two lineages, supported by a mean sequencing depth of 1,163-1,421 reads and mean nucleotide diversity of 0.1-0.6%. It is composed of the near full-length spike gene (from codons 156-179) of an Omicron 21K/BA.1 variant in a Delta 21J/AY.4 lineage backbone. Importantly, we cultured an isolate of this recombinant and sequenced its genome. It was observed by scanning electron microscopy. As it is misidentified by current variant screening qPCR assays, we designed and implemented a specific duplex qPCR for routine diagnosis. Finally, structural analysis of the recombinant spike suggested its hybrid content could optimize viral binding to the host cell membrane. These findings prompt further studies of the virological, epidemiological, and clinical features of this recombinant. | infectious diseases |
10.1101/2022.03.01.22271721 | The relative impact of vaccination momentum on COVID-19 rates of death in the USA in 2020/2021. The forgotten role of population wellness. | It is widely accepted that individual underlying health conditions contribute to morbidity and mortality associated with COVID-19; and by inference population wellness will also contribute to COVID-19 outcomes. In addition, over the last two years the predominant pharmaceutical public health response to COVID-19 has been vaccination momentum (i.e. mass and rapid inoculation campaigns).
This paper aims to compare vaccination momentum throughout 2021 and measures of population wellness to estimate the relative impact of each on deaths attributed to COVID-19 across the 50 States of America, plus Washington DC, during 2020 (i.e. the pre-vaccination period) and 2021 (i.e. the vaccination period).
Our analysis shows that: (a) COVID-19 rates of death in 2020 are more important, and statistically more significant, at predicting rates of death in 2021 than vaccination momentum during 2021; (b) vaccination momentum does not predict the magnitude of change in COVID-19 rates of death between 2020 and 2021; and (c) for several underlying heath and risk factors vaccination momentum is significantly less important than population wellness at predicting COVID-19 rates of death.
Of particular interest are our observations that exercise and fruit consumption are 10.1 times more significant at predicting COVID-19 deaths than vaccination momentum, obesity (BMI 30+) is 9.6 times more significant at predicting COVID-19 deaths than vaccination momentum, heart attacks are 4.37 times more significant at predicting COVID-19 deaths than vaccination momentum and smoking is 3.2 times more significant at predicting COVID-19 deaths than vaccination momentum.
If medical and health regulators are to deliver a quantum decrease in COVID-19 deaths, they must move beyond the overwhelming focus on COVID-19 vaccination. They must have the courage to urge governments and private organisations to mandate greater exercise, weight loss, less junk food, better nutrition, and a concerted effort to reduce chronic adverse health conditions. | public and global health |
10.1101/2022.02.27.22271450 | Improving the diagnosis of endometrial hyperplasia using computerized analysis and immunohistochemical biomarkers | Endometrial hyperplasia (EH) is a precursor lesion to endometrial carcinoma (EC). Risks for EC include genetic, hormonal and metabolic factors, most notably those associated with obesity: rates are rising and there is concern that cases in pre-menopausal women may remain undetected. Making an accurate distinction between benign and pre-malignant disease is both a challenge for the pathologist and important to the gynaecologist who wants to deliver the most appropriate care to meet the needs of the patient. Premalignant change may be recognised by histological changes of endometrial hyperplasia (which may occur with or without atypia) and endometrial intraepithelial neoplasia (EIN).
In this study we created a tissue resource of EH samples diagnosed between 2004 and 2009 (n=125) and used this to address key questions: 1. Are the EIN/WHO2014 diagnostic criteria able to consistently identify premalignant endometrium? 2. Can computer-aided image analysis inform identification of EIN? 3. Can we improve diagnosis by incorporating analysis of protein expression using immunohistochemistry?
Our findings confirmed that the inclusion of EIN in diagnostic criteria resulted in better agreement between expert pathologists compared with the previous WHO94 criteria used for the original diagnosis of our sample set. A computer model based on assessment of the stromal:epithelial ratio appeared most accurate in classifying areas of tissue without EIN. From an extensive panel of putative endometrial protein tissue biomarkers, a score based on assessment of HAND2, PTEN and PAX2 identified four clusters, one of which appeared more likely to be benign.
In summary, our study has highlighted new opportunities to improve diagnosis of pre-malignant disease in endometrium and provides a platform for further research on this important topic.
Highlights:
- Blinded re-analysis of n=125 samples previously classified as endometrial hyperplasia found improved intra-observer agreement (67%) using EIN/WHO2014 compared with standard WHO1994 criteria (52%).
- Computerised analysis of endometrial hyperplasia tissue architecture showed promise as a tool to assist pathologists in diagnosis of difficult-to-classify cases.
- A diagnosis of endometrial intraepithelial neoplasia (EIN) using the WHO2014 criteria more accurately predicted risk of endometrial cancer than the WHO1994 system.
- EIN samples exhibited altered expression of ARID1A (negative glands) and HAND2 (reduced or absent from stroma).
- Unsupervised hierarchical cluster analysis based on immunostaining for PTEN, PAX2 and HAND2 identified 4 subtypes, one of which segregated with benign disease.
- These results provide a framework for improved classification of pre-malignant lesions in endometrium that may inform personalized care pathways. | obstetrics and gynecology |
10.1101/2022.03.01.22271578 | Population-based whole-genome sequencing with constrained gene analysis identifies predisposing germline variants in children with central nervous system tumors | Background: The underlying cause of central nervous system (CNS) tumors in children is largely unknown. In this nationwide, prospective, population-based study we investigate rare germline variants across known and putative cancer predisposition (CPS) genes and genes exhibiting evolutionary intolerance of inactivating alterations in children with CNS tumors.
Methods: One hundred and twenty-eight children with CNS tumors underwent whole-genome sequencing of germline DNA. Single nucleotide and structural variants in 315 cancer-related genes and 2,986 highly evolutionarily constrained genes were assessed. A systematic pedigree analysis covering 3,543 close relatives was performed.
Results: Thirteen patients harbored rare pathogenic variants in nine known CPS genes. The likelihood of carrying pathogenic variants in CPS genes was higher for patients with medulloblastoma than for children with other tumors (OR 5.9, CI 1.6-21.2). Metasynchronous CNS tumors were observed exclusively in children harboring pathogenic CPS gene variants (n=2, p=0.01).
In general, known pediatric-onset CPS (pCPS) genes were shown to be significantly more constrained than both genes associated with risk of adult-onset malignancies (p=5e-4) and all other genes (p=5e-17). Forty-seven patients carried 66 loss-of-function variants in 60 constrained genes, including eight variants in six known pCPS genes. A deletion in the extremely constrained EHMT1 gene, previously somatically linked with sonic hedgehog (SHH) medulloblastoma, was found in a patient with this tumor.
Conclusions: ~10% of pediatric CNS tumors can be attributed to rare variants in known CPS genes. Analysis of evolutionarily constrained genes may increase our understanding of pediatric cancer susceptibility.
3 key points:
- ~10% of children with CNS tumors carry a pathogenic variant in a known cancer predisposition gene.
- Known pediatric-onset cancer predisposition genes show high evolutionary constraint.
- Loss-of-function variants in evolutionarily constrained genes may explain additional risk.
Importance of this study: Although CNS tumors constitute the most common form of solid neoplasms in childhood, our understanding of their underlying causes remains sparse. Predisposition studies often suffer from selection bias, lack of family and clinical data, or from being limited to SNVs in established cancer predisposition genes. We report the findings of a prospective, population-based investigation of genetic predisposition to pediatric CNS tumors. Our findings illustrate that 10% of children with CNS tumors harbor a damaging alteration in a known cancer gene, of which the majority (9/13) are loss-of-function alterations. Moreover, we illustrate how recently developed knowledge of evolutionarily loss-of-function-intolerant genes may be used to investigate additional pediatric cancer risk, and we present EHMT1 as a putative novel predisposition gene for SHH medulloblastoma. Previously undescribed links between variants in known cancer predisposition genes and specific brain tumors are presented, and the importance of assessing both SVs and SNVs is illustrated. | genetic and genomic medicine |
10.1101/2022.02.23.22271275 | Nurses' experience of using video consultation in a digital care setting and its impact on their workflow and communication | Sweden, like many other countries, uses video consultation to increase patients' access to primary healthcare services, particularly during the COVID-19 pandemic. Working in digital care settings and using new technologies, in this case video consultations, requires learning new skills and adapting to new workflows. The aim of this study is to explore nurses' experience of using video consultation in a digital care setting and its impact on their workflow and communication. Fifteen semi-structured interviews were carried out with registered nurses recruited from a private digital healthcare provider. Interviews were recorded, transcribed, and analysed using an abductive approach. Nurses' workflow was modeled, and several categories and subcategories were identified: nurses' workflow (efficiency, flexibility, and information accessibility); communication (interaction with patients and interprofessional communication); and user experience (change and development of the platform, challenges, and combining digital and physical care). Even though providing online care has its limitations, the nurses were positive towards using video consultations. | health informatics |
10.1101/2022.03.07.22271764 | Usefulness of real-time RT-PCR to understand the kinetics of SARS-CoV-2 in blood: a prospective study. | Background: SARS-CoV-2 viral load and kinetics, assessed by RT-PCR in serial blood samples from hospitalised COVID-19 patients, are poorly understood.
Methods: We conducted an observational, prospective case series study in hospitalised COVID-19 patients. Clinical outcome data (Intensive Care Unit admission and mortality) were collected from all patients until discharge. Viremia was determined longitudinally during hospitalisation, in plasma and serum samples, using two commercial and standardised RT-PCR techniques approved for use in the diagnosis of SARS-CoV-2. Viral load (copies/mL and log10) was determined with the quantitative TaqPath COVID-19 test.
Results: SARS-CoV-2 viremia was studied in 57 hospitalised COVID-19 patients. Persistent viremia (PV) was defined as two or more quantifiable viral loads detected in blood samples (plasma/serum) during hospitalisation. PV was detected in 16 (28%) patients. All of them, except for one who rapidly progressed to death, cleared viremia during hospitalisation. Poor clinical outcome occurred in 62.5% of patients with PV, while none of the negative patients or those with sporadic viremia presented this outcome (p<0.0001). Viral load was significantly higher in patients with PV than in those with sporadic viremia (p<0.05). Patients presented PV for a short period of time: median time from admission was 5 days (range=2-12) and 4.5 days (range=2-8) for plasma and serum samples, respectively. Similar results were obtained with all RT-PCR assays for both types of samples.
Conclusions: Detection of persistent SARS-CoV-2 viremia by real-time RT-PCR, expressed as viral load over time, could allow identification of hospitalised COVID-19 patients at risk of poor clinical outcome.
Highlights:
- Commercial RT-PCR techniques could be used to monitor SARS-CoV-2 viremia kinetics.
- SARS-CoV-2 persistent viremia is related to poor outcome in COVID-19 patients.
- SARS-CoV-2 viremia kinetics could be used as a biomarker of poor prognosis.
- Plasma samples are the best choice for analysis of SARS-CoV-2 viremia kinetics. | infectious diseases |
10.1101/2022.02.23.22271045 | High frequency and persistence of Anti-DSG2 antibodies in post COVID-19 serum samples | Background: There is growing recognition that COVID-19 does cause cardiac sequelae. The underlying mechanisms involved are still poorly understood to date. Viral infections, including COVID-19, have been hypothesized to contribute to autoimmunity by exposing previously hidden cryptic epitopes on damaged cells to an activated immune system. Given the high incidence of cardiac involvement seen in COVID-19, our aim was to determine the frequency of anti-DSG2 antibodies in a population of post-COVID-19 patients.
Methods and Results: 300 convalescent serum samples were obtained from a group of post-COVID-19 patients from October 2020 to February 2021. 154 samples were drawn 6 months post-COVID-19 infection and 146 samples were drawn 9 months post-COVID-19 infection. 17 samples were obtained from the same patients at both the 6- and 9-month marks. An electrochemiluminescence-based immunoassay utilizing the extracellular domain of DSG2 for antibody capture was used. The mean signal intensity of anti-DSG2 antibodies in the post-COVID-19 samples was significantly higher than that of a healthy control population (19±83.2 vs. 2.1±6.8; P value <0.001). Of note, 29.3% of the post-COVID-19 infection samples demonstrated a signal higher than the 90th percentile of the control population, and 8.7% were higher than the median found in ARVC patients. The signal intensity between the 6-month and 9-month samples did not differ significantly.
Conclusions: We report for the first time that recovered COVID-19 patients demonstrate significantly higher and sustained levels of anti-DSG2 autoantibodies compared to a healthy control population, comparable to those of a diagnosed ARVC group. | cardiovascular medicine |
10.1101/2022.03.03.22271786 | Regional variation in cardiovascular genes enables a tractable genome editing strategy | The recent rise in the number and therapeutic potential of genome engineering technologies has generated excitement in cardiovascular genetics. One significant barrier to their implementation is the costly and time-consuming reagent development required for each novel variant. Prior data have illuminated the potential for regional clustering of disease-causing genetic variants in known and potentially novel functional protein domains. We hypothesized that most cardiovascular disease-relevant genes in ClinVar would display regional variant clustering, and that multiple variants within a regional hotspot could be targeted with limited prime editing reagents. We collated 2,471 high-confidence pathogenic or likely pathogenic (P/LP) missense and truncating variants from 111 cardiovascular disease genes. We then defined a regional clustering index (RCI): the percentage of P/LP variants in a given gene located within a pre-specified window around each variant. At a window size commensurate with the maximal reported prime editing extension length, 78 bp, we found that missense variants displayed a higher mean RCI than truncating variants. We next identified genes particularly attractive for pathogenic hotspot-targeted prime editing, with at least 20 P/LP variants, and observed that the mean RCI for missense variants remained higher than for truncating variants and for rare variants observed in the same genes in gnomAD (mean±SD RCI78bp: P/LP ClinVar missense=5.2±5.9%; P/LP ClinVar truncating=2.1±3.2%; gnomAD missense=2±3.2%; gnomAD truncating=1.1±2.4%; p<2.2e-16; ANOVA with post hoc Bonferroni correction). Further, we tested the feasibility of prime editing for multiple variants in a single hotspot in KCNH2, a gene with a high mean missense variant RCI. Sixty variants were induced in this hotspot in HEK293 cells with CRISPR-X.
Correction of these variants was attempted with prime editing using two overlapping prime editing guide RNA extensions. The mean prime editing efficiency across CRISPR-X-enriched variants within this hotspot was 57±27%, including 3 P/LP variants. These data underscore the utility of pathogenic variant hotspots in the diagnosis and treatment of inherited cardiovascular disease. | cardiovascular medicine |
10.1101/2022.03.03.22271780 | Detecting eczema areas in digital images: an impossible task? | Assessing the severity of atopic dermatitis (AD, or eczema) traditionally relies on a face-to-face assessment by healthcare professionals, and may suffer from inter- and intra-rater variability. With the expanding role of telemedicine, several machine learning algorithms have been proposed to automatically assess AD severity from digital images. Those algorithms usually detect and then delineate ("segment") AD lesions before assessing lesional severity, and are trained using the data of AD areas detected by healthcare professionals. To evaluate the reliability of such data, we estimated the inter-rater reliability of AD segmentation in digital images.
Four dermatologists independently segmented AD lesions in 80 digital images collected in a published clinical trial. We estimated the inter-rater reliability of the AD segmentation using intra-class correlation coefficients (ICCs) at the pixel level and at area levels for different resolutions of the images. The average ICC was 0.45 (SE = 0.04), corresponding to "poor" agreement between raters, while the degree of agreement for AD segmentation varied from image to image.
AD segmentation in digital images is highly rater-dependent, even among dermatologists. Such limitations need to be taken into consideration when AD segmentation data are used to train machine learning algorithms that assess eczema severity. | dermatology |
10.1101/2022.03.08.22270944 | Predictability and Stability Testing to Assess Clinical Decision Instrument Performance for Children After Blunt Torso Trauma | Objective: The Pediatric Emergency Care Applied Research Network (PECARN) has developed a clinical decision instrument (CDI) to identify children at very low risk of intra-abdominal injury. However, the CDI has not been externally validated. We sought to vet the PECARN CDI with the Predictability, Computability, Stability (PCS) data science framework, potentially increasing its chance of a successful external validation.
Materials & Methods: We performed a secondary analysis of two prospectively collected datasets: PECARN (12,044 children from 20 emergency departments) and an independent external validation dataset from the Pediatric Surgical Research Collaborative (PedSRC; 2,188 children from 14 emergency departments). We used PCS to reanalyze the original PECARN CDI along with new interpretable PCS CDIs we developed using the PECARN dataset. External validation was then measured on the PedSRC dataset.
Results: Three predictor variables (abdominal wall trauma, Glasgow Coma Scale score <14, and abdominal tenderness) were found to be stable. Using only these variables, we developed a PCS CDI which had a lower sensitivity than the original PECARN CDI on internal PECARN validation but performed the same on external PedSRC validation (sensitivity 96.8% and specificity 44%).
Conclusion: The PCS data science framework vetted the PECARN CDI and its constituent predictor variables prior to external validation. In this case, the PECARN CDI with 7 predictors and our PCS-based CDI with 3 stable predictors had identical performance on independent external validation. This suggests that both CDIs will generalize well to new populations, offering a potential strategy to increase the chance of a successful (costly) prospective validation. | emergency medicine |
10.1101/2022.02.25.22270903 | The association between major trauma centre care and outcomes of adult patients injured by low falls in England and Wales. | Background: Disability and death due to low falls are increasing worldwide and disproportionately affect older adults. Current trauma systems were not designed to suit the needs of these patients. This study assessed the effectiveness of major trauma centre care in adult patients injured by low falls.
Methods: Data were obtained from the Trauma Audit and Research Network on adult patients injured by falls from <2 metres between 2017 and 2019 in England and Wales. 30-day survival, length of hospital stay and discharge destination were compared between major trauma centres (MTCs) and trauma units or local emergency hospitals (TU/LEHs).
Results: 127,334 patients were included, of whom 27.6% attended an MTC. The median age was 79.4 years (IQR 64.5-87.2 years), and 74.2% of patients were aged >65 years. MTC care was not associated with improved 30-day survival (adjusted odds ratio [AOR] 0.91, 95% CI 0.87-0.96). Transferred patients had a significant impact upon the results. After excluding transferred patients, the AOR for survival in MTCs was 1.056 (95% CI 1.001-1.113).
Conclusion: TU/LEH care is at least as effective as MTC care, owing to the facility for secondary transfer from TU/LEHs to MTCs. In patients who are not transferred, MTCs are associated with greater odds of 30-day survival in the whole cohort and in the most severely injured patients. Future research must determine the optimum means of identifying patients in need of higher-level care and the components of care which improve patient outcomes, and develop patient-focused outcomes which reflect the characteristics and priorities of contemporary trauma patients.
Key messages:
What is already known on this topic:
- Current trauma systems were not designed to manage rising numbers of elderly patients injured by low falls.
- Previous evidence for the role of major trauma centre (MTC) care in such patients yielded conflicting results.
What this study adds:
- This study demonstrates that non-trauma centre care is no worse than MTC care, as long as the possibility of transfer exists.
- MTCs therefore do have a role in the management of elderly adults injured by low falls, particularly the severely injured.
How this study might affect research, policy or practice:
- Research must identify those patients who need transfer, the most effective components of care, and patient-centric outcomes. | emergency medicine |
10.1101/2022.03.08.22271577 | Metabolic changes induced by pharmacological castration of young, healthy men: a study of the plasma metabolome | Testosterone is a hormone that plays a key role in carbohydrate, fat, and protein metabolism. Testosterone deficiency is associated with multiple comorbidities, e.g., metabolic syndrome and type 2 diabetes. Despite its importance in many metabolic pathways, the mechanisms by which it controls metabolism are not fully understood. The present study investigated the short-term metabolic changes induced by pharmacological castration and testosterone supplementation in healthy young males. Thirty subjects underwent testosterone depletion (TD) followed by testosterone supplementation (TS). Plasma samples were collected three times, corresponding to basal, low, and restored testosterone levels. An untargeted metabolomics study was performed by ultra-high-performance liquid chromatography-high-resolution mass spectrometry (UHPLC-HRMS) to monitor the metabolic changes induced by the altered hormone levels. Our results demonstrated that TD is associated with major metabolic changes that are partially restored by TS. Carnitine and amino acid metabolism were the metabolic pathways most impacted by variations in testosterone. Furthermore, our results also indicate that LH and FSH might strongly alter the plasma levels of indoles and lipids, especially glycerophospholipids and sphingolipids. Our results demonstrate major metabolic changes induced by low testosterone that may be important for understanding the mechanisms behind the association of testosterone deficiency with its comorbidities. | endocrinology |
10.1101/2022.03.06.21267462 | Risk of myocarditis and pericarditis following COVID-19 vaccination in England and Wales | We describe our analyses of data from over 49.7 million people in England, representing near-complete coverage of the relevant population, to assess the risk of myocarditis and pericarditis following BNT162b2 and ChAdOx1 COVID-19 vaccination. A self-controlled case series (SCCS) design has previously reported increased risk of myocarditis after first ChAdOx1, BNT162b2, and mRNA-1273 dose and after second doses of mRNA COVID-19 vaccines in England. Here, we use a cohort design to estimate hazard ratios for hospitalised or fatal myocarditis/pericarditis after first and second doses of BNT162b2 and ChAdOx1 vaccinations. SCCS and cohort designs are subject to different assumptions and biases and therefore provide the opportunity for triangulation of evidence. In contrast to the findings from the SCCS approach previously reported for England, we found evidence for lower incidence of hospitalised or fatal myocarditis/pericarditis after first ChAdOx1 and BNT162b2 vaccination, as well as little evidence to suggest higher incidence of these events after second dose of either vaccination. | epidemiology |
10.1101/2022.03.06.22270594 | Post COVID-19 Condition in South Africa: 3-month follow-up after hospitalisation with SARS-CoV-2 | Background: Post COVID-19 Condition (PCC), as defined by WHO, refers to a wide range of new, returning, or ongoing health problems experienced by COVID-19 survivors, and represents a rapidly emerging public health priority. We aimed to establish how this developing condition has impacted patients in South Africa and which population groups are at risk.
Methods: In this prospective cohort study, participants ≥18 years who had been hospitalised with laboratory-confirmed SARS-CoV-2 infection during the second and third waves, between December 2020 and August 2021, underwent telephonic follow-up assessment at one month and three months after hospital discharge. Participants were assessed using a standardised questionnaire for the evaluation of symptoms, functional status, health-related quality of life and occupational status. Multivariable logistic regression models were used to determine factors associated with PCC.
Findings: In total, 1,873 of 2,413 (78%) enrolled hospitalised COVID-19 participants were followed up at three months after hospital discharge. Participants had a median age of 52 years (IQR 41-62) and 960 (51.3%) were women. At three-month follow-up, 1,249 (66.7%) participants reported one or more persistent COVID-related symptom(s), compared to 1,978/2,413 (82.1%) at one month post-hospital discharge. The most common symptoms reported were fatigue (50.3%), shortness of breath (23.4%), confusion or lack of concentration (17.5%), headaches (13.8%) and problems seeing/blurred vision (10.1%). On multivariable analysis, factors associated with new or persistent symptoms following acute COVID-19 were age ≥65 years (adjusted odds ratio [aOR] 1.62; 95% confidence interval [CI] 1.00-2.61); female sex (aOR 2.00; 95% CI 1.51-2.65); mixed ethnicity (aOR 2.15; 95% CI 1.26-3.66) compared to black ethnicity; requiring supplemental oxygen during admission (aOR 1.44; 95% CI 1.06-1.97); ICU admission (aOR 1.87; 95% CI 1.36-2.57); pre-existing obesity (aOR 1.44; 95% CI 1.09-1.91); and the presence of ≥4 acute symptoms (aOR 1.94; 95% CI 1.19-3.15) compared to no symptoms at onset.
Interpretation: The majority of COVID-19 survivors in this cohort of previously hospitalised participants reported persistent symptoms at three months from hospital discharge, as well as a significant impact of PCC on their functional and occupational status. The large burden of PCC symptoms identified in this study emphasises the need for a national health strategy. This should include the development of clinical guidelines and the training of health care workers in identifying, assessing and caring for patients affected by PCC; establishment of multidisciplinary national health services; and provision of information and support to people who suffer from PCC. | public and global health |
10.1101/2022.03.02.22271808 | "Pandemic of the unvaccinated"? At midlife, white people are less vaccinated but still at less risk of Covid-19 mortality in Minnesota | Recent research underscores the exceptionally young age distribution of Covid-19 deaths in the United States compared with international peers. We show that the high level of Covid mortality at midlife ages (45-64) is deeply intertwined with continuing racial inequality in Covid-19 mortality, which persists even in contexts in which non-white populations might seem to have a survival advantage. We use data from Minnesota, which allows vaccination and mortality rates to be observed with greater age and temporal precision than national data. At midlife ages in Minnesota, Black, Hispanic, and Asian populations were all more highly vaccinated than white populations during the state's substantial and sustained Delta surge. Yet white mortality rates were lower than those of all other racial groups. These mortality disparities were extreme; during the Delta period, non-white Covid-19 mortality at ages 55-64 was greater than white mortality at 10 years older. This discrepancy between vaccination and mortality patterning by race suggests that, if the current period is a "pandemic of the unvaccinated," it also remains a "pandemic of the disadvantaged" in ways that can decouple from vaccination rates. This result implies an urgent need to center health equity in the development of Covid-19 policy measures. | public and global health |
10.1101/2022.03.04.22271909 | Tract-based white matter hyperintensity patterns in patients with Systemic Lupus Erythematosus using an unsupervised machine learning approach. | Currently, little is known about the spatial distribution of white matter hyperintensities (WMH) in the brains of patients with systemic lupus erythematosus (SLE). Previous lesion markers, such as number and volume, ignore the strategic location of WMH. The goal of this work was to develop a fully automated method to identify predominant patterns of WMH across WM tracts based on cluster analysis. A total of 221 SLE patients with and without neuropsychiatric symptoms from two different sites were included in this study. WMH segmentations and lesion locations were acquired automatically. Cluster analysis was performed on the WMH distribution in 20 WM tracts. Our pipeline identified five distinct clusters with predominant involvement of the forceps major and forceps minor, as well as the right and left anterior thalamic radiations and the right inferior fronto-occipital fasciculus. The patterns of the affected WM tracts were consistent across the SLE subtypes and sites. Our approach revealed distinct and robust tract-based WMH patterns within SLE patients. This method could provide a basis to link the location of WMH with clinical symptoms. Furthermore, it could be used in other diseases characterized by the presence of WMH to investigate both the clinical relevance of WMH and the underlying pathomechanisms in the brain. | radiology and imaging |
10.1101/2022.03.03.22271859 | A qualitative examination of the factors affecting the adoption of injury focused wearable technologies in recreational runners | Purpose: Understanding users' perceived usefulness and ease of use of technologies will influence their adoption and sustained use. The objectives of this study were to determine the metrics deemed important by runners for monitoring running-related injury (RRI) risk, and to identify the barriers and facilitators to their use of injury-focused wearable technologies.
Methods: A qualitative focus group study was undertaken. Nine semi-structured focus groups with male (n=13) and female (n=14) recreational runners took place. Focus groups were audio and video recorded, and transcribed verbatim. Transcripts were thematically analysed. A critical friend approach was taken to data coding, and multiple methods of trustworthiness were executed.
Results: Excessive loading and inadequate recovery were deemed the most important risk factors to monitor for RRI risk. Other important factors included training activities, injury status and history, and running technique. The location and attachment method of a wearable device and the design of a smartphone application were identified as important barriers and facilitators, with receiving useful injury-related feedback identified as a further facilitator.
Conclusions: Overtraining, and training-related and individual-related risk factors, are essential metrics that need to be monitored for RRI risk. RRI apps should include the metrics deemed important by runners, once there is supporting evidence-based research. The difficulty and/or ease of use of a device, and receiving useful feedback, will influence the adoption of injury-focused running technologies. There is a clear willingness from recreational runners to adopt injury-focused wearable technologies whilst running. | sports medicine |
10.1101/2022.03.06.22271955 | Exploring potential causal genes for uterine leiomyomas: A summary data-based Mendelian randomization and FUMA analysis | Objective: To explore potential causal genetic variants and genes underlying the pathogenesis of uterine leiomyomas (ULs).
Methods: We conducted a summary data-based Mendelian randomization (SMR) analysis and performed functional mapping and annotation using FUMA to examine genetic variants and genes that are potentially involved in the pathogenesis of ULs.
Both analyses used summarized data from a recent genome-wide association study (GWAS) on ULs, with a total sample size of 244,324 (20,406 cases and 223,918 controls). For the SMR analysis, we performed separate analyses using CAGE and GTEx eQTL data.
Results: Using the CAGE eQTL data, our SMR analysis identified 13 probes tagging 10 unique genes that were pleiotropically/potentially causally associated with ULs, the top three probes being ILMN_1675156 (tagging CDC42, PSMR=8.03x10^-9), ILMN_1705330 (tagging CDC42, PSMR=1.02x10^-7) and ILMN_2343048 (tagging ABCB9, PSMR=9.37x10^-7). Using GTEx eQTL data, our SMR analysis did not identify any significant genes after correction for multiple testing. FUMA analysis identified 106 independent SNPs, 24 genomic loci and 137 genes that are potentially involved in the pathogenesis of ULs, seven of which were also identified by the SMR analysis.
Conclusions: We identified many genetic variants, genes, and genomic loci that are potentially involved in the pathogenesis of ULs. More studies are needed to explore the exact underlying mechanisms in the etiology of ULs. | genetic and genomic medicine |
10.1101/2022.03.04.22271462 | SETBP1 variants outside the degron disrupt DNA-binding and transcription independent of protein abundance to cause a heterogeneous neurodevelopmental disorder | Germline de novo SETBP1 variants cause clinically distinct and heterogeneous neurodevelopmental disorders. Heterozygous missense variants at a hotspot encoding a canonical degron lead to SETBP1 accumulation and Schinzel-Giedion syndrome (SGS), a rare severe developmental disorder involving multisystem malformations. Heterozygous loss-of-function variants result in SETBP1 haploinsufficiency disorder, which is phenotypically much milder than SGS. Following an initial description of four individuals with atypical SGS carrying heterozygous missense variants adjacent to the degron, a few individual cases of variants outside the degron have been reported. Due to the lack of systematic investigation of genotype-phenotype associations for different types of SETBP1 variants, and limited understanding of the gene's roles in brain development, the extent of clinical heterogeneity and how this relates to underlying pathophysiological mechanisms remain elusive, imposing challenges for diagnosis and patient care. Here, we present a comprehensive investigation of the largest cohort to date of individuals carrying SETBP1 missense variants outside the degron (n=18, including one in-frame deletion). We performed thorough clinical and speech phenotyping with functional follow-up using cellular assays and transcriptomics. Our findings suggest that such variants cause a clinically and functionally variable developmental syndrome, showing only partial overlaps with classical SGS and SETBP1 haploinsufficiency disorder, and primarily characterised by intellectual disability, epilepsy, and speech and motor impairment. We provide evidence of loss-of-function pathophysiological mechanisms impairing ubiquitination, DNA-binding and transcription.
In contrast to SGS and SETBP1 haploinsufficiency, these effects are independent of protein abundance. Overall, our study provides important novel insights into diagnosis, patient care and aetiology of SETBP1-related disorders. | genetic and genomic medicine |
10.1101/2022.03.05.22271961 | Mining for Health: A Comparison of Word Embedding Methods for Analysis of EHRs Data | Electronic health records (EHRs), routinely collected as part of healthcare delivery, offer great promise for advancing precision health. At the same time, they present significant analytical challenges. In EHRs, data for individual patients are collected at irregular time intervals and with varying frequencies; they include both structured and unstructured data. Advanced statistical and machine learning methods have been developed to tackle these challenges, for example, for predicting diagnoses earlier and more accurately. One powerful tool for extracting useful information from EHR data is word embedding algorithms, which represent words as vectors of real numbers that capture the words' semantic and syntactic similarities. Learning embeddings can be viewed as automated feature engineering, producing features that can be used for predictive modeling of medical events. Methods such as Word2Vec, BERT, FastText, ELMo, and GloVe have been developed for word embedding, but there has been little work on re-purposing these algorithms for the analysis of structured medical data. Our work seeks to fill this important gap. We extended word embedding methods to embed (structured) medical codes from a patient's entire medical history, and used the resultant embeddings to build prediction models for diseases. We assessed the performance of multiple embedding methods in terms of predictive accuracy and computation time using the Medical Information Mart for Intensive Care (MIMIC) database. We found that the Word2Vec, FastText, and GloVe algorithms yield comparable models, while more recent contextual embeddings provide marginal further improvement. Our results provide insights and guidance to practitioners regarding the use of word embedding methods for the analysis of EHR data. | health informatics |
10.1101/2022.03.04.22271937 | VinDr-PCXR: An open, large-scale chest radiograph dataset for interpretation of thoracic diseases in children | Computer-aided diagnosis systems in adult chest radiography (CXR) have recently achieved great success thanks to the availability of large-scale, annotated datasets and the advent of high-performance supervised learning algorithms. However, the development of diagnostic models for detecting and diagnosing pediatric diseases in CXR scans has been hampered by the lack of high-quality physician-annotated datasets. To overcome this challenge, we introduce and release VinDr-PCXR, a new pediatric CXR dataset of 9,125 studies retrospectively collected from a major pediatric hospital in Vietnam between 2020 and 2021. Each scan was manually annotated by a pediatric radiologist with more than ten years of experience. The dataset was labeled for the presence of 36 critical findings and 15 diseases. In particular, each abnormal finding was identified via a rectangular bounding box on the image. To the best of our knowledge, this is the first and largest pediatric CXR dataset containing lesion-level annotations and image-level labels for the detection of multiple findings and diseases. For algorithm development, the dataset was divided into a training set of 7,728 studies and a test set of 1,397. To encourage new advances in pediatric CXR interpretation using data-driven approaches, we provide a detailed description of the VinDr-PCXR data sample and make the dataset publicly available on https://physionet.org/. | health informatics |
10.1101/2022.03.05.22271946 | A scoping systematic review of implementation frameworks to effectively transition interventions into clinical practice in oncology, nuclear medicine and radiology | Background: Clinical research studies have made significant strides globally, requiring clear processes to transition research interventions into clinical practice. Theoretically, implementation of novel interventions requires clear methods as part of a fit-for-purpose (FFP) framework, comprising effective adaptation processes in conjunction with practice policies. Implementation science (IS)-based operational research (OR) is vital in global health, as it addresses the know-how-do gap in a real-world setting to achieve best practices that sustain healthcare. Despite this, limited OR is available to evaluate and validate implementation frameworks for complex clinical specialties such as oncology, diagnostic radiology (DR), nuclear medicine (NM) and interventional radiology (IR). This is the first study to systematically review implementation frameworks, including their validity and applicability in healthcare.
Method: We searched 17 databases, including PubMed, Medline/OvidSP, Science Direct, PROSPERO, PRISMA, PubMed Health, Embase, EBSCOhost, SciELO, TRIP, ProQuest, Academic Search Complete, Ageline, Cochrane, Web of Science and BIOSIS, using a comprehensive search strategy and MeSH indexing to review publications from January 1st 1980 to 31st March 2019 in English. We selected 20 publications as per the inclusion/exclusion criteria developed under a review protocol registered with PROSPERO (CRDG42019124020).
FindingsThere were no publications indicating a validated framework or a specific system used to implement evidence-based interventions (EBIs) within oncology, IR, NM and DR, although there were generalized implementation processes, adaptation models and policies. Furthermore, no validation studies had been conducted against these frameworks to review their applicability and viability in healthcare, especially in the UK.
InterpretationIt is evident that there is a research implementation gap in healthcare, and further research is required to establish a fit-for-purpose framework that covers multiple blind spots in a real-world (RW) setting. Current evidence also suggests that alignment of academic theories with healthcare, including their applicability to various clinical specialties, is needed. | health systems and quality improvement
10.1101/2022.03.06.22271718 | Three separate spike antigen exposures by COVID-19 vaccination or SARS-CoV-2 infection elicit strong humoral immune responses in healthcare workers | As part of our ongoing prospective seroprevalence study, we assessed the SARS-CoV-2 infection and COVID-19 vaccination-induced immunity of 697 hospital workers at the University Medical Center Hamburg-Eppendorf between January 17 and 31, 2022. The overall prevalence of anti-NC-SARS-CoV-2 antibodies indicating prior infection was 9.8% (n=68) and thus lower than the seroprevalence in the general population in Hamburg. At the current study visit, 99.3% (n=692) had received at least one vaccine dose and 93.1% (n=649) had received at least three vaccine doses. All vaccinated individuals had detectable anti-S1-RBD-SARS-CoV-2 antibodies (median AU/ml [IQR]: 13891 [8505 - 23543]), indicating strong protection against severe COVID-19. In addition, we show that individuals who received three COVID-19 vaccine doses (median AU/ml [IQR]: 13856 [8635 - 22705]), and those who resolved a prior SARS-CoV-2 infection and received two COVID-19 vaccine doses (median AU/ml [IQR] 13409 [6934 - 25000]) exhibited the strongest humoral immune responses. The low prevalence of anti-NC-SARS-CoV-2 antibodies indicates persistent effectiveness of established infection control interventions in preventing nosocomial SARS-CoV-2 transmission with the delta and omicron viral variants as predominant strains. Our study further indicates that three exposures to the viral spike protein by either SARS-CoV-2 infection or COVID-19 vaccination are necessary to elicit particularly strong humoral immune responses, which supports current vaccination recommendations. | infectious diseases |
10.1101/2022.03.04.22271870 | Nanopore sequencing for Mycobacterium tuberculosis drug susceptibility testing and outbreak investigation | BackgroundMycobacterium tuberculosis whole-genome sequencing (WGS) using Illumina technology has been widely adopted for genotypic drug susceptibility testing (DST) and outbreak investigation. Oxford Nanopore Technologies sequencing is reported to have higher error rates but has not been thoroughly evaluated for these applications.
MethodsWe analyse 151 isolates from Madagascar, South Africa and England with phenotypic DST and matched Illumina and Nanopore data. Using PacBio assemblies, we select Nanopore filters for detection of single nucleotide polymorphisms (SNPs) with the BCFtools software. We compare transmission clusters identified by Nanopore and by the United Kingdom Health Security Agency Illumina pipeline (COMPASS). We compare Illumina and Nanopore WGS-based DST predictions using the Mykrobe software.
FindingsNanopore/BCFtools identifies SNPs with a median precision/recall of 99.5%/90.2%, compared with 99.6%/91.9% for Illumina/COMPASS. Using a threshold of 12 SNPs for putative transmission clusters, Illumina identifies 98 isolates as unrelated and 53 as belonging to 19 distinct clusters (size range 2-7). Nanopore reproduces this distribution, with the addition of 5 singleton isolates to distinct clusters and the merging of two cluster pairs. Illumina-based clusters are also replicated using a 5 SNP threshold. Clustering accuracy is maintained using mixed Illumina/Nanopore datasets. Genotyping of resistance variants is highly concordant, with 0 discordant SNPs (4 discordant indels) across 151 isolates genotyped at >3,000 SNPs (60,000 indels).
InterpretationIllumina and Nanopore sequence data provide comparable cluster-identification and DST results.
FundingAcademy for Medical Sciences (SGL018\110), Oxford Wellcome Institutional Strategic Support Fund (ISSF TT17 4), Swiss South Africa Joint Research Award (Swiss National Science Foundation and South African National Research Foundation).
Research in context. Evidence before this study: Two key types of information can be obtained from laboratory testing of M. tuberculosis isolates to help directly guide public health interventions: drug susceptibility testing (DST) to guide therapy, and bacterial typing to enrich understanding of the epidemiology and guide interventions to mitigate transmission.
DST is typically performed by the "gold standard" culture-based phenotyping method or by nucleic acid amplification assays targeting specific resistance-conferring mutations. Studies over the last 7 years have shown that prediction of susceptibility profile using Illumina-technology genome sequence data is possible and can be automated. In a key publication, the CRyPTIC consortium and UK 100,000 Genomes project evaluated the method on over 10,000 genomes, including prospectively sampled isolates, and showed that for first-line tuberculosis (TB) drugs (isoniazid, rifampicin, ethambutol, pyrazinamide) a pan-susceptibility profile is accurate enough to be used clinically. The genetic basis of resistance remains imperfectly understood for second-line TB drugs, in particular for new and repurposed drugs (bedaquiline, clofazimine, delamanid, linezolid). Prior work in the field of genotypic DST was heavily based on Illumina technology, which provides short (70-300 base pair) sequence reads of very high quality. Many different software tools (e.g. TBProfiler, Mykrobe, MTBseq, kvarq) have been designed for sequence analysis and genotypic DST. However, the increasingly used Nanopore sequencing platforms yield very different data, with much longer sequence reads (frequently over 1 kb) and higher error rates including systematic biases. To date, very limited evaluation of Nanopore-based drug susceptibility prediction has been performed using the only two compatible tools (Mykrobe (n=5 independent samples), TBProfiler (n=3 independent samples)).
Molecular typing of M. tuberculosis allows lineage identification and detection of putative transmission clusters. In the last decade, multiple M. tuberculosis molecular epidemiology studies have shown how genomic information can complement traditional epidemiology in identifying person-to-person transmission clusters with a high level of resolution. Typically, the number of single nucleotide polymorphism (SNP) disagreements between genomes, or SNP distance, is calculated and single-linkage clustering is performed for genomes falling within retrospectively established transmission thresholds of either 5 or 12 SNPs. Just as with DST, these thresholds were established with Illumina sequencing data. The increased error rate of Nanopore sequencing is believed to inflate SNP distances if standard genome analysis tools are used. Prior to this study, it was unknown what impact this would have on isolate clustering.
Added value of this studyFull-scale adoption of genomic sequencing in tuberculosis reference laboratories has so far taken place in a limited number of settings - England, the Netherlands, and New York State - all using Illumina-based sequencing data. Building on current evidence, specific WHO technical guidance and diversification and democratisation of technology, sequencing is expected to be increasingly used in tuberculosis control globally. For the first time, our study offers 4 key deliverables intended to inform adoption of Nanopore technology as an alternative, or a complement, to Illumina. First: a systematic head-to-head comparison of Nanopore and Illumina data for M. tuberculosis drug susceptibility profiling and isolate clustering, including quantitative metrics for cluster precision and recall. Second: an assessment of the impact of mixed Illumina and Nanopore data on clustering which represents an increasingly common challenge. Third: an open-source software pipeline allowing research and reference laboratories to replicate our analytical approach. Fourth: a publicly available curated test set of 151 isolates, including matched Illumina and Nanopore sequence data, and (for a subset of seven isolates) high-quality PacBio assemblies, for method development and validation.
Implications of all the available evidenceCatalogues of drug resistance-conferring mutations will keep improving, especially for new and repurposed drugs. Our data confirm that Illumina and Nanopore sequencing technologies can be used to identify those mutations equally accurately in M. tuberculosis. Bacterial molecular typing has repeatedly been shown to support the understanding of disease transmission and tuberculosis control in new settings. The bioinformatics tools and filters we have developed, assessed, and made publicly available allow the use of Nanopore or mixed-technology data to appropriately cluster genetically related isolates. We provide a measure of the expected level of over-clustering associated with Nanopore technology. This study confirms that Illumina and Nanopore sequence data provide comparable DST results and isolate cluster identification. | infectious diseases
10.1101/2022.03.06.22271982 | Risk Factor Stratification for Postoperative Delirium: A Retrospective Database Study | BackgroundPostoperative Delirium (POD) is a disturbing reality for patients and their families. Absence of easy-to-use and accurate risk scores prompted us to retrospectively extract data from the electronic health records (EHR) to identify clinical factors associated with POD. We seek to create a multivariate nomogram to predict the risk of POD based upon the most significant clinical factors.
MethodsThe EHRs of patients >18 years of age who underwent surgery and had a POD assessment were reviewed. Patient characteristics and study variables were summarized by delirium group. We constructed univariate logistic regression models for POD using each study variable to estimate odds ratios (OR), and a multivariable logistic regression model with stepwise variable selection. To create a clinically useful and implementable tool, we built a nomogram to predict the risk of delirium.
ResultsOverall, we found a POD rate of 3.7% across our study population. The model achieved an area under the ROC curve (AUC) of 0.83 (95% CI 0.82-0.84). We found that age, increased ASA score (ASA 3-4: OR 2.81, CI 1.49-5.28, p<0.001), depression (OR 1.28, CI 1.12-1.47, p<0.001), postoperative benzodiazepine use (OR 3.52, CI 3.06-4.06, p<0.001) and urgent cases (urgent: OR 3.51, CI 2.92-4.21, p<0.001; emergent: OR 3.99, CI 3.21-4.96, p<0.001; critically emergent: OR 5.30, CI 3.53-7.96, p<0.001) were associated with POD.
ConclusionsWe were able to distinguish the contributions of individual risk factors to the development of POD. We created a clinically useful, easy-to-use tool with the potential to accurately identify those at high risk of delirium, a first step toward preventing POD. | anesthesia
10.1101/2022.03.06.22271984 | Acute temporal effect of ambient air pollution on common congenital cardiovascular defects and cleft palate: a case-crossover study | This symmetric bidirectional case-crossover study examined the association between short-term ambient air pollution exposure during weeks 3-8 of pregnancy and certain common congenital anomalies in Ulaanbaatar, Mongolia, between 2014 and 2018. Using predictions from a Random Forest regression model, we assigned daily ambient exposure to particulate matter <2.5 µm in aerodynamic diameter (PM2.5), sulphur dioxide, nitrogen dioxide, and carbon monoxide for each subject based on their administrative area of residence. We used conditional logistic regression with adjustment for corresponding apparent temperature to estimate the relative odds of select congenital anomalies per interquartile range (IQR) increase in mean concentrations and by quartiles of air pollutants. The adjusted relative odds of cardiovascular defects (ICD-10 subchapter: Q20-Q28) was 2.64 (95% confidence interval: 1.02-6.87) per IQR increase in mean PM2.5 concentration for gestational week 7. This association was further strengthened for cardiac septal defects (ICD-10 code: Q21; odds ratio: 7.28, 95% confidence interval: 1.6-33.09) and isolated ventricular septal defects (ICD-10 code: Q21.0; odds ratio: 9.87, 95% confidence interval: 1.6-60.93). We also observed an increasing dose-response trend when comparing the lowest quartile of air pollution exposure with higher quartiles in weeks 6 and 7 for Q20-Q28 and Q21, and in week 4 for Q21.0. Other notable associations include increased relative odds of the cleft lip and cleft palate subchapter (Q35-Q37) per IQR increase in PM2.5 (OR: 2.25, 95% CI: 0.62-8.1), SO2 (OR: 2.6, 95% CI: 0.61-11.12), and CO (OR: 2.83, 95% CI: 0.92-8.72) in week 4.
Our findings contribute to the limited body of evidence regarding the acute effect of ambient air pollution exposure on common adverse birth outcomes. | epidemiology |
10.1101/2022.03.06.22271884 | The Relationship between Continuity of Care with a Primary Care Provider and Duration of Work Disability in Workers with Low Back Pain: A Retrospective Cohort Study | ObjectiveTo determine the continuity of care (CoC) provided by general practitioners among workers with low back pain; identify personal, workplace and social factors associated with CoC in this population; and investigate if CoC is associated with working time loss.
Data sourcesAn administrative database containing accepted workers' compensation claims and service-level data for workers with back pain from five Australian jurisdictions, injured between July 2010 and June 2015.
Study DesignA retrospective cohort study. Outcomes were CoC with a general practitioner, measured with the Usual Provider Continuity index, and working time loss, measured as the number of weeks for which workers' compensation income support benefits were paid.
Extraction methodsEligible workers had at least four general practitioner services and more than two weeks of working time loss. Usual Provider Continuity index scores were categorised as complete, high, moderate, or low CoC. Ordinal logistic regression models examined factors associated with Usual Provider Continuity category. Quantile regression models examined the association between duration of working time loss and Usual Provider Continuity category in four groups with different volumes of general practitioner services.
Principal FindingsComplete CoC was observed in 33.8% of workers, high CoC among 37.7%, moderate CoC in 22.1%, and low CoC in 6.4%. Higher Usual Provider Continuity was associated with fewer general practitioner services, older age, living in urban areas, an occupation as a Community and Personal Service Worker or Clerical and Administrative Worker, and the state of Victoria. In workers with more than two months of time loss, those with complete CoC consistently had shorter durations of time loss.
ConclusionsHigher CoC with a general practitioner is associated with less working time loss, and this relationship is strongest in the sub-acute phase of low back pain.
Callout Box
What is known on this topic
- Continuity of care is a key component of best-practice primary care
- Low back pain is a condition that often requires ongoing management and care from a general practitioner
- The relationship between continuity of care and work disability duration, recovery and return to work in workers with low back pain is not known
What this study adds
- Workers with low back pain who see the same general practitioner for all services (i.e., have greater continuity of care) generally have shorter durations of working time loss
- Higher continuity of care was observed in workers who had fewer general practitioner services, were aged over 45, lived in urban areas, and worked as Community and Personal Service Workers or Clerical and Administrative Workers
- Workers' compensation systems should consider policies and guidelines that increase continuity of care for injured workers | primary care research
10.1101/2022.03.04.22271871 | mtDNA copy number, mtDNA total somatic deletions, Complex I activity, synapse number and synaptic mitochondria number are altered in schizophrenia and bipolar disorder | Mitochondrial dysfunction is a neurobiological phenomenon implicated in the pathophysiology of schizophrenia and bipolar disorder that can synergistically affect synaptic neurotransmission. We hypothesized that schizophrenia and bipolar disorder share molecular alterations at the mitochondrial and synaptic level: mtDNA copy number (CN), mtDNA common deletion (CD), mtDNA total deletion, complex I activity, synapse number, and synaptic mitochondria number. These mitochondrial parameters were studied in the dorsolateral prefrontal cortex (DLPFC), superior temporal gyrus (STG), primary visual cortex (V1), and nucleus accumbens (NAc) of 44 control (CON), 27 schizophrenia (SZ), and 26 bipolar disorder (BD) subjects. The results showed that: (i) mtDNA CN is significantly higher in the DLPFC of both SZ and BD, decreased in the STG of BD, and unaltered in the V1 and NAc of both SZ and BD; (ii) mtDNA CD is significantly higher in the DLPFC of BD and unaltered in the STG, V1 and NAc of both SZ and BD; (iii) the total deletion burden is significantly higher in the DLPFC of both SZ and BD and unaltered in the STG, V1, and NAc; (iv) complex I activity is significantly lower in the DLPFC of both SZ and BD, with no alteration in the STG, V1 and NAc; (v) the number of synapses is decreased in the DLPFC of both SZ and BD, while the synaptic mitochondria number is significantly lower in female SZ and female BD subjects compared to female controls. Overall, these findings will pave the way to a better understanding of the pathophysiology of schizophrenia and bipolar disorder for therapeutic interventions. | psychiatry and clinical psychology
10.1101/2022.03.07.22271219 | DIAGNOSTIC ACCURACY OF NON-INVASIVE DETECTION OF SARS-COV-2 INFECTION BY CANINE OLFACTION | BACKGROUNDThroughout the COVID-19 pandemic, testing individuals remains a key action. One approach to rapid testing is to consider the olfactory capacities of trained detection dogs.
METHODSProspective cohort study in two community COVID-19 screening centers. Two nasopharyngeal swabs (NPS), one saliva sample and one sweat sample were collected simultaneously. The dog handlers (and the dogs) were blinded to the COVID-19 status. The diagnostic accuracy of non-invasive detection of SARS-CoV-2 infection by canine olfaction was assessed against nasopharyngeal RT-PCR as the reference standard, saliva RT-PCR and nasopharyngeal antigen testing.
RESULTS335 ambulatory adults (143 symptomatic and 192 asymptomatic) were included. Overall, 109/335 participants tested positive on nasopharyngeal RT-PCR, whether symptomatic (78/143) or asymptomatic (31/192). The overall sensitivity of canine detection was 97% (95% CI, 92 to 99), and even reached 100% (95% CI, 89 to 100) in asymptomatic individuals, compared to NPS RT-PCR. The specificity was 91% (95% CI, 72 to 91), reaching 94% (95% CI, 90 to 97) in asymptomatic individuals. The sensitivity of canine detection was higher than that of nasopharyngeal antigen testing (97%, 95% CI 91 to 99, versus 84%, 95% CI 74 to 90; p=0.006), but the specificity was lower (90%, 95% CI 84 to 95, versus 97%, 95% CI 93 to 99; p=0.016).
CONCLUSIONSNon-invasive detection of SARS-CoV-2 infection by canine olfaction could be an alternative to NPS RT-PCR when results are needed very quickly, with the same indications as antigen tests in the context of mass screening. | public and global health
10.1101/2022.03.04.22271541 | Automatic Extraction of Social Determinants of Health from Medical Notes of Chronic Lower Back Pain Patients | BackgroundAdverse social determinants of health (SDoH), or social risk factors, such as food insecurity and housing instability, are known to contribute to poor health outcomes and inequities. Our ability to study these linkages is limited because SDoH information is more frequently documented in free-text clinical notes than in structured data fields. To overcome this challenge, there is a growing push to develop techniques for automated extraction of SDoH. In this study, we explored natural language processing (NLP) and natural language inference (NLI) methods to extract SDoH information from clinical notes of patients with chronic low back pain (cLBP), to enhance future analyses of the associations between SDoH and low back pain outcomes and disparities.
MethodsClinical notes (n=1,576) for patients with cLBP (n=386) were annotated for seven SDoH domains: housing, food, transportation, finances, insurance coverage, marital and partnership status, and other social support, resulting in 626 notes with at least one annotated entity for 364 patients. We additionally labeled pain scores, depression, and anxiety. We used a two-tier taxonomy with these 10 first-level ontological classes and 68 second-level ontological classes. We developed and validated extraction systems based on both rule-based and machine learning approaches. As a rule-based approach, we iteratively configured the clinical Text Analysis and Knowledge Extraction System (cTAKES). We trained two machine learning models (one based on a convolutional neural network (CNN) and one on a RoBERTa transformer), as well as a hybrid system combining pattern matching and bag-of-words models. Additionally, we evaluated a RoBERTa-based entailment model as an alternative technique for SDoH detection in clinical texts, using a model previously trained on general-domain data without additional training on our dataset.
ResultsFour annotators achieved high agreement (average kappa=95%, F1=91.20%). Annotation frequency varied significantly depending on note type. By tuning cTAKES, we achieved a performance of F1=47.11% for first-level classes. For most classes, the machine learning RoBERTa-based NER model performed better (first-level F1=84.35%) than the other models on the internal test dataset. The hybrid system performed slightly worse on average than the RoBERTa NER model (first-level F1=80.27%), matching or outperforming it in terms of recall. Using an out-of-the-box entailment model, we detected many, but not all, challenging wordings missed by the other models, reaching an average F1 of 76.04% while matching or outperforming the tested NER models in several classes. Still, the entailment model may be sensitive to hypothesis wording and may require further fine-tuning.
ConclusionThis study developed a corpus of annotated clinical notes covering a broad spectrum of SDoH classes. This corpus provides a basis for training machine learning models and serves as a benchmark for named entity recognition of SDoH and knowledge extraction from clinical texts. | public and global health
10.1101/2022.03.05.22271959 | 13 cis retinoic acid improved the outcomes of COVID-19 patients. A randomized clinical trial | The COVID-19 pandemic caused by severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) has infected over 100 million people, causing over 2.4 million deaths worldwide, and it is still expanding. Given the urgency of the COVID-19 pandemic, the clinical investigation of approved drugs is a promising alternative for finding a timely effective treatment. In this randomized trial, we investigated the activity of both oral and aerosolized 13 cis retinoic acid added to standard of care treatment in patients with COVID-19 versus standard of care treatment alone. This randomized controlled trial was conducted at Kafrelsheikh University's quarantine hospitals, Egypt. After obtaining informed consent, forty patients with a confirmed diagnosis of COVID-19 were enrolled and randomly assigned to one of two groups: Group I, 20 patients who received aerosolized and oral 13 cis retinoic acid plus standard of care treatment (13 cis RA group), and Group II, 20 patients who received standard of care treatment alone as a control group. The two groups were age- and gender-matched, with no statistically significant difference between them in any baseline characteristic or laboratory parameter. The results showed a highly significant difference between the two groups in intensive care unit (ICU) admission, mortality and improvement (P<0.05). Only 10.52% of patients in the 13 cis retinoic acid group needed ICU admission compared to 28.57% in the control arm. There was no mortality in the 13 cis retinoic acid group, whereas 14.35% of patients in Group II died. All patients who received 13 cis retinoic acid showed marked improvement (P<0.001), with a mean time to clinical improvement of 16.3 ± 4.5 days.
There was no significant difference in laboratory parameters before and after 14 days of treatment in the group of patients who received standard of care treatment (P=0.66). Univariate logistic regression analysis showed that overall mortality was significantly related to patient age, serum ferritin, C-reactive protein, oxygen saturation, and the presence of diabetes mellitus, obesity, and abdominal pain. We conclude that 13 cis retinoic acid is a promising drug in the treatment of patients with COVID-19 infection when added to standard of care treatment. | respiratory medicine
10.1101/2022.03.05.22271957 | Recovery of kicking kinematics and performance following intermittent high-intensity running bouts in young soccer players: can a local cooling intervention help? | Repeated high-intensity running (RHIR) exercise is known to affect central and peripheral functioning. Declines in RHIR performance are exacerbated by environmental heat stress. Accordingly, the use of post-exercise cooling strategies (COOL) is recommended as it may assist recovery. The present study aimed to investigate, in a hot environment (>30°C), the effects of local COOL following RHIR on indices of soccer kicking movement and performance in youth soccer. Fifteen academy under-17 players (16.27 ± 0.86 years old; all post-PHV), acting as their own controls, participated. In Experiment 1, players completed an all-out RHIR protocol (10 x 30 m bouts interspersed with 30 s intervals). In Experiment 2, the same players performed the same running protocol under two conditions: 1) 5 minutes of COOL, in which ice packs were applied to the quadriceps and hamstrings regions, and 2) a control condition involving only passive rest. In both experiments, perceptual measures [ratings of perceived exertion (RPE), pain and recovery], thigh temperature, kick-derived video kinematics (hip, knee, ankle and foot) and performance (ball speed and placement) were collected at baseline and post exercise and intervention. In the first experiment, RHIR led to moderate-to-large increases (p < 0.03) in RPE (d = 4.08), ankle eversion/inversion angle (d = 0.78) and mean radial error (d = 1.50), and small-to-large decreases (p < 0.04) in recovery (d = -1.83) and average/peak ball speeds (d = -0.42 to -0.36). In the second experiment, RPE (p < 0.01; Kendall's W = 0.30) and mean radial error (p = 0.057; η2 = 0.234) increased only post-control. Significant small declines in ball speed were also observed only post-control (p < 0.05; d = 0.35).
Post-intervention CMfoot velocity was moderately faster in COOL compared to control (p = 0.04; d = 0.60). RHIR acutely impaired kicking movement, ball speed and placement in youth soccer players. However, a short period of local cryotherapy may be beneficial in counteracting declines in indices of kicking performance in a hot environment.
Trial registration number#RBR-8prx2m - ReBEC Brazilian Clinical Trials Registry | sports medicine |
10.1101/2022.03.06.22271981 | Increased variance in second electrode accuracy during deep brain stimulation and its relationship to pneumocephalus, brain shift, and clinical outcomes: a retrospective cohort study. | IntroductionAccurate placement of deep brain stimulation electrodes within the intended target is believed to be a key variable related to outcomes. However, methods to verify electrode location are not universally established.
Research QuestionThe aim of this study was to determine the feasibility of post-operative lead localisation in clinical practice and its utility for auditing our own DBS accuracy.
Material and MethodsA retrospective cohort study was performed on a consecutive series of patients with Parkinson's disease who underwent deep brain stimulation of either the globus pallidus internus (GPi) or the subthalamic nucleus (STN) between 2016 and 2019. Image processing was performed using the Lead-DBS toolbox. Institutional ethical approval was granted as a review of service.
ResultsIn total, 38 participants met the inclusion criteria. Electrode localisation was completed in 79%. Clinical outcomes included improvements of 46% in UPDRS III and 32% in PDQ-39. Overall electrode accuracy was 0.22 ± 0.4 mm for all electrodes to the main nucleus, with 9 (12%) outliers but only 3 (4%) electrodes outwith 2 mm of the intended target. Accuracy was worse for the second electrode implanted and in the GPi, but was not affected by pneumocephalus or brain shift. Neither clinical outcomes nor the volume of activated tissue was affected by electrode accuracy.
Discussion and ConclusionsA neuroimaging approach to electrode localisation allows qualitative appraisal of targeting accuracy and is feasible with routine clinical data. Such methods are complementary to traditional co-ordinate approaches and lend themselves to developing large, collaborative, and quantitative projects.
HIGHLIGHTSO_LIOverall electrode accuracy was 0.22 +/- 0.4mm with only 3 (4%) electrodes outwith 2mm from the intended target
C_LIO_LIAccuracy was significantly worse in the GPi versus the STN and on the second side implanted
C_LIO_LIInaccuracy occurred in the X (lateral) plane but was not related to pneumocephalus or brain shift
C_LI | neurology |
10.1101/2022.03.06.22271725 | A multi-phenotype analysis reveals 19 novel susceptibility loci for basal cell carcinoma and 15 for squamous cell carcinoma. | Basal cell carcinoma (BCC) and squamous cell carcinoma (SCC) are the most common forms of skin cancer. There is genetic overlap between skin cancers, pigmentation traits, and autoimmune diseases. We use linkage disequilibrium score regression to identify 20 traits (melanoma, pigmentation traits, autoimmune diseases, and blood biochemistry biomarkers) with a high genetic correlation (rg > 10%, P < 0.05) with BCC (20,791 cases and 286,893 controls in the UK Biobank) and SCC (7,402 cases and 286,892 controls in the UK Biobank), and use a multi-trait genetic analysis to identify 78 and 69 independent genome-wide significant (P < 5 x 10-8) susceptibility loci for BCC and SCC respectively; 19 BCC and 15 SCC loci are both novel and replicated (P < 0.05) in a large independent cohort, 23andMe, Inc (BCC: 251,963 cases and 2,271,667 controls, and SCC: 134,700 cases and 2,394,699 controls). Novel loci are implicated in BCC/SCC development and progression (e.g. CDKL1), pigmentation (e.g. DSTYK), cardiometabolic pathways (e.g. FADS2), and immune-regulatory pathways including innate immunity against coronaviruses (e.g. IFIH1), and HIV-1 viral load modulation and disease progression (e.g. CCR5). We also report a powerful and optimised BCC polygenic risk score that enables effective risk stratification for keratinocyte cancer in a large prospective Canadian Longitudinal Study of Aging (794 cases and 18,139 controls); e.g. percentage of participants reclassified: MTAGPRS = 36.57%, 95% CI = 35.89-37.26%, versus UKBPRS = 33.23%, 95% CI = 32.56-33.91%. | oncology
10.1101/2022.03.04.22270834 | Assessment of an AI-based triage and notification system in detecting clinical signs of retinal diseases | PurposeThis study evaluates the performance of an artificial intelligence based triage and notification system that analyzes fundus photographs for nine signs: cotton wool spots, dot & blot hemorrhages, drusens, flame shaped hemorrhages, glaucomatous disc, hard exudates, retinal neovascularization, preretinal hemorrhage and vascular tortuosity. These signs may be present in multiple retinal diseases.
MethodsIn a blinded and adjudicated study, a set of 3484 photographs of unique eyes from 3305 patients, from 15 fundus cameras, were graded by retina specialists, and the results compared with an AI-based system.
ResultsThe AI performed at a mean sensitivity of 90.19% and a mean specificity of 88.38% across all signs. The best performance was in detecting glaucomatous disc with a sensitivity of 94.65% and a specificity of 95.36%. The worst performance for sensitivity was for detecting vascular tortuosity at 85.06%, and that for specificity was for detecting drusens at 85.21%.
ConclusionThe AI-based system performs at acceptable sensitivity and specificity levels in comparison to retina specialists in a large sample pooled across 15 fundus cameras for 9 different clinical signs. | ophthalmology |
10.1101/2022.03.04.22271916 | Therapeutic effects of Rosa Canina, Urtica Dioica and Tanacetum Vulgare Herbal Combination in Treatment of Tinnitus Symptoms; A Double-blind Randomized Clinical Trial | BackgroundTinnitus is defined as the perception of sound in the ear or head in the absence of an external stimulus for which we have no definite treatment. Neurotec(R) is a medication of herbal origin with IFDA approval. Previous studies showed the neuroprotective effect of Neurotec(R). In this study we evaluated the effectiveness of Neurotec in improving tinnitus symptoms.
MethodsThis double-blind randomized clinical trial was performed on patients with tinnitus. Patients received Neurotec 100 mg capsules (BID) or placebo for three months. Pure tone audiometry (PTA) was measured at 0.5, 1, 2, 4 and 6 KHz frequencies. Using a Tinnitus Handicap Inventory (THI) questionnaire, tinnitus loudness, daily annoyance, daily life or sleep disturbance, daily perception and mood alteration were evaluated.
ResultsFinally, 103 (69 male and 34 female) patients with a mean age of 51.33{+/-}13.91 years were analyzed. There was no significant difference between the intervention (n=53) and the control group (n=50) regarding baseline symptoms before and one month after the intervention (P>0.05). However, they were significantly different three months after the intervention (P<0.05). The mean pure tone air and bone conduction were not significantly different between the control and the intervention group before and three months after the intervention at 0.5, 1, 2 and 4 kHz (P>0.05). The mean pure tone air conduction was not significantly different between the two groups before and three months after the intervention at 6 kHz (P>0.05).
ConclusionA three-month treatment with Neurotec Capsules beside patient education can effectively control symptoms of patients with tinnitus. | otolaryngology |
10.1101/2022.03.04.22271900 | Attempt to replicate voxel-based morphometry analysis in fibromyalgia: Detection of below threshold differences framed by contributions of variable clinical presentation to low reproducibility. | BackgroundFibromyalgia is a prevalent chronic pain condition characterized by widespread pain and sensory hypersensitivity. While much remains unknown about fibromyalgia's neurobiological underpinnings, central nervous system alterations appear to be heavily implicated in its pathophysiology. Previous research examining brain structural abnormalities associated with fibromyalgia has yielded inconsistent findings. Thus, we followed previous methods to examine brain gray matter differences in fibromyalgia. We hypothesized that, relative to healthy controls, participants with fibromyalgia would exhibit lower gray matter volume in regions consistently implicated in fibromyalgia: the anterior cingulate cortex and medial prefrontal cortex.
MethodsThis study used magnetic resonance imaging to evaluate regional and whole brain differences in gray matter among females with and without fibromyalgia. Group differences were analyzed with two-sample t-tests, controlling for total intracranial volume.
ResultsNo significant differences in regional or whole brain gray matter volumes were detected between fibromyalgia and healthy controls.
ConclusionsResults add to an existing body of disparate findings regarding brain gray matter volume differences in fibromyalgia, and suggest structural differences previously detected in fibromyalgia should be examined for reproducibility. Absent significant differences may also suggest that functional, but not structural, brain adaptations are primarily associated with fibromyalgia. | pain medicine |
10.1101/2022.03.04.22271003 | Cross-Ancestry Investigation of Venous Thromboembolism Genomic Predictors | Venous thromboembolism (VTE) is a complex disease with environmental and genetic determinants. We present new cross-ancestry meta-analyzed genome-wide association study (GWAS) results from 30 studies, with replication of novel loci and their characterization through in silico genomic interrogations. In our initial genetic discovery effort that included 55,330 participants with VTE (47,822 European, 6,320 African, and 1,188 Hispanic ancestry), we identified 48 novel associations of which 34 replicated after correction for multiple testing. In our combined discovery-replication analysis (81,669 VTE participants) and ancestry-stratified meta-analyses (European, African and Hispanic), we identified another 44 novel associations, which are new candidate VTE-associated loci requiring replication. In total, across all GWAS meta-analyses, we identified 135 independent genomic loci significantly associated with VTE risk. We also identified 31 novel transcript associations in transcriptome-wide association studies and 8 novel candidate genes with protein QTL Mendelian randomization analyses. In silico interrogations of hemostasis and hematology traits and a large phenome-wide association analysis of the 135 novel GWAS loci provided insights to biological pathways contributing to VTE, indicating that some loci may contribute to VTE through well-characterized coagulation pathways while others provide new data on the role of hematology traits, particularly platelet function. Many of the replicated loci are outside of known or currently hypothesized pathways to thrombosis. In summary, these findings highlight new pathways to thrombosis and provide novel molecules that may be useful in the development of antithrombosis treatments with reduced risk of bleeds. | genetic and genomic medicine |
10.1101/2022.02.27.22271072 | Navigating life with HIV as an older adult on the Kenyan coast: perceived health challenges seen through the biopsychosocial model | BackgroundKenya, like many sub-Saharan African countries (SSA), is experiencing a rise in the number of HIV infected adults aged [≥]50 years (recognized as older adults living with HIV [OALWH]). This trend has created a subgroup of vulnerable older adults demanding a prompt response in research, policy, and practice to address their complex and transitioning needs. Unfortunately, little is known about the health and wellbeing of these adults in Kenya. As such, we explore the experiences of OALWH and key stakeholders at the coast of Kenya to understand the health challenges facing the OALWH.
Material and methodsWe utilized the biopsychosocial model to explore views from 34 OALWH and 22 stakeholders (11 health care providers and 11 primary caregivers) on the physical, mental, and psychosocial health challenges of ageing with HIV in Kilifi County, Kenya, between October and December 2019. Data were drawn from semi-structured in-depth interviews, which were audio-recorded and transcribed. A framework approach was used to synthesize the data.
ResultsSymptoms of common mental disorders (e.g. stress, worry, thinking too much), comorbidities (especially ulcers/hyperacidity, hypertension, visual and memory difficulties), somatic symptoms (especially pain/body aches, fatigue, and sleep problems), financial difficulties, stigma, and discrimination were viewed as common across the participants. Suicidal ideation and substance use problems (especially mnazi - the local palm wine and ugoro - snuff) were also raised. There was an overlap of perceived risk factors across the three health domains, such as family conflicts, poverty, lack of social support, stigma, and the presence of comorbid health complaints.
ConclusionOur findings provide a preliminary understanding of challenges, using the biopsychosocial model, facing OALWH in a low-literacy Kenyan setting. We found that OALWH at the Kenyan coast are at risk of multiple physical, mental, and psychosocial challenges, likely affecting their HIV treatment and overall health. Before programmes can have any lasting impact on these adults, improved access to basic needs, including food, financial support, and caregiving, and a reduction of stigma and discrimination must be addressed. Future research should quantify the burden of these challenges and examine the resources available to these adults before piloting and testing feasible interventions. | hiv aids |
10.1101/2022.03.06.22271809 | Defining factors that influence vaccine-induced, cross-variant neutralizing antibodies for SARS-CoV-2 in Asians | The scale and duration of neutralizing antibody responses targeting SARS-CoV-2 viral variants represents a critically important serological parameter that predicts protective immunity for COVID-19. In this study, we present longitudinal data illustrating the impact of age, sex and comorbidities on the kinetics and strength of vaccine-induced neutralizing antibody responses for key variants in an Asian volunteer cohort. We demonstrate a reduction in neutralizing antibody titres across all groups six months post-vaccination and show a marked reduction in the serological binding and neutralizing response targeting Omicron compared to other viral variants. We also highlight the increase in cross-protective neutralizing antibody responses against Omicron induced by a third dose (booster) of vaccine. These data illustrate how key virological factors such as immune escape mutation combined with host factors such as age and sex of the vaccinated individuals influence the strength and duration of cross-protective serological immunity for COVID-19. | allergy and immunology |
10.1101/2022.03.04.22271898 | Impact on alcohol selection and purchasing of increasing the proportion of non-alcoholic versus alcoholic drinks: randomised controlled trial | ObjectiveIncreasing the availability of non-alcoholic options is a promising population-level intervention to reduce alcohol consumption, currently unassessed in naturalistic settings. This study in an online retail setting aimed to estimate the impact of increasing the proportion of non-alcoholic (relative to alcoholic) drinks, on selection and actual purchasing of alcohol.
DesignParallel-group randomised controlled trial.
SettingParticipants selected drinks in a simulated online supermarket, before purchasing them in an actual online supermarket.
ParticipantsAdults in England and Wales who regularly consumed and purchased beer and wine online (n=737).
InterventionParticipants were randomised to one of three groups: Higher Proportion of non-alcoholic drinks available (75%); Same Proportion (50%); Lower Proportion (25%).
Main outcome measureThe primary outcome was the number of alcohol units selected (with intention to purchase); secondary outcomes included purchasing.
Results607 participants completed the study and were included in the primary analysis. The Higher Proportion group selected 10.0 fewer alcohol units than the Lower Proportion group (-32%; 95%CI -42%,-22%) and 7.1 fewer units compared to the Same Proportion group (-25%; 95%CI -36%,-13%), based on model results in those selecting any drinks containing alcohol (559/607). There was no evidence of a difference between the Same Proportion and Lower Proportion groups (2.9 fewer units, -9%; 95%CI -22%,5%). For all other outcomes, alcohol selection and purchasing were consistently lowest in the Higher Proportion group.
ConclusionThis study provides evidence that substantially increasing the proportion of non-alcoholic drinks - from 25% to 50% or 75% - meaningfully reduces alcohol selection and purchasing. Further studies are warranted to assess whether these effects are realised in a range of real-world settings. | public and global health |
10.1101/2022.03.04.22271923 | Factors affecting zero-waste behaviors: Focusing on the health effects of microplastics | Microplastics harm human health. Therefore, the present study assessed the knowledge and attitude of university students towards reducing microplastic use and examined their zero-waste behaviors. Our results lay the foundation for program development aimed at promoting zero-waste activities. The study was conducted from August 20, 2021, to September 10, 2021, including students at a university in G metropolitan city. Questions were developed to verify how the use of disposables and the knowledge, attitude, and behaviors related to zero-waste were affected after the COVID-19 pandemic. A survey was conducted with 197 students, and the data of 196 students were analyzed. Family type ({beta}=0.146, p=0.042) and usage of disposables ({beta}=0.158, p=0.049) were the factors affecting zero-waste behavior in Model 1. In Model 2, which included the subcategory of zero-waste knowledge, the health effects of microplastics ({beta}=0.197, p=0.008) and environmental preservation ({beta}=0.236, p=0.001) were significant factors. In Model 3, which included the subcategory of zero-waste attitude, the health effects of microplastics ({beta}=0.149, p=0.016), use of eco-friendly products ({beta}=0.342, p<0.001), and environmental preservation ({beta}=0.317, p<.001) were significant factors. Therefore, additional studies and education on the health effects of microplastics are warranted, and suitable alternatives for disposables must be developed. | nursing |
10.1101/2022.03.04.22271927 | Proteogenomic analysis identifies the persistence of targetable lesions and proteomes through disease evolution in pediatric acute lymphoblastic leukemia | Childhood acute lymphoblastic leukemia (ALL) genome landscapes indicate that relapses often arise from subclonal outgrowths. However, the impact of clonal evolution on the actionable proteome and response to targeted therapy is not known. Here, we present a comprehensive retrospective analysis of paired ALL diagnosis and relapse specimens. Targeted next generation sequencing and proteome analysis indicated persistence of genome variants and stable proteomes through disease progression. Although variant-targeted therapies showed poor selectivity against viably-frozen tumor samples, paired samples showed high correlation of drug response. Proteome analysis prioritized PARP1 as a new pan-ALL target candidate; diagnostic and relapsed ALL samples demonstrated robust sensitivity to treatment with two PARP1/2 inhibitors. Together, these findings support initiating prospective precision oncology approaches at ALL diagnosis and emphasize the need to incorporate proteome analysis to prospectively determine tumor sensitivities, which are likely to be retained at disease relapse.
STATEMENT OF SIGNIFICANCEWe discover that disease progression and evolution in pediatric acute lymphoblastic leukemia is defined by the persistence of targetable genomic variants and stable proteomes, which reveal pan-ALL target candidates. Thus, personalized treatment options in childhood ALL may be improved with the incorporation of prospective proteogenomic approaches initiated at disease diagnosis. | oncology |
10.1101/2022.03.03.22271851 | Familial Clustering and Genetic Analysis of Severe Thumb Carpometacarpal Joint Osteoarthritis in a Large Statewide Cohort | ObjectivesThe objectives of this study are to 1) identify individuals that required surgery for thumb carpometacarpal osteoarthritis (CMCJ OA), 2) determine if CMCJ OA clusters in multigenerational families, 3) define the magnitude of familial risk of CMCJ OA, 4) identify risk factors associated with CMCJ OA and 5) identify rare genetic variants that segregate with familial CMCJ OA.
MethodsWe searched the Utah Population Database to identify a cohort of CMCJ OA patients that required a surgical procedure (CMC fusion or arthroplasty). Affected individuals were mapped to pedigrees to identify high-risk multigenerational families with excess clustering of CMCJ OA. Cox regression models were used to calculate familial risk of CMCJ OA in related individuals. Risk factors were evaluated using logistic regression models. Whole exome sequencing was used to identify a rare coding variant associated with familial CMCJ OA.
ResultsWe identified 550 pedigrees with excess clustering of severe CMCJ OA. The relative risk of developing CMCJ OA requiring surgical treatment was significantly elevated in first- and third-degree relatives of affected individuals, and significant associations with advanced age, female sex, obesity, and tobacco use were observed. A dominantly segregating, rare variant in CHSY3 was associated with familial CMCJ OA.
ConclusionsFamilial clustering of severe CMCJ OA was observed in a statewide population. Identification of a candidate gene indicates a genetic contribution to the etiology of the disease. Our data indicate that genetic and environmental factors contribute to the disease process, further highlighting the multifactorial nature of the disease.
Key messagesO_LIWe study a unique cohort of individuals requiring surgical management of CMCJ OA.
C_LIO_LISevere CMCJ OA clusters in large, multigenerational families indicating a genetic contribution to the disease.
C_LIO_LIWe discovered a dominant coding variant in CHSY3 in a family with severe CMCJ OA.
C_LI | orthopedics |
10.1101/2022.03.04.22271919 | Determinants of Neonatal Sepsis Admitted In Neonatal Intensive Care Unit At Public Hospitals Of Kaffa Zone, South West Ethiopia | BackgroundNeonatal sepsis is a systemic inflammatory response syndrome in the presence of infection during the first 28 days of life. Globally, every year about 4 million children die in the first 4 weeks of life, of which 99% of the deaths occur in low and middle income countries, and neonatal sepsis is among the most common causes of neonatal death in Ethiopia. Identification of the determinants of neonatal sepsis and treatment of newborns with sepsis are not adequate in low income countries like Ethiopia, especially in the southern part of the country.
ObjectiveTo identify determinants of neonatal sepsis admitted in neonatal intensive care unit at public hospitals of Kaffa zone, south west Ethiopia 2021.
MethodsAn institution-based unmatched case-control study was conducted on a total sample of 248 (62 cases and 186 controls) in public hospitals of Kaffa zone from March to April 2021. The collected data were entered, coded and cleaned in Epidata version 3.1 and exported to SPSS version 25. Bivariable and multivariable logistic regression analyses were conducted. Variables with p<0.25 in the bivariable logistic regression analysis were entered into the multivariable logistic regression, and determinants were declared statistically significant at P<0.05.
ResultA total of 248 (62 cases and 186 controls) were included in the study. Variables such as prolonged rupture of membrane [≥]18 hours [AOR =5.13, 95%CI=1.38-19.05], meconium stained amniotic fluid [AOR =6.03, 95%CI=2.16-16.90], intra-partum fever [AOR =8.26, 95%CI=3.12-21.97], urinary tract infections [AOR=14.55, 95%CI=4.91-43.10], breast feeding after an hour [AOR =3.9, 95%CI=1.27-12.02], resuscitation [AOR =13.25, 95%CI=3.44-51.01], and no chlorhexidine application [AOR =4.27, 95%CI=1.65-11.08] were significantly associated with neonatal sepsis.
Conclusion and RecommendationAmong the variables, prolonged rupture of membranes, meconium stained amniotic fluid, intra-partum fever, UTI/STI, and not breastfeeding within an hour were maternal variables, and resuscitation at birth and no application of chlorhexidine ointment on the umbilicus were neonatal variables found to be risk factors for neonatal sepsis. Infection prevention strategies and clinical management need to be strengthened and/or implemented, paying special attention to the specified determinants. | health systems and quality improvement
10.1101/2022.03.07.22271994 | Preferences for the management of sexually transmitted infections in the South African health system: A discrete choice experiment | IntroductionYoung people have a disproportionate burden of sexually transmitted infections. Despite strengthening HIV prevention with the introduction of PrEP, STI services have remained relatively unchanged, and the standard of care remains syndromic management. We used a discrete choice experiment to investigate young people's preferences for the diagnosis and treatment of STIs in South Africa.
Methods and FindingsBetween 1 March 2021 and 20 April 2021, a cross-sectional online questionnaire hosted on REDCap was administered through access links sent to WhatsApp support groups for HIV PrEP users and attendees of two primary healthcare clinics and two mobile facilities in the Eastern Cape and Gauteng provinces aged between 18-49 years. Participants either self-completed the questionnaire or received support from a research assistant. We used a CLOGIT model for the initial analysis and latent class model (LCM) to establish class memberships with results displayed as odds ratios and probabilities.
We enrolled 496 individuals, the majority were female (69%) and <30 years (74%). About 29% reported previous STI treatment and 20% reported current STI symptoms.
The LCM showed two distinct groups within the respondent sample with different preferences for STI care. The first group comprising 68% of participants showed a strong preference for self-sampling compared to sampling by a healthcare professional (HCP) [OR 2.32; 95%-CI (1.79-3.00)] and viewed no sampling as similar to HCP sampling [OR 1.08; 95%-CI (0.92-1.26)]. There was a lower preference to receive results within 4 hours versus 2 hours [OR 0.63; 95%-CI (0.51-0.77)] and the later was viewed as equal to the receipt of results in 1-7 days by SMS or online [OR 1.03; 95%-CI (0.88-1.21). A clinic follow-up appointment for treatment was less preferable than same-day treatment [OR 0.78; 95%-CI (0.63-0.95)] while treatment from a local pharmacy was viewed with equal preference as same-day treatment [OR 1.16; 95%-CI (1.04-1.29)]. Contact slip from index patient [OR 0.86; 95%-CI (0.76-0.96)] and HCP-initiated partner notification [OR 0.63; 95%-CI (0.55-0.73)] were both less preferable than expedited partner treatment (EPT). The second group included 32% of participants with a much lower preference for self-sampling compared to sampling by HCP [OR 0.55; 95%-CI (0.35-0.86)]. No sampling was not significantly different to HCP-sampling [OR 0.85; 95%-CI (0.64-1.13)]. There was a strong preference for a 4-hour wait than a 2-hour wait for results [OR 1.45; 95%-CI (1.05-2.00)]. There was no treatment option that was significantly different from the others, however there was a strong preference for HCP-initiated partner notification than EPT [OR 1.53; 95%-CI (1.10-2.12)]. Participants were more likely to be members of group 1 than group 2 if they were aged 25-49 years compared to 18-24 years (p=0.001) and receive care from a rural compared to urban facility (p=0.011). Employed individuals were more likely to be in group 2 than group 1 (p=0.038).
ConclusionsOur results suggest that health service users preferred to undergo STI testing prior to treatment but there were subgroups who differed on how this should be done. This highlights the need for STI care to be flexible to accommodate different patient choices. | sexual and reproductive health |
10.1101/2022.03.06.22271294 | Prion protein gene mutation detection using long-read Nanopore sequencing | Prion diseases are fatal neurodegenerative conditions that affect humans and animals. Rapid and accurate sequencing of the prion gene PRNP is paramount to human prion disease diagnosis and for animal surveillance programmes. Current methods for PRNP genotyping involve sequencing of small fragments within the protein-coding region. The contribution of variants in the non-coding regions of PRNP including large structural changes is poorly understood. Here we use long-range PCR and Nanopore sequencing to sequence the full length of PRNP, including its regulatory region, in 25 samples from blood and brain of individuals with various prion diseases. Nanopore sequencing detected the same variants as identified by Sanger sequencing, including repeat expansions/contractions. Nanopore identifies additional single-nucleotide variants in the non-coding regions of PRNP, but no novel structural variants were discovered. Finally, we explore somatic mosaicism of PRNP's octapeptide repeat region, which is a hypothetical cause of sporadic prion disease. While we find changes consistent with somatic mutations, we demonstrate that they may have been generated by the PCR. Our study illustrates the accuracy of Nanopore sequencing for rapid, in-field prion disease diagnosis and highlights the need for single-molecule sequencing methods for the detection of somatic mutations. | neurology
10.1101/2022.03.02.22271711 | Understanding frailty: probabilistic causality between components and their relationship with death through a Bayesian network and evidence propagation | Identifying relationships between components of an index helps to have a better understanding of the condition they define. The Frailty Index (FI) measures the global health of individuals and can be used to predict outcomes such as mortality. Previously, we modelled the relationship between the FI components (deficits) and death through an undirected graphical model, analyzing their relevance from a social network analysis framework. Here, we model the FI components and death through an averaged Bayesian network obtained through a structural learning process and resampling, to understand how the FI components and death are causally related. We identified that the components are not similarly related between them and that deficits are related according to their type. Two deficits were the most relevant in terms of their connections and two others were directly associated with death. We also obtained the strength of the relationships to identify the most plausible, identifying clusters of heavily related deficits. Finally, we propagated evidence (assigned values to all deficits) and studied how FI components predict mortality, obtaining a correct assignation of almost 74% and a sensitivity of 56%. As a classifier of death, the greater the number of deficits included as evidence, the better the performance; however, the FI does not appear to classify deceased individuals well. | geriatric medicine
10.1101/2022.03.08.22271822 | Selection of optimum formulation of RBD-based protein sub-unit covid19 vaccine (Corbevax) based on safety and immunogenicity in an open-label, randomized Phase-1 and 2 clinical studies | BackgroundWe present the data from an open-label study involving the selection of the optimum formulation of an RBD-based protein sub-unit COVID-19 vaccine.
MethodsA randomized Phase-1/2 trial followed by a Phase-2 trial was carried out to assess the safety and immunogenicity of different formulations of the COVID-19 vaccine (Corbevax) and to select an optimum formulation for a Phase-3 study. Healthy adults without a history of COVID-19 vaccination or SARS-CoV-2 infection were enrolled.
FindingsA low incidence of adverse events was reported post-vaccination across the different Corbevax formulations; the majority were mild in nature, and no Grade-3 or serious adverse events were observed. All formulations in the Phase-1/2 study showed a similar profile of humoral and cellular immune response, with higher responses associated with increasing CpG1018 adjuvant content at the same RBD protein content. Hence, a higher concentration of CpG1018 was tested in the Phase-2 study, which showed significant improvement in immune responses in terms of anti-RBD-IgG concentrations, anti-RBD-IgG1 titers, nAb titers and cellular immune responses while maintaining the safety profile. Interestingly, binding and neutralizing antibody titers persisted consistently until 6 months post second vaccine dose.
InterpretationsCorbevax was well tolerated with no observed safety concerns. Neutralizing antibody titers were suggestive of high vaccine effectiveness compared with human convalescent plasma or protective thresholds observed during vaccine efficacy trials of other COVID-19 vaccines. The study was prospectively registered with clinical trial registry of India-CTRI/2021/06/034014 and CTRI/2020/11/029032.
FundingBill & Melinda Gates Foundation, BIRAC-division of Department of Biotechnology, Govt of India, and the Coalition for Epidemic Preparedness Innovations funded the study. | infectious diseases |
10.1101/2022.03.08.22271681 | Multi-steroid profiling by uPLC-MS/MS with post-column infusion of ammonium fluoride | BackgroundMulti-steroid profiling is a powerful analytical tool that simultaneously quantifies steroids from different biosynthetic pathways. Here we present an ultra-high performance liquid chromatography-tandem mass spectrometry (uPLC-MS/MS) assay for the profiling of 25 steroids using post-column infusion of ammonium fluoride.
MethodsFollowing liquid-liquid extraction, steroids were chromatographically separated over 5 minutes using a Phenomenex Luna Omega C18 column and a water (0.1 % formic acid) methanol gradient. Quantification was performed on a Waters Acquity uPLC and Xevo(R) TQ-XS mass spectrometer. Ammonium fluoride (6 mmol/L, post-column infusion) and formic acid (0.1 % (vol/vol), mobile phase additive) were compared as additives to aid ionisation.
ResultsPost-column infusion (PCI) of ammonium fluoride (NH4F) enhanced ionisation in a steroid structure-dependent fashion compared to formic acid (122-140% for 3{beta}OH-{Delta}5 steroids and 477-1274% for 3-keto-{Delta}4 steroids). Therefore, we fully analytically validated PCI with NH4F. Lower limits of quantification ranged from 0.28 to 3.42 nmol/L; 23 of 25 analytes were quantifiable with acceptable accuracy (bias range -14% to 11.9%). Average recovery ranged from 91.6% to 113.6% and average matrix effects from -29.9% to 19.9%. Imprecision ranged from 2.3% to 23.9% for all analytes and was <15% for 18/25 analytes. The serum multi-steroid profile of 10 healthy men and 10 healthy women was measured.
ConclusionsuPLC-MS/MS with post-column infusion of ammonium fluoride enables comprehensive multi-steroid profiling through enhanced ionisation particularly benefiting the detection of 3-keto-{Delta}4 steroids.
HighlightsO_LIThis multi-steroid profiling assay quantifies 25 steroids in 5.5 minutes
C_LIO_LIPost-column infusion of NH4F enhances the ionisation of 3-keto-{Delta}4 steroids
C_LIO_LIThe assay simultaneously quantifies steroids from several biosynthetic pathways
C_LIO_LIWe present analytical data validated for serum steroid profiling
C_LI | endocrinology |
10.1101/2022.03.07.22271699 | Severity of maternal SARS-CoV-2 infection and perinatal outcomes during the Omicron variant dominant period: UK Obstetric Surveillance System national cohort study. | ObjectivesTo describe the severity of maternal infection when the Omicron SARS-CoV-2 variant was dominant (15/12/21-14/01/22) and compare outcomes among groups with different vaccination status.
DesignProspective cohort study
SettingUK consultant-led maternity units
ParticipantsPregnant women hospitalised with a positive SARS-CoV-2 PCR test up to 7 days prior to admission and/or during admission up to 2 days after giving birth.
Main outcome measuresSymptomatic or asymptomatic infection. Vaccination status. Severity of maternal infection (moderate or severe infection according to modified WHO criteria). Mode of birth and perinatal outcomes.
ResultsOut of 1561 women admitted to hospital with SARS-CoV-2 infection, 449 (28.8%) were symptomatic. Among symptomatic women admitted, 86 (19.2%) had moderate to severe infection; 51 (11.4%) had pneumonia on imaging, 62 (14.3%) received respiratory support, and 19 (4.2%) were admitted to the intensive care unit (ICU). Three women died (0.7%). Vaccination status was known for 383 symptomatic women (85.3%); 249 (65.0%) were unvaccinated, 45 (11.7%) had received one vaccine dose, 76 (19.8%) had received two doses and 13 (3.4%) had received three doses. 59/249 (23.7%) unvaccinated women had moderate to severe infection, compared to 10/45 (22.2%) who had one dose, 9/76 (11.8%) who had two doses and 0/13 (0%) who had three doses. Among the 19 symptomatic women admitted to ICU, 14 (73.7%) were unvaccinated, 3 (15.8%) had received one dose, 1 (5.3%) had received two doses, 0 (0%) had received 3 doses and 1 (5.3%) had unknown vaccination status.
ConclusionThe risk of severe respiratory disease amongst unvaccinated pregnant women admitted with symptomatic SARS-CoV-2 infection during the Omicron dominance period was comparable to that observed during the period the wildtype variant was dominant. Most women with severe disease were unvaccinated. Vaccine coverage among pregnant women admitted with SARS-CoV-2 was low compared to the overall pregnancy population and very low compared to the general population. Ongoing action to prioritise and advocate for vaccine uptake in pregnancy is essential.
O_TEXTBOXSUMMARY BOX
What is already known on this topic
O_LIIn non-pregnant adults, growing evidence indicates a lower risk of severe respiratory disease with the Omicron SARS-CoV-2 Variant of Concern (VOC).
C_LIO_LIPregnant women admitted during the periods in which the Alpha and Delta VOC were dominant were at increased risk of moderate to severe SARS-CoV-2 infection compared to the period when the original wildtype infection was dominant.
C_LIO_LIMost women admitted to hospital with symptomatic SARS-CoV-2 infection have been unvaccinated.
C_LI
What this study adds
O_LIOne in four women who had received no vaccine or a single dose had moderate to severe infection, compared with one in eight women who had received two doses and no women who had received three doses
C_LIO_LIThe proportional rate of moderate to severe infection in unvaccinated pregnant women during the Omicron dominance period is similar to the rate observed during the wildtype dominance period
C_LIO_LIOne in eight symptomatic admitted pregnant women needed respiratory support during the period when Omicron was dominant
C_LI
C_TEXTBOX | epidemiology |
10.1101/2022.03.07.22271752 | Low cost and real-time surveillance of enteric infection and diarrhoeal disease using rapid diagnostic tests: A pilot study | BackgroundReal-time disease surveillance is an important component of infection control in at-risk populations. However, data on cases or from lab testing is often not available in many low-resource settings. Rapid diagnostic tests (RDT), including immunochromatographic assays, may provide a low cost, expedited source of infection data.
MethodsWe conducted a pilot survey-based prevalence mapping study of enteric infection in Camp 24 of the camps for the forcibly displaced Rohingya population from Myanmar in Cox's Bazar, Bangladesh. We randomly sampled the population and collected and tested stool from under-fives for eight pathogens using RDTs in January-March 2021 and September-October 2021. A Bayesian geospatial statistical model allowing for imperfect sensitivity and specificity of the tests was adapted.
ResultsWe collected and tested 396 and 181 stools in the two data collection rounds. Corrected prevalence estimates ranged from 0.5% (Norovirus) to 27.4% (Giardia). Prevalence of E.coli O157, Campylobacter, and Cryptosporidium were predicted to be higher in the high density area of the camp with relatively high probability (70-95%), while Adenovirus, Norovirus, and Rotavirus were lower in the areas with high water chlorination. Clustering of cases of Giardia and Shigella was also observed, although associated with relatively high uncertainty.
ConclusionsWith an appropriate correction for diagnostic performance RDTs can be used to generate reliable prevalence estimates, maps, and well-calibrated uncertainty estimates at a significantly lower cost than lab-based studies, providing a useful approach for disease surveillance in these settings. | epidemiology |
10.1101/2022.03.07.22272032 | Changes in outpatient care patterns and subsequent outcomes during the COVID-19 pandemic: A retrospective cohort analysis from a single payer healthcare system | BackgroundThere have been rapid shifts in outpatient care models during the COVID-19 pandemic but the impact of these changes on patient outcomes are uncertain. We designed this study to examine ambulatory outpatient visit patterns and outcomes between March 1, 2019 to February 29, 2020 (pre-pandemic) and from March 1, 2020 to February 28, 2021 (pandemic).
MethodsWe conducted a population-based retrospective cohort study of all 3.8 million adults in the Canadian province of Alberta, which has a single payer healthcare system, using linked administrative data. We examined all outpatient physician encounters (virtual or in-person) and outcomes (emergency department visits, hospitalizations, or deaths) in the next 30- and 90-days.
ResultsAlthough in-person outpatient visits declined by 38.9% in the year after March 1, 2020 (10,142,184 vs. 16,592,599), the increase in virtual visits (7,152,147; 41.4% of total) meant that total outpatient encounters increased by 4.1% in the first year of the pandemic. Outpatient care and prescribing patterns remained stable for adults with ambulatory-care sensitive conditions (ACSC): 97.2% saw a primary care physician (median 6 visits), 59.0% had at least one specialist visit, and 98.5% were prescribed medications (median 9) in the year prior to the pandemic compared to 96.6% (median 3 in-person and 2 virtual visits), 62.6%, and 98.6% (median 8 medications) during the first year of the pandemic. In the first year of the pandemic, virtual outpatient visits were associated with fewer subsequent healthcare encounters than in-person ambulatory visits, particularly for patients with ACSC (9.2% vs. 10.4%, aOR 0.89 [95% confidence interval 0.87-0.92] at 30 days and 26.9% vs. 29.3%, aOR 0.93 [0.92-0.95] at 90 days).
ConclusionsThe shifts in outpatient care patterns caused by the COVID-19 pandemic did not disrupt prescribing or follow-up for patients with ACSC and did not worsen post-visit outcomes.
FundingNone
RegistrationNone
KEY MESSAGESO_ST_ABSWhat is already known on this topicC_ST_ABSThere have been rapid shifts in outpatient care models during the COVID-19 pandemic but outcomes are uncertain.
What this study addsTotal outpatient encounters increased by 4% in the first year of the pandemic due to a rapid increase in virtual visits (which made up 41% of all outpatient encounters). Prescribing patterns and frequency of follow-up were similar in the first year after onset of the pandemic in adults with ambulatory-care sensitive conditions. Compared to in-person visits, virtual outpatient visits were associated with fewer subsequent healthcare encounters, particularly for patients with ambulatory-care sensitive conditions (11% fewer at 30 days and 7% fewer at 90 days).
How this study might affect research, practice or policyOur data provides reassurance that the shifts in outpatient care patterns caused by the COVID-19 pandemic did not negatively impact follow-up, prescribing, or outcomes for patients with ACSC. Further research is needed to define which patients and which conditions are most suitable for virtual outpatient visits and, as with all outpatient care, the optimal frequency of such visits. | primary care research |
10.1101/2022.03.07.22271597 | Appropriateness of the current parasitological control target for hookworm morbidity: a statistical analysis of individual-level data | BackgroundSoil-transmitted helminths affect almost 2 billion people globally. Hookworm species contribute to most of the related morbidity. Hookworms mainly cause anaemia, due to blood loss at the site of the attachment of the adult worms to the human intestinal mucosa. The World Health Organization (WHO) aims to eliminate hookworm morbidity by 2030 through achieving a prevalence of moderate and heavy intensity (M&HI) infections below 2%. In this paper, we aim to assess the suitability of this threshold to reflect hookworm-attributable morbidity.
Methodology/Principal FindingsWe developed a hierarchical statistical model to simulate individual haemoglobin concentrations in association with hookworm burdens, accounting for low haemoglobin values attributable to other causes. The model was fitted to individual-level data within a Bayesian framework. Then, we generated different endemicity settings corresponding to infection prevalence ranging from 10% to 90% (0% to 55% M&HI prevalence), using 1, 2 or 4 Kato-Katz slides. For each scenario, we estimated the prevalence of anaemia due to hookworm. Our results showed that on average, haemoglobin falls below the WHO threshold for anaemia when intensities are above 2000 eggs per gram of faeces. For the different simulated scenarios, the estimated prevalence of anaemia attributable to hookworm ranges from 0% to 30% (95%-PI: 24% - 36%), mainly associated with the prevalence of M&HI infections. Simulations show that a 2% prevalence of M&HI infections in adults corresponds to a prevalence of hookworm-attributable anaemia lower than 1%.
Conclusions/SignificanceOur results support the use of the current WHO thresholds of 2% prevalence of M&HI as a proxy for hookworm morbidity. A single Kato-Katz slide may be sufficient to assess the achievement of the morbidity target. Further studies are needed to elucidate haemoglobin dynamics pre- and post-control, ideally using longitudinal data in adults and children.
Author summarySoil-transmitted helminths affect almost 2 billion people globally. About 65% of the related morbidity is attributable to hookworm infection, especially anaemia, which is caused by adult worms attaching to the intestinal mucosa of the human host to feed on blood. The World Health Organization (WHO) defines the target for the elimination of morbidity as reaching a prevalence of moderate-to-heavy intensity infections < 2%. In this paper, we aim to assess the suitability of the current WHO threshold to reflect hookworm-attributable anaemia. To this end, we developed a statistical model to simulate individual haemoglobin concentrations in association with hookworm burdens in adults, accounting for low haemoglobin values attributable to other causes, such as malnutrition and malaria. We used model outcomes to estimate the prevalence of hookworm-attributable anaemia in different endemicity scenarios. Our predictions suggested that anaemia is mainly associated with moderate-to-heavy intensity infection and that the 2% threshold corresponds to a prevalence of hookworm-attributable anaemia <1%. These results support the use of the current WHO threshold as a proxy for hookworm morbidity. Our predictions represent a first step towards better quantifying the prevalence of anaemia due to hookworm infection. | public and global health |
10.1101/2022.03.08.22271465 | Spectral imaging enables contrast agent-free real-time ischemia monitoring in laparoscopic surgery | Laparoscopic surgery has evolved as a key technique for cancer diagnosis and therapy. While characterization of the tissue perfusion is crucial in various procedures, such as partial nephrectomy, doing so by means of visual inspection remains highly challenging. Spectral imaging takes advantage of the fact that different tissue components have unique optical properties to recover relevant information on tissue function such as ischemia. However, clinical success stories for advancing laparoscopic surgery with spectral imaging are lacking to date. To address this bottleneck, we developed the first laparoscopic real-time multispectral imaging (MSI) system featuring a compact and lightweight multispectral camera and the possibility to complement the conventional RGB (Red, Green, and Blue) surgical view of the patient with functional information at a video rate of 25 Hz. To account for the high inter-patient variability of human tissue, we phrase the problem of ischemia detection as an out-of-distribution (OoD) detection problem that does not rely on data from any other patient. Using an ensemble of invertible neural networks (INNs) as a core component, our algorithm computes the likelihood of ischemia based on a short (several seconds) video sequence acquired at the beginning of each surgery. A first-in-human trial performed on 10 patients undergoing partial nephrectomy demonstrates the feasibility of our approach for fully-automatic live ischemia monitoring during laparoscopic surgery. Compared to the clinical state-of-the-art approach based on indocyanine green (ICG) fluorescence, the proposed MSI-based method does not require the injection of a contrast agent and is repeatable if the wrong segment has been clamped. 
Spectral imaging combined with advanced deep learning-based analysis tools could thus evolve as an important tool for fast, efficient, reliable and safe functional imaging in minimally invasive surgery. | radiology and imaging |
10.1101/2022.03.07.22271661 | Conceptualization, operationalization, and utilization of race and ethnicity in major medical journals 1995-2018: a systematic review | BackgroundSystemic racial and ethnic inequities continue to be perpetuated through scientific methodology and communication norms despite efforts by medical institutions. We characterized methodological practices regarding race and ethnicity in U.S. research published in leading medical journals.
MethodsWe systematically reviewed randomly selected articles from prominent medical journals: Annals of Internal Medicine, BMJ, JAMA, The Lancet, and NEJM within five periods: 1995-99, 2000-04, 2005-09, 2010-14, 2015-18. Original human-subjects research conducted in the U.S. was eligible for inclusion. We extracted information on definitions (conceptualization), measurement/coding (operationalization), use in analysis (utilization), and justifications. We reviewed 1050 articles, including 242 (23%) in analyses.
FindingsThe proportion of U.S. medical research studies including race and/or ethnicity data increased between 1995 and 2018. However, no studies defined race or ethnicity. Studies rarely delineated between race and ethnicity, frequently opting for a combined "ethno-racial" construct. In addition, most studies did not state how race and/or ethnicity was measured. Common coding schemes included: "Black, other, White," "Hispanic, Non-Hispanic," and "Black, Hispanic, other, White." Race and/or ethnicity was most often used as a control variable, descriptive covariate, or matching criteria. Under 30% of studies included a justification for their methodological choices regarding race and/or ethnicity.
InterpretationDespite regular efforts by medical journals to implement new policies around race and ethnicity in medical research, pertinent information around methodology was systematically absent from the majority of reviewed literature. This stymies critical disciplinary reflection and progress towards equitable practice.
FundingFunding was provided through training grants from the Eunice Kennedy Shriver National Institute of Child Health and Human Development [T32 HD091058] and the Department of Sociology, UNC Chapel Hill. Carolina Population Center provided general support [P2C HD050924, P30 AG066615]. NRS received additional support from the National Cancer Institute [T32 CA057711]. | medical education |
10.1101/2022.03.07.22271995 | Strongyloides stercoralis infection induces gut dysbiosis in chronic kidney disease patients | BackgroundStrongyloides stercoralis infection typically causes severe symptoms in immunocompromised patients. However, the progression of infection-driven chronic kidney disease (CKD) is not understood fully. Recent studies have shown that gut dysbiosis plays an important role in the progression of CKD. Hence, this study aims to investigate the effect of S. stercoralis infection on the gut microbiome in CKD patients.
Methodology/Principal FindingsAmong 838 volunteers from Khon Kaen Province, northeastern Thailand, 40 subjects with CKD were enrolled and divided into two groups (S. stercoralis-infected and -uninfected) matched for age, sex and biochemical parameters. Next-generation technology was used to amplify and sequence the V3-V4 region of the 16S rRNA gene to provide a profile of the gut microbiota. Results revealed that members of the S. stercoralis-infected group had lower gut microbial diversity than was seen in the uninfected group. Interestingly, there was significantly greater representation of some pathogenic bacteria in the S. stercoralis-infected CKD group, including Escherichia-Shigella (P = 0.013), Rothia (P = 0.013) and Aggregatibacter (P = 0.03). There was also a trend towards increased Actinomyces, Streptococcus and Haemophilus (P > 0.05) in this group. On the other hand, the S. stercoralis-infected CKD group had significantly lower representation of SCFA-producing bacteria such as Anaerostipes (P = 0.01), Coprococcus_1 (P = 0.043) and a non-significant decrease of Akkermansia, Eubacterium rectale and Eubacterium hallii (P > 0.05) relative to the uninfected group. Interestingly, the genera Escherichia-Shigella and Anaerostipes exhibited opposing trends, which were significantly related to sex, age, infection status and CKD stages. The genus Escherichia-Shigella was significantly more abundant in CKD patients over the age of 65 years and infected with S. stercoralis. A correlation analysis showed a moderate inverse correlation between the abundance of the genus Escherichia-Shigella and the level of estimated glomerular filtration rate (eGFR).
Conclusions/SignificanceIn conclusion, the results suggest that S. stercoralis infection induced gut dysbiosis in the CKD patients, which might be involved in CKD progression.
Author summaryHuman strongyloidiasis is caused by a soil-transmitted helminth, Strongyloides stercoralis, which typically causes severe symptoms in immunocompromised individuals. However, the relationship between S. stercoralis infection and chronic kidney disease (CKD) progression was not known. This is the first study to investigate the gut microbiota of CKD patients with and without S. stercoralis using high-throughput sequencing of the V3-V4 region of the 16S rRNA gene. Infection with S. stercoralis was associated with lower gut microbiota diversity than in the uninfected group. In addition, infection with this nematode led to reduced abundance of SCFA-producing bacteria and enrichment of pathogenic bacteria. In particular, there were significant differences in abundance of the beneficial genus Anaerostipes (a decrease) and the pathogenic taxon Escherichia-Shigella (an increase) in CKD patients infected with S. stercoralis relative to controls. In the infected group, the representation of the genus Escherichia-Shigella was significantly higher in patients over the age of 65 years. There was a significant moderate inverse correlation of Escherichia-Shigella abundance with the estimated glomerular filtration rate (eGFR). | nephrology |
10.1101/2022.03.08.22271043 | Genome-wide meta-analysis for Alzheimer's disease cerebrospinal fluid biomarkers | Amyloid-beta 42 (A{beta}42) and phosphorylated tau (pTau) levels in cerebrospinal fluid (CSF) reflect core features of the pathogenesis of Alzheimer's disease (AD) more directly than clinical diagnosis. Initiated by the European Alzheimer & Dementia Biobank (EADB), the largest collaborative effort on genetics underlying CSF biomarkers was established, including 31 cohorts with a total of 13,116 individuals (discovery n = 8,074; replication n = 5,042 individuals). Besides the APOE locus, novel associations with two other well-established AD risk loci were observed; CR1 was shown to be a locus for amyloid beta 42 (A{beta}42) and BIN1 for phosphorylated tau (pTau). GMNC and C16orf95 were further identified as loci for pTau, of which the latter is novel. Clustering methods exploring the influence of all known AD risk loci on the CSF protein levels revealed 4 biological categories (amyloid, astrocyte, processing & migration, and migration & motility), suggesting multiple A{beta}42 and pTau related biological pathways involved in the etiology of AD. In functional follow-up analyses, GMNC and C16orf95 were both associated with lateral ventricular volume, implying an overlap in genetic etiology for tau levels and brain ventricular volume. | neurology |
10.1101/2022.03.07.22270923 | The Sequence Effect Worsens over Time in Parkinson's disease and Responds to Open and Closed-Loop Subthalamic Nucleus Deep Brain Stimulation | BackgroundThe sequence effect is the progressive deterioration in speech, limb movement, and gait that leads to an inability to communicate, manipulate objects or walk without freezing of gait. Many studies have demonstrated a lack of improvement of the sequence effect from dopaminergic medication; however, few studies have examined the metric over time or investigated the effect of open and closed-loop deep brain stimulation in people with PD.
ObjectiveTo investigate whether the sequence effect worsens over time and/or is improved on clinical (open-loop) and closed-loop deep brain stimulation (DBS).
MethodsTwenty-one people with PD with bilateral STN DBS performed thirty seconds of instrumented repetitive wrist flexion extension and the MDS-UPDRS III off therapy, prior to activation of DBS and every six months for up to three years. A sub-cohort of ten people performed the task during randomized presentations of different intensities of STN DBS.
ResultsThe sequence effect was highly correlated with the overall MDS-UPDRS III score and the bradykinesia sub-score and worsened over three years. Increasing intensities of STN open-loop DBS improved the sequence effect and one subject demonstrated improvement on both open-loop and even further improvement on closed-loop DBS.
ConclusionsSequence effect in limb bradykinesia worsened over time off therapy due to disease progression but improved on open and closed-loop DBS. These results demonstrate that DBS is a useful treatment of the debilitating effects of the sequence effect in limb bradykinesia and that closed-loop DBS may offer added improvement. | neurology |
10.1101/2022.03.08.22271868 | The influence of genetic predisposition and physical activity on risk of Gestational Diabetes Mellitus in the nuMoM2b cohort | ImportancePolygenic risk scores (PRS) for Type II Diabetes Mellitus (T2DM) can improve risk prediction for Gestational Diabetes Mellitus (GDM), yet the strength of the relationship between genetic and lifestyle risk factors has not been quantified.
ObjectiveTo assess the effects of PRS and physical activity on existing GDM risk models and identify patient subgroups who may receive the most benefits from receiving a PRS or activity intervention.
Design, Settings, and ParticipantsThe Nulliparous Pregnancy Outcomes Study: Monitoring Mothers-to-Be (nuMoM2b) study was established to study individuals without previous pregnancy lasting 20 weeks or more (nulliparous) and to elucidate factors associated with adverse pregnancy outcomes. A sub-cohort of 3,533 participants with European ancestry were used for risk assessment and performance evaluation.
ExposuresSelf-reported total physical activity in early pregnancy was quantified as metabolic equivalent of tasks (METs) in hours/week. Polygenic risk scores were calculated for T2DM using contributions of 85 single nucleotide variants, weighted by their association in the DIAbetes Genetics Replication And Meta-analysis (DIAGRAM) Consortium data.
Main Outcomes and MeasuresPrediction of the development of GDM from clinical, genetic, and environmental variables collected in early pregnancy. The risk model is assessed using measures of model discrimination and calibration. Odds ratio and positive likelihood ratio were used for evaluating the effect of PRS and physical activity on GDM risk.
ResultsIn high-risk population subgroups (body mass index [≥] 25 or age [≥] 35), individuals with PRS in the top 25th percentile or METs below 450 have significantly increased odds of GDM diagnosis. Participants with both high PRS and low METs have three times higher odds of GDM diagnosis than the population. Conversely, participants with high PRS and METs [≥] 450 do not exhibit increased odds of GDM diagnosis, and those with low METs and low PRS have reduced odds of GDM. The relationship between PRS and METs was found to be nonadditive.
Conclusions and RelevanceIn high-risk patient subgroups the addition of PRS resulted in increased risk of GDM diagnosis, suggesting the benefits of targeted PRS ascertainment to encourage early intervention. Increased physical activity is associated with decreased risk of GDM, particularly among individuals genetically predisposed to T2DM.
Key Points Question: Do genetic predisposition to diabetes and physical activity in early pregnancy cooperatively impact risk of Gestational Diabetes Mellitus (GDM) among nulliparas?
FindingsRisk of GDM diagnosis increases significantly for nulliparas with high polygenic risk score (PRS) and with low physical activity. The odds ratio of developing GDM with high PRS was estimated to be 2.2, 1.6 with low physical activity, and 3.5 in combination.
MeaningPhysical activity in early pregnancy is associated with reduced risk of GDM and reversal of excess risk in genetically predisposed individuals. The interaction between PRS and physical activity may identify subjects for targeted interventions. | obstetrics and gynecology |
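The PRS described in the record above is a weighted sum over 85 variants, with weights taken from DIAGRAM association estimates. A minimal sketch of that construction follows; the variant IDs, weights, and dosages below are hypothetical illustrations, not the actual DIAGRAM values.

```python
# Sketch of a weighted polygenic risk score: PRS = sum_i (w_i * g_i),
# where g_i is the risk-allele dosage (0, 1, or 2) for variant i and
# w_i is its effect weight (e.g., a log odds ratio from a meta-analysis).
# Variant IDs and weights here are hypothetical, not DIAGRAM values.

def polygenic_risk_score(dosages, weights):
    """Weighted sum of risk-allele dosages across variants."""
    assert dosages.keys() == weights.keys()
    return sum(weights[v] * dosages[v] for v in dosages)

weights = {"rs0001": 0.12, "rs0002": 0.08, "rs0003": 0.05}  # hypothetical
dosages = {"rs0001": 2, "rs0002": 1, "rs0003": 0}           # one individual

prs = polygenic_risk_score(dosages, weights)
```

Individuals are then typically ranked by this score, e.g. flagging those above the cohort's 75th percentile, as in the study's "top 25th percentile" subgroup.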
10.1101/2022.03.07.22271524 | Patient-specific forecasting of post-radiotherapy prostate-specific antigen kinetics enables early prediction of biochemical relapse | The detection of prostate cancer recurrence after external beam radiotherapy relies on the measurement of a sustained rise of serum prostate-specific antigen (PSA). However, this biochemical relapse may take years to occur, thereby delaying the delivery of a secondary treatment to patients with recurring tumors. To address this issue, here we propose to use patient-specific forecasts of PSA dynamics to early predict biochemical relapse. Our forecasts are based on mechanistic models of prostate cancer response to external beam radiotherapy, which are fit to patient-specific PSA data collected during standard post-treatment monitoring. Our results show a remarkable performance of our models in recapitulating the observed changes in PSA and yielding short-term predictions over approximately one year (cohort median RMSE of 0.10 to 0.47 ng/mL and 0.13 to 1.41 ng/mL, respectively). Additionally, we identify three model-based biomarkers that enable an accurate identification of biochemical relapse (AUC > 0.80) significantly earlier than standard practice (p < 0.01). | oncology |
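The fit and prediction quality reported in the PSA record above is a root-mean-squared error in ng/mL between observed and model PSA values. A minimal sketch of that metric, using made-up PSA series rather than patient data:

```python
import math

def rmse(observed, predicted):
    """Root-mean-squared error between paired observations and predictions."""
    assert len(observed) == len(predicted)
    return math.sqrt(
        sum((o - p) ** 2 for o, p in zip(observed, predicted)) / len(observed)
    )

# Hypothetical PSA values (ng/mL) at successive monitoring visits:
observed  = [1.2, 0.9, 0.7, 0.8]
predicted = [1.1, 1.0, 0.7, 0.6]
err = rmse(observed, predicted)
```

A cohort-level summary like the one quoted (median RMSE) would apply this per patient and take the median across patients.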
10.1101/2022.03.08.22270800 | Sequential epiretinal stimulation improves discrimination in simple shape discrimination tasks only | ObjectiveElectrical stimulation of the retina can elicit flashes of light called phosphenes, which can be used to restore rudimentary vision for people with blindness. Functional sight requires stimulation of multiple electrodes to create patterned vision, but phosphenes tend to merge together in an uninterpretable way. Sequentially stimulating electrodes in human visual cortex has recently demonstrated that shapes could be "drawn" with better perceptual resolution relative to simultaneous stimulation. The goal of this study was to evaluate if sequential stimulation would also form clearer shapes when the retina is the neural target.
ApproachTwo human participants with retinitis pigmentosa who had Argus® II retinal prostheses participated in this study. We evaluated different temporal parameters for sequential stimulation in phosphene shape mapping and forced-choice discrimination tasks. For the discrimination tasks, performance was compared between stimulating electrodes simultaneously versus sequentially.
Main resultsPhosphenes elicited by different electrodes were reported as vastly different shapes. Sequential electrode stimulation outperformed simultaneous stimulation in simple discrimination tasks, in which shapes were created by stimulating 3-4 electrodes, but not in more complex discrimination tasks involving 5+ electrodes. For sequential stimulation, the optimal pulse train duration was 200 ms when stimulating at 20 Hz and the optimal gap interval was tied between 0 and 50 ms. Efficacy of sequential stimulation also depended strongly on selecting electrodes that elicited phosphenes with similar shapes and sizes.
SignificanceAn epiretinal prosthesis can produce coherent simple shapes with a sequential stimulation paradigm, which can be used as rudimentary visual feedback. However, success in creating more complex shapes, such as letters of the alphabet, is still limited. Sequential stimulation may be most beneficial for epiretinal prostheses in simple tasks, such as basic navigation, rather than complex tasks such as object identification. | ophthalmology |
10.1101/2022.03.07.22272050 | Virtual Assessment of Patients with Dry Eye Disease During the COVID-19 Pandemic: One clinician's experience | ObjectivesTo report on 1) the impact of DED on social, mental, and financial well-being, and 2) the use of virtual consultations to assess DED during the COVID-19 pandemic.
Design & MethodsAn exploratory retrospective review of 35 charts. Telephone consultations for patients with DED conducted during the first lock-down period in Ontario in 2020 were reviewed.
ResultsThe most commonly reported DED symptoms were ocular dryness, visual disturbances, and burning sensation. The most common dry eye management practices were artificial tears, warm compresses, and omega-3 supplements. 20.0% of charts documented worsening of DED symptoms since the onset of the pandemic and 17.1% reported the lockdown had negatively affected their ability to perform DED management practices. 42.8% of patients reported an inability to enjoy their daily activities due to DED symptoms. 52.0% reported feeling either depressed, anxious, or both with 26.9% of patients accepting a referral to a social worker for counselling support. More than a quarter of the charts recorded financial challenges associated with the cost of therapy, and more than a fifth of patients reported that financial challenges were a direct barrier to accessing therapy.
ConclusionsPatients living with DED reported that their symptoms negatively affected their daily activities including mental health and financial challenges, that in turn impacted treatment practices. These challenges may have been exacerbated during the COVID-19 pandemic. Telephone consultations may be an effective modality to assess DED symptom severity, the impact of symptoms on daily functioning, and the need for counselling and support.
AUTHOR SUMMARYDry Eye Disease occurs when your tears do not provide enough lubrication for your eyes, which can be caused by either decreased tear production, or by poor quality tears. This study reviewed 35 patient charts to examine 1) the impact of Dry Eye Disease on patients' well-being, and 2) the use of telephone appointments to assess Dry Eye Disease during the COVID-19 pandemic. Patients reported an inability to enjoy their daily activities due to symptoms of dry eye including burning sensation and blurred vision. Over half of patients reported mental health challenges. Over a quarter of patients reported that financial challenges prevented them from treating their Dry Eye Disease, such as affording eye drops, dietary supplements, and appointments to see their optometrist. These findings highlight that healthcare providers should consider quality of life, mental health, and financial challenges when treating patients with Dry Eye Disease. Through the experience of an ophthalmologist who specializes in Dry Eye Disease, telephone appointments may be an effective way to assess Dry Eye Disease symptoms, the impact of symptoms on daily functioning, and the need for counselling and support. | ophthalmology |
10.1101/2022.03.08.22272084 | Penetrance of HFE haemochromatosis variants to clinical disease: polygenic risk score associations in UK Biobank | BackgroundThe iron overload condition Hereditary Haemochromatosis (HH) can cause liver cirrhosis and cancer, diabetes and arthritis. In Europeans, most HH disease occurs in male HFE p.C282Y homozygotes, yet only a minority of homozygotes in the general population develop these conditions. We aimed to determine whether common genetic variants influencing iron levels or risks for liver, diabetes or arthritis diagnoses in the general population also modify clinical penetrance in HFE p.C282Y and p.H63D carriers.
Methods1,294 male and 1,596 female UK Biobank European-ancestry HFE p.C282Y homozygous participants with electronic medical records up to 14 years after baseline assessment were studied. Polygenic risk scores (PRS) quantified genetic effects on blood iron biomarkers and relevant diseases (identified in the general population). Analyses were repeated in 10,699 p.C282Y/p.H63D compound heterozygotes.
ResultsIn male p.C282Y homozygotes, higher iron PRS increased risk of liver fibrosis or cirrhosis diagnoses (top 20% of iron PRS had Odds Ratio 4.90: 95% Confidence Intervals 1.63 to 14.73, p=0.005 versus bottom 20%), liver cancer, and osteoarthritis, but not diabetes. The liver cirrhosis PRS also associated with increased liver cancer diagnoses, and greater type-2 diabetes PRS increased risk of type-2 diabetes. In female p.C282Y homozygotes, osteoarthritis PRS was associated with increased osteoarthritis diagnoses, and type-2 diabetes PRS with type-2 diabetes. However, the iron PRS was not robustly associated with diagnoses in p.C282Y homozygote females, or in other p.C282Y/p.H63D genotypes.
ConclusionsHFE p.C282Y homozygote penetrance to clinical disease in a large community cohort was partly explained by common genetic variants that influence iron and risks of related diagnoses in the general population. Including PRS in HH screening and diagnosis may help in estimating prognosis and treatment planning.
Lay Summary
- Hereditary Haemochromatosis, an iron overload condition, is the most common genetic disease in Northern Europeans; 1 in 150 people carry two copies of the highest risk mutation (called HFE p.C282Y).
- Only a minority of those with high risk HH variants actually develop iron overload diseases, such as liver cancer, cirrhosis, diabetes and arthritis. We tested whether known genetic variants with smaller effects on iron levels in the whole population modify risk of iron overload related disease in those with the HH high risk variants. We did similar tests for variants linked to each of the diseases caused by iron overload.
- Increased genetic risk for higher iron significantly raised the likelihood of liver fibrosis, cirrhosis and cancer in mutation carriers. In the future this information could help identify at-risk haemochromatosis patients early. | genetic and genomic medicine |
10.1101/2022.03.07.22270699 | Adequacy of Self-Collected Anterior Nares Swabs for SARS-CoV-2 Testing by Grade School Children | BackgroundThe goal of this study was to characterize the ability of school-aged children to self-collect adequate anterior nares (AN) swabs for severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) testing.
MethodsFrom July to August 2021, 287 children, age 4-14 years-old, were prospectively enrolled in the Atlanta area. Symptomatic (n=197) and asymptomatic (n=90) children watched a short instructional video before providing a self-collected AN specimen. Health care workers (HCWs) then collected a second specimen, and useability was assessed by the child and HCW. Swabs were tested side-by-side for SARS-CoV-2. RNase P RNA detection was investigated as a measure of specimen adequacy.
ResultsAmong symptomatic children, 87/196 (44.4%) tested positive for SARS-CoV-2 by both self- and HCW-swab. Two children each were positive by self- or HCW-swab; one child had an invalid HCW-swab. Compared to HCW-swabs, self-collected swabs had 97.8% and 98.1% positive and negative percent agreements, respectively, and SARS-CoV-2 Ct values did not differ significantly between groups. Participants ≤8 years-old were less likely than those >8 to be rated as correctly completing self-collection, but SARS-CoV-2 detection did not differ. Based on RNase P RNA detection, 270/287 children (94.1%) provided adequate self-swabs versus 277/287 (96.5%) HCW-swabs (p=0.24) with no difference when stratified by age.
ConclusionsChildren, aged 4-14 years-old, can provide adequate AN specimens for SARS-CoV-2 detection when presented with age-appropriate instructional material, consisting of a video and a handout, at a single timepoint. These data support the use of self-collected AN swabs among school-age children for SARS-CoV-2 testing. | infectious diseases |
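The positive and negative percent agreement figures in the record above compare self-collected swab results against the HCW-collected swab as the comparator. A sketch of the computation; the paired results below are invented, not the study's 2x2 table:

```python
def percent_agreement(self_results, hcw_results):
    """Positive/negative percent agreement of self-swabs vs. an HCW comparator.

    PPA = % of HCW-positive pairs where the self-swab was also positive;
    NPA = % of HCW-negative pairs where the self-swab was also negative.
    """
    pairs = list(zip(self_results, hcw_results))
    pos = [s for s, h in pairs if h]       # self results where HCW was positive
    neg = [s for s, h in pairs if not h]   # self results where HCW was negative
    ppa = 100.0 * sum(1 for s in pos if s) / len(pos)
    npa = 100.0 * sum(1 for s in neg if not s) / len(neg)
    return ppa, npa

# Hypothetical paired results (True = SARS-CoV-2 detected):
self_swab = [True, True, True, False, False, False, False, True]
hcw_swab  = [True, True, True, True,  False, False, False, False]
ppa, npa = percent_agreement(self_swab, hcw_swab)
```

Percent agreement is preferred over sensitivity/specificity here because the comparator (the HCW swab) is itself an imperfect reference, not ground truth.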
10.1101/2022.03.07.22272036 | Diagnostic accuracy of age-adjusted D-dimer for pulmonary embolism among Emergency Department patients with suspected SARS-COV-2: A Canadian COVID-19 Emergency Department Rapid Response Network study | ImportanceRuling out pulmonary embolism (PE) among patients presenting to the Emergency Department (ED) with suspected or confirmed SARS-COV-2 is challenging due to symptom overlap, known increased pro-thrombotic risk, and unclear D-dimer test interpretation.
ObjectiveOur primary objective was to assess the diagnostic accuracy of standard and age-adjusted D-dimer test thresholds for predicting 30-day pulmonary embolism (PE) diagnosis in patients with suspected SARS-COV-2 infection.
Design, Setting, and ParticipantsThis was a retrospective observational study using data from 50 sites enrolling patients into the Canadian COVID-19 ED Rapid Response Network (CCEDRRN) registry between March 1, 2020 and July 2, 2021. Adults (≥18 years) with SARS-COV-2 testing performed at index ED visit were included if they had any of the following presenting complaints: chest pain, shortness of breath, hypoxia, syncope/presyncope, or hemoptysis. We excluded patients with duplicate records or no valid provincial healthcare number.
Main Outcomes and MeasuresOur primary end point was 30-day PE diagnosis based on a positive computed tomography pulmonary angiogram (CTPA) or hospital discharge diagnosis code of PE. The outcome measure was the diagnostic accuracy of an age adjusted D-dimer strategy as compared to absolute D-dimer thresholds (500 - 5000 ng/mL).
Results52,038 patients met inclusion criteria. Age-adjusted D-dimer had a sensitivity (SN) of 96% (95% CI 93-98%) and a specificity (SP) of 48% (95% CI 48-49%) which was comparable to the most sensitive absolute threshold of 500 ng/mL (SN 98%, 95% CI 96-99%; SP 41%, 95% CI 40-42%). Other absolute D-dimer thresholds did not perform well enough for clinical reliability (SN <90%). Both age-adjusted and absolute D-dimer performed better in SARS-COV-2 negative patients as compared to SARS-COV-2 positive patients for predicting 30-day PE diagnosis (c-statistic 0.88 vs 0.80).
Conclusions and RelevanceIn this large Canadian cohort of ED patients with suspected SARS-COV-2 infection, an age-adjusted D-dimer strategy had similar sensitivity and superior specificity to the most sensitive D-dimer threshold of 500 ng/mL for predicting 30-day PE diagnosis irrespective of SARS-COV-2 infection status. Adopting an age-adjusted D-dimer strategy in patients with suspected SARS-COV-2 may help avoid unnecessary CTPA testing without compromising safety.
Trial RegistrationClinicaltrials.gov, NCT04702945
KEY POINTS Question: What is the diagnostic accuracy of age-adjusted and absolute D-dimer thresholds for investigating PE in ED patients with suspected SARS-COV-2?
FindingsAn age-adjusted D-dimer strategy had comparable sensitivity and higher specificity for 30-day PE diagnosis compared to the most sensitive absolute threshold of 500 ng/mL irrespective of patients' SARS-COV-2 status.
MeaningConsider using an age-adjusted D-dimer threshold for PE risk stratification in ED patients with suspected SARS-COV-2. | emergency medicine |
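The age-adjusted strategy evaluated in the record above is commonly implemented as a cutoff of age × 10 ng/mL for patients older than 50 years (500 ng/mL otherwise). The record does not state the exact rule used, so the formula below is the widely cited convention, assumed here for illustration:

```python
def age_adjusted_threshold(age_years):
    """Commonly used age-adjusted D-dimer cutoff (ng/mL FEU):
    500 ng/mL up to age 50, then age x 10 ng/mL thereafter.
    This standard rule is an assumption; the registry's exact
    implementation is not stated in the abstract."""
    return 500.0 if age_years <= 50 else age_years * 10.0

def rule_out_pe(d_dimer_ng_ml, age_years):
    """PE considered ruled out when D-dimer falls below the cutoff."""
    return d_dimer_ng_ml < age_adjusted_threshold(age_years)

# A 78-year-old with D-dimer 700 ng/mL falls below the adjusted cutoff (780),
# whereas the same value would exceed the fixed 500 ng/mL threshold:
ok = rule_out_pe(700, 78)
```

Raising the cutoff with age is what trades a small loss of sensitivity for the higher specificity reported in the study, since D-dimer rises with age even without PE.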
10.1101/2022.03.08.22272020 | Is virtual care the new normal? Evidence supporting Covid-19's durable transformation on healthcare delivery | ObjectiveDespite the surge of telemedicine use during the early stages of the coronavirus-19 (COVID-19) pandemic, research has not evaluated the extent to which the growth of telemedicine has been sustained during recurring pandemic waves. This study provides data on the long-term durability of video-based telemedicine visits and their impact on urgent and non-urgent healthcare delivery from one large health system in New York City.
Materials and MethodsElectronic health record (EHR) data of patients between January 1st, 2020 and November 30th, 2021 were used to conduct the analyses and longitudinal comparisons of telemedicine or in-person visit volumes. Patients' diagnosis data were used to differentiate COVID-19 suspected visits from non-COVID-19 ones while comparing the visit types.
ResultsWhile COVID-19 prompted an increase in telemedicine visits and a simultaneous decline in in-person clinic visits, telemedicine use has stabilized since then for both COVID-19 and non-COVID suspected visits. For COVID-19 suspected visits, utilization of virtual urgent care facilities is higher than the trend. The data further suggests that virtual healthcare delivery supplements, rather than replaces, in-person care.
DiscussionThe COVID-19 pandemic has transformed the use of telemedicine as a means of healthcare delivery, and the data presented here suggests that this is an enduring transformation.
ConclusionTelemedicine use increased with the surge of infection cases during the pandemic, but evidence suggests that it will persist after the pandemic, especially for younger patients, for both urgent and non-urgent care. These findings have implications for the healthcare delivery system, insurers and policymakers. | public and global health |
10.1101/2022.03.07.22272051 | Impact of long-term COVID on workers: a systematic review protocol | IntroductionPart of the patients infected by COVID-19 have at least one lasting sequel of the disease and may be framed in the concept of long Covid. These sequelae can compromise the quality of life, increase dependence on other people for personal care, impair the performance of activities of daily living, thus compromising work activities and harming the health of the worker. This protocol aims to critically synthesize the scientific evidence on the effects of Covid-19 among workers and its impact on their health status and professional life.
MethodSearches will be performed in MEDLINE via PubMed, EMBASE, Cochrane Library, Web of Science, Scopus, LILACS and Epistemonikos. Included studies will be those that report the prevalence of long-term signs and symptoms in workers and/or the impact on their health status and work performance, which may be associated with Covid-19 infection. Data extraction will be conducted by 3 reviewers independently. For data synthesis, a results report will be carried out, based on the main outcome of this study.
DiscussionThis review will provide evidence to support health surveillance to help decision makers (i.e. healthcare providers, stakeholders and governments) regarding long-term Covid. | public and global health |
10.1101/2022.03.08.22272091 | A systematic review and meta-analysis of Long COVID symptoms | BackgroundOngoing symptoms or the development of new symptoms following a SARS-CoV-2 diagnosis has caused a complex clinical problem known as ":Long COVID": (LC). This has introduced further pressure on global healthcare systems as there appears to be a need for ongoing clinical management of these patients. LC personifies heterogeneous symptoms at varying frequencies. The most complex symptoms appear to be driven by the neurology and neuropsychiatry spheres.
MethodsA systematic protocol was developed, peer reviewed and published in PROSPERO. The systematic review included publications from the 1st of December 2019-30th June 2021 published in English. Multiple electronic databases were used. The dataset has been analysed using a random-effects model and a subgroup analysis based on geographical location. Prevalence and 95% confidence intervals (CIs) were established based on the data identified.
ResultsOf the 302 studies identified, 49 met the inclusion criteria, of which 36 were included in the meta-analysis. These 36 studies had a collective sample size of 11,598 LC patients. Eighteen of the 36 studies were designed as cohorts and the remainder were cross-sectional. Mental health, gastrointestinal, cardiopulmonary, neurological, and pain symptoms were reported.
ConclusionsA distinguishing strength of this meta-analysis is that the included studies are cohort and cross-sectional studies with follow-up. It is evident that there is limited knowledge available of LC and current clinical management strategies may be suboptimal as a result. Clinical practice improvements will require more comprehensive clinical research, enabling effective evidence-based approaches to better support patients.
FundingNone | infectious diseases |
10.1101/2022.03.08.22272056 | Boosters protect against SARS-CoV-2 infections in young adults during an Omicron-predominant period | BackgroundWhile booster vaccinations clearly reduce the risk of severe COVID-19 and death, the impact of boosters on SARS-CoV-2 infection has not been fully characterized: doing so requires understanding their impact on asymptomatic and mildly symptomatic infections that often go unreported but nevertheless play an important role in spreading SARS-CoV-2. We sought to estimate the impact of COVID-19 booster doses on SARS-CoV-2 infection in a vaccinated and actively surveilled population of young adults during an Omicron-predominant period.
Methods and FindingsWe implemented a cohort study of young adults in a college environment (Cornell University's Ithaca campus) from December 5, 2021 through December 31, 2021, when Omicron was deemed the predominant SARS-CoV-2 variant on campus. Participants included 15,102 university students fully vaccinated with an FDA-authorized or approved vaccination (BNT162b2, mRNA-1273, or Ad26.COV2.S) who were enrolled in mandatory at-least-weekly surveillance PCR testing and had no positive SARS-CoV-2 PCR test within 90 days before the start of the study period. Multivariable logistic regression compared those with full vaccination (a 2-dose series of BNT162b2 or mRNA-1273, or 1 dose of Ad26.COV2.S) plus a booster dose versus those without a booster dose. 1,870 SARS-CoV-2 infections were identified in the study population. Controlling for gender, student group membership, full vaccination date and initial vaccine type, our analysis estimates that receiving a booster dose reduces the odds of having a PCR-detected SARS-CoV-2 infection relative to full vaccination by 52% (95% confidence interval [37%, 64%]). This result is robust to the choice of the delay over which a booster dose becomes effective (varied from 1 day to 14 days).
ConclusionsBoosters are effective, relative to full vaccination, in reducing the odds of SARS-CoV-2 infections in a college student population during a period when Omicron was predominant. Therefore, booster vaccinations for this age group can play an important role in reducing community and institutional transmission. | epidemiology |
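The 52% figure in the record above corresponds to an adjusted odds ratio of about 0.48 for boosted versus non-boosted students, since reduction in odds = 1 - OR. A sketch of that conversion, alongside an unadjusted odds ratio from a 2x2 table; the counts below are hypothetical, not the Cornell data:

```python
def odds_ratio(a, b, c, d):
    """Unadjusted odds ratio from a 2x2 table:
    a = boosted & infected,   b = boosted & not infected,
    c = unboosted & infected, d = unboosted & not infected."""
    return (a / b) / (c / d)

def odds_reduction_pct(or_value):
    """Percent reduction in odds implied by an odds ratio < 1."""
    return 100.0 * (1.0 - or_value)

# Hypothetical counts, not the study data:
or_hat = odds_ratio(60, 4940, 120, 4880)
reduction = odds_reduction_pct(or_hat)
```

The study's estimate is adjusted (gender, student group, vaccination date, vaccine type), so it comes from a multivariable logistic regression coefficient, exp(beta), rather than a raw 2x2 table; the conversion to "% reduction in odds" is the same either way.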
10.1101/2022.03.08.22272087 | Sustained high prevalence of COVID-19 deaths from a systematic post-mortem study in Lusaka, Zambia: one year later | BackgroundSparse data documenting the impact of COVID-19 in Africa have fostered the belief that COVID-19 skipped Africa. We previously published results from a systematic postmortem surveillance at a busy inner-city morgue in Lusaka, Zambia. Between June-October 2020, we detected COVID-19 in 15-19% of all deaths, concentrated in community settings where testing for COVID-19 was absent. Yet these conclusions rested on a small cohort of 70 COVID-19+ decedents. Subsequently, we conducted a longer and far larger follow-on survey using and expanding on the same methodology.
MethodsWe obtained a nasopharyngeal swab from each enrolled decedent and tested these using reverse transcriptase quantitative PCR (RT-qPCR). A subset of samples with a PCR cycle threshold <30 underwent genotyping to identify viral lineages. We weighted our results to adjust for enrolment ratios and stratified them by setting (facility vs. community), time of year, age, and location.
ResultsFrom 1,118 enrolled decedents, COVID-19 was detected among 32.0% (358/1,116). We observed three waves of transmission that peaked in July 2020, January 2021, and ~June 2021 (end of surveillance). These were dominated by the AE.1 lineage and the Beta and Delta variants, respectively. During peak transmission, COVID-19 was detected in ~90% of all deaths. Roughly four COVID-19 deaths occurred in the community for every facility death. Antemortem testing occurred for 52.6% (302/574) of facility deaths but only 1.8% (10/544) of community deaths and overall, only ~10% of COVID-19+ deaths were identified in life.
ConclusionsCOVID-19 had a devastating impact in Lusaka. COVID-19+ deaths occurred in all age groups and was the leading cause of death during peak transmission periods. Testing was rarely done for the vast majority of COVID-19 deaths that occurred in the community, yielding a substantial undercount.
What is already known on this topic
- Previously, we reported that COVID-19 was present among 15-19% of all decedents passing through a busy city morgue in Lusaka.
- Data documenting the mortal impact of COVID-19 in Africa remain sparse.
- Several modeling groups have also argued that COVID-19's impact in Africa has been underreported and hence underestimated.

What this study adds
- Antemortem testing for COVID-19 captured only ~10% of COVID-19 positive individuals, indicating a substantial gap in surveillance.
- During peak transmission periods, ~90% of all deceased individuals tested positive for COVID-19.
- Most COVID-19 positive deceased adults presented with symptoms typical of COVID-19, arguing that COVID-19 caused their deaths and was not a co-incidental finding.
- Deaths occurred across the age spectrum, including among young children, indicating a different pattern of impact from what has been seen in high income country settings.
- We document three waves of transmission, attributable to the AE.1 lineage, and the Beta and Delta variants, respectively. | infectious diseases |
10.1101/2022.03.08.22271889 | Mapping of cis-regulatory variants by differential allelic expression analysis identifies candidate risk variants and target genes of 27 breast cancer risk loci | Background: Breast cancer (BC) genome-wide association studies (GWAS) have identified hundreds of risk-loci that require novel approaches to reveal the causal variants and target genes within them. As causal variants are most likely regulators of gene expression, we hypothesize that their identification is facilitated by pinpointing the variants with greater regulatory potential within risk-loci.
Methods: We performed genome-wide differential allelic expression (DAE) analysis using microarray data from 64 normal breast tissue samples. Then, we mapped the variants associated with DAE (daeQTLs) and intersected these with GWAS data to reveal candidate risk regulatory variants. Finally, we validated our approach by functionally analysing the 5q14.1 breast cancer risk-locus.
Results: We found widespread gene expression regulation by cis-acting variants in breast tissue, with 80% of coding and non-coding expressed genes displaying DAE (daeGenes). We identified over 23K daeQTLs for 2753 (16%) daeGenes, including at 154 known BC risk-loci. In 31 of these risk-loci, we found risk-associated variant(s) and daeQTLs in strong linkage disequilibrium, suggesting that the risk-causing variants are cis-regulatory, and in 27 risk-loci we propose 37 candidate target genes. As validation, we identified five candidate causal variants at the 5q14.1 risk-locus targeting the ATG10, RPS23, and ATP6AP1L genes, likely via modulation of miRNA binding, alternative transcription, and transcription factor binding.
Conclusion: Our study shows the power of DAE analysis and daeQTL mapping to identify causal regulatory variants and target genes at BC risk loci, including those with complex regulatory landscapes, and provides a genome-wide resource of variants associated with DAE for future functional studies. | genetic and genomic medicine
10.1101/2022.03.07.22270992 | Discrimination of sleep and wake periods from a hip-worn raw acceleration sensor using recurrent neural networks | The use of accelerometers has become an established method in population research. Accelerometers are small body-worn sensors that can monitor movement patterns and cycles over several days under free-living conditions. A key requirement for any accelerometer-based analysis is the reliable discrimination of sleep and wake episodes. However, many studies using hip-worn sensors either instruct the participants to remove the sensor overnight or use algorithms developed for wrist placement, which are known to perform poorly on hip data. Here we present a new algorithm to differentiate sleep from awake time in raw hip acceleration data using a machine learning approach. Validated on sleep estimates from a subset of the Tromsø Study, the proposed algorithm outperformed the standard algorithms by discriminating sleep and wake correctly 93.81% (95% CI: 0.72) of the time (F1 score = 0.93, 95% CI: 0.01). The value of our algorithm lies in the improved recognition of sleep and movement in large studies where manual scoring is unfeasible due to the amount of data. Even though the algorithm was developed to learn characteristic hip movements, it can easily be adapted for other sensor placements or study populations. | health informatics
10.1101/2022.03.07.22271469 | Fed-GLMM: A Privacy-Preserving and Computation-Efficient Federated Algorithm for Generalized Linear Mixed Models to Analyze Correlated Electronic Health Records Data | Large collaborative research networks provide opportunities to jointly analyze multicenter electronic health record (EHR) data, which can increase the sample size, diversity of the study population, and generalizability of the results. However, there are challenges to analyzing multicenter EHR data, including privacy protection, large-scale computation, heterogeneity across sites, and correlated observations. In this paper, we propose a federated algorithm for generalized linear mixed models (Fed-GLMM), which can flexibly model multicenter longitudinal or correlated data while accounting for site-level heterogeneity. Fed-GLMM can be applied to both federated and centralized research networks to enable privacy-preserving data integration and improve computational efficiency. By communicating only a limited amount of summary statistics, Fed-GLMM can achieve nearly identical results as the gold-standard method where the GLMM is directly fitted on the pooled dataset. We demonstrate the performance of Fed-GLMM in both numerical experiments and an application to longitudinal EHR data from multiple healthcare facilities. | health informatics
10.1101/2022.03.08.22272057 | Physical, psychological and cognitive profile of post-COVID condition in healthcare workers, Quebec, Canada | Importance: Most adults with COVID-19 do not require hospitalization, but the subsequent risk of post-COVID condition, including associated psychological and cognitive dysfunction, remains poorly understood among non-hospitalized versus hospitalized cases.
Objective: To assess the prevalence and duration of post-COVID condition, including physical, psychological and cognitive symptoms.
Design: Case series and case-control study conducted between December 2020 and May 2021.
Setting: Healthcare workers in Quebec, Canada.
Participants: Eligible cases were symptomatic healthcare workers with PCR-confirmed COVID-19 between July 2020 and May 2021. Among 17,717 contacted cases, 6061 (34%) participated. A random sample of symptomatic healthcare workers with a negative PCR result between November 2020 and May 2021 served as controls. Among 11,498 contacted controls, 4390 (38%) participated.
Exposures: In multivariable models, sociodemographic and clinical characteristics, as well as vaccine history, were evaluated as potential risk factors. Prevalence ratios compared self-reported cognitive dysfunctions (difficulty concentrating; difficulty organizing oneself; forgetfulness; loss of necessary items) among cases with post-COVID condition to controls, adjusting for psychological distress and fatigue.
Outcomes: Post-COVID condition was defined by symptoms persisting ≥4 weeks or ≥12 weeks after COVID-19 onset.
Results: Four-week and 12-week post-COVID condition prevalences of 46% (2,746/5,943) and 40% (653/1,746), respectively, were observed among non-hospitalized cases, and 76% (90/118) and 68% (27/37), respectively, among hospitalized cases. Hospitalization, female sex and age were associated with higher risk.
A substantial proportion of non-hospitalized cases with 4-week post-COVID condition often or very often reported cognitive dysfunction, including concentration (33%) or organizing (23%) difficulties, forgetfulness (20%) and loss of necessary items (10%), with no decline at 12 weeks. All four aspects of cognitive dysfunction were 2.2 to 3.0 times more prevalent among cases with post-COVID condition than in controls, but also independently associated with psychological distress and fatigue.
Conclusions and relevance: Post-COVID condition may be a frequent sequela of ambulatory COVID-19 in working-age adults, with important effects on cognition. With so many healthcare workers infected since the beginning of the COVID-19 pandemic, the ongoing implications for quality healthcare delivery could be profound should cognitive dysfunction and other severe post-COVID symptoms persist in a professionally-disabling way over the longer term.
Key points. Question: How common and long-lasting are the physical, psychological and cognitive effects of post-COVID condition in healthcare workers, both hospitalized and non-hospitalized?
Findings: The prevalence of post-COVID condition was 46% at 4 weeks and 40% at 12 weeks among non-hospitalized cases, and 76% and 68% among hospitalized cases. One third of non-hospitalized healthcare workers with post-COVID condition reported cognitive impairment, which was independently associated with persistent physical symptoms, but also with psychological distress and fatigue.
Meaning: Persistent cognitive and other professionally-disabling sequelae of COVID-19 in essential workers could have critical implications for quality healthcare delivery during and after the pandemic. | infectious diseases
10.1101/2022.03.01.22271584 | Quality control of Staphylococcus aureus in Rio Grande and Menonita Criollo cheeses from the Alejo Calatayud Market | Introduction: The Staphylococcus aureus count is one of the indicators of the microbiological quality of cheeses according to the parameters of Bolivian Standard 32004, since this bacterium can cause food poisoning.
Objective: To determine the presence and quantity of Staphylococcus aureus in Rio Grande and Menonita Creole cheeses in the Alejo Calatayud Market in the city of Cochabamba during the month of July 2019, according to the parameters established by Bolivian Standard 32004.
Methods: Eight samples taken from different, randomly selected cheeses from the market were subjected to quantitative analysis of Staphylococcus aureus (CFU/g) in three different dilutions per sample, giving a total of 24 analyses; 13 were discarded because colony counting was not possible, and the remaining samples were subjected to confirmatory tests. The parameters established in NB32004 were used to grade the results of the analyses and tests.
Results: In 100% of the samples (8 of 8), S. aureus was present in quantities above the acceptable CFU/g, indicating deficient hygienic-sanitary quality of the ripened Creole cheeses sold in the Alejo Calatayud Market.
Conclusion: Creole cheeses marketed in the Alejo Calatayud Market have a high degree of contamination by S. aureus, do not comply with the parameters of NB32004, and expose the population to food poisoning. | infectious diseases
10.1101/2022.03.07.22272016 | Premature Cardiovascular Death and its Modifiable Risk Factors: Protocol of a Systematic Review and Meta-Analysis | Introduction: The burden of cardiovascular diseases (CVDs), measured in premature deaths, continues to increase alarmingly worldwide, especially in countries outside the high-income group. An accurate estimate of premature CVD death is of crucial importance for planning, implementing, and evaluating cardiovascular prevention and care interventions. However, the existing literature reports evidence on premature CVD death only from selected regions, and few systematic reviews with meta-analysis have estimated premature death globally. This paper reports the protocol for a systematic review and meta-analysis to derive solid and updated estimates of global and setting-specific premature CVD death prevalence and its associated risk factors.
Methods and analysis: PubMed, Embase, Web of Science, CINAHL, and the Cochrane Central Register of Controlled Trials (CENTRAL) will be used as the literature databases. Retrieved records will be independently screened by two authors, and relevant data will be extracted from studies that report data on premature mortality (death before the age of 70 years) related to CVD. Study selection and reporting will follow the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guideline. Pooled estimates of premature CVD mortality (based on the standardised mortality ratio and years of life lost) and the effect size of modifiable risk factors will be computed applying random-effects meta-analysis. Heterogeneity among selected studies will be assessed using the I2 statistic and explored through meta-regression and subgroup analyses. Depending on data availability, we propose to conduct subgroup analyses by geographical area, CVD events, and socio-demographic variables of interest. The risk of bias for the studies included in the systematic review or meta-analysis will be assessed using the Newcastle-Ottawa Quality Assessment Scale.
Ethics and dissemination: Ethics approval is not required as the data used in this systematic review will be extracted from published studies. The systematic review will focus on the premature CVD mortality rate and its associated factors. The findings of the final report will be disseminated to the scientific community through publication in a peer-reviewed journal and presentation at conferences.
PROSPERO registration number: CRD42021288415 | cardiovascular medicine
10.1101/2022.03.07.22272055 | Emergence and Spread of the SARS-CoV-2 Omicron Variant in Alberta Communities Revealed by Wastewater Monitoring | Wastewater monitoring of SARS-CoV-2 allows for early detection and monitoring of COVID-19 burden in communities and can track specific variants of concern. Targeted assays enabled relative proportions of SARS-CoV-2 Omicron and Delta variants to be determined across 30 municipalities covering >75% of the province of Alberta (pop. 4.5M) in Canada, from November 2021 to January 2022. Larger cities like Calgary and Edmonton exhibited a more rapid emergence of Omicron relative to smaller and more remote municipalities. Notable exceptions were Banff, a small international resort town, and Fort McMurray, a more remote northern city with a large fly-in worker population. The integrated wastewater signal revealed that the Omicron variant represented close to 100% of SARS-CoV-2 burden prior to the observed increase in newly diagnosed clinical cases throughout Alberta, which peaked two weeks later. These findings demonstrate that wastewater monitoring offers early and reliable population-level results for establishing the extent and spread of emerging pathogens including SARS-CoV-2 variants. | epidemiology |
10.1101/2022.03.07.22272059 | Evaluation of the Causal Relationship Between Smoking and Schizophrenia in Asia | Cigarette smoking has been suggested to be associated with the risk of schizophrenia (SCZ) in observational studies. A significant causal effect of smoking on SCZ has been reported in the European population using the Mendelian randomization (MR) approach; however, no evidence of causality was found in participants from East Asia (EAS). Using the Taiwan Biobank (TWBB, sample size up to 79,989), we conducted genome-wide association studies (GWAS) to identify susceptibility loci for smoking behavior, which included the initiation of smoking and the onset age. To maximize the power of genetic discovery in the EAS population, we meta-analyzed GWAS from the TWBB and Biobank Japan (BBJ, sample size up to 165,436) for smoking traits. The GWAS for SCZ was taken from the Asia Psychiatric Genomics Consortium, which included 22,778 cases and 35,362 controls. We performed a two-sample MR to estimate the causality of smoking behavior on SCZ in the EAS population. In TWBB, we identified one novel locus that met genome-wide significance for onset age. In a meta-analysis of TWBB and BBJ, we identified two novel loci for smoking initiation. In MR, a marginal significance was found for the causality of smoking initiation on SCZ (odds ratio (OR) = 4.00, 95% confidence interval (CI) = 0.89-18.01, P = 0.071). Later onset age for smoking was causally associated with a lower risk of SCZ (OR for a per-year increase in onset = 0.96, 95% CI = 0.91-1.01) with a marginal significance (P = 0.098). | epidemiology |
10.1101/2022.03.07.22272049 | Growth monitoring and promotion service utilization and associated factors among children under-two years of age in Samara-logia city of Afar Region, Northeast Ethiopia | Introduction: Utilizing growth monitoring and promotion (GMP) services during the first two years after birth helps to detect common childhood health problems (e.g., malnutrition and infections) at early stages and offers an opportunity for promoting education and nutritional counselling. However, no previously published study has investigated the level of GMP utilization and associated factors among pastoralist Ethiopian mothers, including in the Afar National and Regional State (ANRS), where under-nutrition is the major cause of childhood morbidity and mortality. This is the first study aiming to investigate the utilization of, and factors associated with, GMP services for infants and young children in the ANRS of Ethiopia.
Methods: A community-based cross-sectional study was conducted from May to June 202 in the Samara-logia city administration. A total of 416 children aged under two years were selected using a random sampling technique, and an interviewer-administered questionnaire was used to collect data. Multivariable logistic regression was applied to examine the influence of explanatory variables (including socio-demographic, obstetric and health service, and health literacy factors) on the utilization of GMP services.
Results: The overall utilization of GMP services among infants and children was 15.9% (95% CI: 12.0% to 19.5%). Children whose fathers had attained college or higher schooling were more likely to utilize GMP services (AOR = 12.40; 95% CI: 4.15, 37.4). Children residing in households with a higher number of children were less likely to utilize GMP services (AOR = 0.10; 95% CI: 0.03, 0.26 for households with 3-4 children and AOR = 0.16; 95% CI: 0.05, 0.48 for households with 4+ children). The odds of GMP service utilization were significantly higher among children who received postnatal care (AOR = 9.68; 95% CI: 3.57, 26.20).
Conclusion: The utilization of GMP services is too low to support the reduction of child morbidity and mortality attributed to malnutrition among infants and children in Ethiopia. Children whose fathers had a higher level of schooling and those who received postnatal care were more likely to utilize GMP, while those from households with a higher number of children were less likely to do so. Our findings suggest strengthening GMP services in Ethiopia, with focused efforts on the modifiable associated factors. | epidemiology
10.1101/2022.03.07.22272026 | Trends, regional variation and clinical characteristics of recipients of antivirals and neutralising monoclonal antibodies for non-hospitalised COVID-19: a descriptive cohort study of 23.4 million people in OpenSAFELY | Background: From December 16th 2021, antivirals and neutralising monoclonal antibodies (nMABs) were available to treat high-risk non-hospitalised patients with COVID-19 in England.
Aims: To develop a framework for detailed near real-time monitoring of treatment deployment, to ascertain eligibility status for patients and to describe trends and variation in coverage of treatment between geographic, clinical and demographic groups.
Methods: With the approval of NHS England, we conducted a retrospective cohort study using routine clinical data from 23.4m people in the OpenSAFELY-TPP database, approximately 40% of England's population. We implemented national eligibility criteria and generated descriptive statistics with detailed clinical, demographic and geographic breakdowns for patients receiving an antiviral or nMAB.
Results: We identified 50,730 non-hospitalised patients with COVID-19 between 11th December 2021 and 23rd February 2022 who were potentially eligible for antiviral and/or nMAB treatment. 6420 (15%) received treatment (sotrovimab 3600 (56%); molnupiravir 2680 (42%); nirmatrelvir/ritonavir (Paxlovid) 80 (1%); casirivimab 50 (1%); and remdesivir <5). The proportion treated varied by risk group, with the lowest proportion treated in those with liver disease (10%; 95% CI 9-11). Treatment type also varied, with molnupiravir favoured over sotrovimab in only two high risk cohorts: Down syndrome (67%; 95% CI 59-74) and HIV/AIDS (63%; 95% CI 56-70). The proportion treated varied by ethnicity, from White (14%; 95% CI 13-14) or Asian (13%; 95% CI 12-14) to Black (9%; 95% CI 8-11); by NHS Regions (from 6% (95% CI 5-6) in Yorkshire and the Humber to 17% (95% CI 16-18) in the East of England); and by rurality from 16% (95% CI 14-17) in "Rural - village and dispersed" to 10% (95% CI 10-11) in "Urban - conurbation". There was also lower coverage among care home residents (4%; 95% CI 3-4), those with dementia (4%; 95% CI 3-5), those with sickle cell disease (7%; 95% CI 5-8), and in the most socioeconomically deprived areas (9%; 95% CI 8-9, vs least deprived: 15%; 95% CI 15-16). Patients who were housebound, or who had a severe mental illness had a slightly reduced chance of being treated (10%; 95% CI 8-11 and 10%; 95% CI 8-12, respectively). Unvaccinated patients were substantially less likely to receive treatment (5%; 95% CI 4-6).
Conclusions: Using the OpenSAFELY platform, we have developed and delivered a rapid, near real-time data-monitoring framework for the roll-out of antivirals and nMABs in England that can deliver detailed coverage reports in fine-grained clinical and demographic risk groups, using publicly auditable methods, using linked but pseudonymised patient-level NHS data in a highly secure Trusted Research Environment. Targeted activity may be needed to address apparent lower treatment coverage observed among certain groups, in particular (at present): different NHS regions, socioeconomically deprived areas, and care homes. | primary care research
10.1101/2022.03.07.22271986 | Regional, circuit, and network heterogeneity of brain abnormalities in psychiatric disorders | The substantial individual heterogeneity that characterizes mental illness is often ignored by classical case-control designs that rely on group mean comparisons. Here, we present a comprehensive, multiscale characterization of individual heterogeneity of brain changes in 1294 cases diagnosed with one of six conditions and 1465 matched healthy controls. Normative models identified that person-specific deviations from population expectations for regional grey matter volume were highly heterogeneous, affecting the same area in <7% of people with the same diagnosis. However, these deviations were embedded within common functional circuits and networks in up to 56% of cases. The salience/ventral attention system was implicated transdiagnostically, with other systems selectively involved in depression, bipolar disorder, schizophrenia, and ADHD. Our findings indicate that while phenotypic differences between cases assigned the same diagnosis may arise from heterogeneity in the location of regional deviations, phenotypic similarities are attributable to dysfunction of common functional circuits and networks. | psychiatry and clinical psychology |
10.1101/2022.03.07.22272007 | Reconstructing subdistrict-level population denominators in Yemen after six years of armed conflict and forced displacement | Introduction: Yemen has experienced widespread insecurity since 2014, resulting in large-scale internal displacement. In the absence of reliable vital events registration, we tried to reconstruct the evolution of Yemen's population between June 2014 and September 2021, at subdistrict (administrative level 3) resolution, while accounting for growth and internal migration.
Methods: We reconstructed subdistrict-month populations starting from June 2014 WorldPop gridded estimates, as a function of assumed birth and death rates, estimated changes in population density, net internal displacement to and from the subdistrict, and assumed overlap between internal displacement and WorldPop trends. Available displacement data from the Displacement Tracking Matrix (DTM) project were subjected to extensive cleaning and imputation to resolve missingness, including through machine learning models informed by predictors such as insecurity. We also modelled the evolution of displaced groups before and after assessment points. To represent parameter uncertainty, we complemented the main analysis with sensitivity scenarios.
Results: We estimated that Yemen's population rose from about 26.3M to 31.1M during the seven-year analysis period, with considerable pattern differences at the sub-national level. We found that some 10 to 14M Yemenis may have been internally displaced during 2015-2016, about five times United Nations estimates. By contrast, we estimated that the internally displaced population had declined to 1-2M by September 2021.
Conclusions: This analysis illustrates approaches to analysing the dynamics of displacement, and the application of different models and data streams to supplement incomplete ground observations. Our findings are subject to limitations related to data quality, model inaccuracy and the omission of migration outside Yemen. We recommend adaptations to the DTM project to enable more robust estimation. | public and global health
10.1101/2022.03.07.22272009 | VinDr-Mammo: A large-scale benchmark dataset for computer-aided diagnosis in full-field digital mammography | Mammography, or breast X-ray, is the most widely used imaging modality to detect cancer and other breast diseases. Recent studies have shown that deep learning-based computer-aided detection and diagnosis (CADe/x) tools can support physicians and improve the accuracy of mammography interpretation. However, most published mammography datasets are either limited in sample size or digitized from screen-film mammography (SFM), hindering the development of CADe/x tools that rely on full-field digital mammography (FFDM). To overcome this challenge, we introduce VinDr-Mammo, a new benchmark FFDM dataset for detecting and diagnosing breast cancer and other diseases in mammography. The dataset consists of 5,000 mammography exams, each of which has four standard views and is double read, with any disagreement resolved by arbitration. It is created for the assessment of Breast Imaging Reporting and Data System (BI-RADS) categories and density at the breast level. In addition, the dataset also provides the category, location, and BI-RADS assessment of non-benign findings. We make VinDr-Mammo publicly available on https://physionet.org/ as a new imaging resource to promote advances in developing CADe/x tools for breast cancer screening. | radiology and imaging