id | title | abstract | category
---|---|---|---|
10.1101/2022.04.01.22273326 | Ablation of Apparent Diffusion Coefficient Hyperintensity Clusters in Mesial Temporal Lobe Epilepsy Improves Seizure Outcomes after Laser Interstitial Thermal Therapy | Objective: MR-guided Laser Interstitial Thermal Therapy (LiTT) is a minimally invasive surgical procedure for intractable mesial temporal lobe epilepsy (mTLE). LiTT is safe and effective, but seizure outcomes are highly variable due to patient variability, suboptimal targeting, and incomplete ablation of the epileptogenic zone. The Apparent Diffusion Coefficient (ADC) is an MRI-derived measure that can identify potential epileptogenic foci in the mesial temporal lobe to improve ablation and seizure outcome. The objective of this study was to investigate whether ablation of tissue clusters with high ADC values in the mesial temporal structures is associated with seizure outcome in mTLE after LiTT.
Methods: Thirty mTLE patients who underwent LiTT at our institution were analyzed. Seizure outcome was categorized as complete seizure freedom (ILAE Class I) or residual seizures (ILAE Class II-VI). Volumes of the hippocampus and amygdala were segmented from the preoperative T1 MRI sequence. Spatially distinct hyperintensity clusters were identified in the preoperative ADC map. The percentages of cluster volume and cluster number ablated were tested for association with seizure outcomes.
Results: The mean age at surgery was 36.6 years and mean follow-up duration was 1.9 years. Proportions of hippocampal cluster volume (35.20% vs. 16.5%, p = 0.014) and cluster number (27.1% vs. 4.2%, p = 0.0007) ablated were significantly higher in patients with seizure freedom. For amygdala clusters, only the proportion of cluster number ablated was significantly associated with seizure outcome (13.2% vs. 0%, p = 0.016). Ablation of hippocampal clusters predicted seizure outcome, both by volume (AUC = 0.7679) and number (AUC = 0.8086) ablated.
Significance: Seizure outcome after LiTT in mTLE patients was significantly associated with the extent of cluster ablation in the hippocampus and amygdala. The results suggest that preoperative ADC analysis may help identify high-yield pathological tissue clusters that represent epileptogenic foci. ADC-based cluster analysis can potentially assist ablation targeting and improve seizure outcome after LiTT in mTLE. | neurology |
10.1101/2022.03.31.22272079 | Predicting the future development of diabetic retinopathy using a deep learning algorithm for the analysis of non-invasive retinal imaging | Diabetic retinopathy (DR) is the most common cause of vision loss in the working-age population. While over 90% of sight-threatening cases may be treated if detected early, uptake of yearly screening for early detection is low until advanced presentation of the disease. We developed a machine learning algorithm for the prediction of future DR development using fundus photography of otherwise healthy eyes. Our algorithm achieves 0.81 Area Under the Receiver Operating Curve (AUC) when averaging scores from multiple images on the task of predicting development of referable DR, and 0.76 AUC when using a single image. In conclusion, risk of DR may be predicted from fundus photography alone. Prediction of personalized risk of DR may become key in treatment and contribute to patient compliance across the board. Further prospective research is necessary. | ophthalmology |
10.1101/2022.03.31.22273228 | Assessing the potential cost-effectiveness of centralized vs point-of-care testing for hepatitis C virus in Pakistan: a model-based comparison | Background: Pakistan has a hepatitis C virus (HCV) infection prevalence of 6-9% and aims to achieve World Health Organization (WHO) targets for elimination of HCV by the year 2030 through scaling up HCV diagnosis and accelerating access to care. The clinical and economic benefits of various HCV testing strategies have not yet been evaluated in Pakistan.
Objective: To evaluate the potential cost-effectiveness of a reference laboratory-based (CEN) confirmatory testing approach vs a molecular near-patient point-of-care (POC) confirmatory approach to screen the general population for HCV in Pakistan.
Methods: We developed a decision-analytic model comparing HCV testing under two scenarios: screening with an anti-HCV antibody test (Anti-HCV) followed by either POC nucleic acid testing (NAT) (Anti-HCV-POC), or reference laboratory NAT (Anti-HCV-CEN), using data from published literature, the Pakistan Ministry of Health, and expert judgment. Outcome measures included: number of HCV infections identified per year, percentage of individuals correctly classified, total costs, average costs per individual tested, and cost-effectiveness. Sensitivity analysis was also performed.
Results: At a national level for a tested population of 25 million, the Anti-HCV-CEN strategy would identify 142,406 more HCV infections in one year and increase correct classification of individuals by 0.57% compared with the Anti-HCV-POC strategy. The total annual cost of HCV testing was reduced using the Anti-HCV-CEN strategy by $7.68 million ($0.31 per person). Thus, incrementally, the Anti-HCV-CEN strategy costs less and identifies more HCV infections than Anti-HCV-POC.
Conclusions: Anti-HCV-CEN would provide the best value for money when scaling up HCV testing in Pakistan.
Significance statement:
- Hepatitis C virus (HCV) infection constitutes a major medical and public health burden in Pakistan
- Widespread testing is important to identify those who are chronically infected in order to link them to treatment services
- The optimal and most cost-effective testing approach to scale up HCV testing to support elimination efforts in Pakistan has not been established
- High-throughput reference laboratory testing would provide the best value for money when scaling up HCV testing in Pakistan
| health economics |
10.1101/2022.03.31.22273239 | Epidemiological topology data analysis links severe COVID-19 to RAAS and hyperlipidemia associated metabolic syndrome conditions | The emergence of COVID-19 created incredible worldwide challenges but offers unique opportunities to understand the physiology of its risk factors and their interactions with complex disease conditions, such as metabolic syndrome. Epidemiological analysis powered by topological data analysis (TDA) is a novel approach to uncover these clinically relevant interactions. Here TDA utilized Explorys data to discover associations among severe COVID-19 and metabolic syndrome, and it explored the probative value of drug prescriptions to capture the involvement of RAAS and hypertension with COVID-19 as well as modification of risk factor impact by hyperlipidemia on severe COVID-19. | health informatics |
10.1101/2022.04.02.22273333 | High rate of BA.1, BA.1.1 and BA.2 in triple vaccinated | Background: Booster vaccine doses offer protection against severe COVID-19 caused by omicron but are less effective against infection. Data on characteristics such as serological correlates of protection, viral abundance, and clearance of omicron infections in triple-vaccinated individuals are scarce.
Methods: We conducted a 4-week, twice-weekly SARS-CoV-2 qPCR screening shortly after an mRNA vaccine booster in 375 healthcare workers. Anti-Spike IgG levels and neutralization titers were determined at study start. qPCR-positive participants were sampled repeatedly for two weeks and monitored for symptoms.
Results: In total, 82 omicron infections (cumulative incidence 22%) were detected, divided between BA.1, BA.1.1 and BA.2. Only 10% of infected participants remained asymptomatic. Viral load peaked at day 3, and live virus could be detected for up to 9 days after the first PCR-positive sample. Presence of symptoms correlated with elevated viral load (p<0.0001), but despite resolution of symptoms most participants showed Ct levels <30 at day 9. While post-booster antibody titers were similar in those with and without subsequent breakthrough infection (p>0.05), high antibody titers were linked to reduced viral load (p<0.01) and shorter time to viral clearance (p<0.01). No significant differences were observed in viral load or time to viral clearance between BA.1, BA.1.1 and BA.2 infected individuals.
Conclusion: We report a high incidence of omicron infections despite recent booster vaccination in triple-vaccinated individuals. Vaccine-induced antibody titers seem to play a limited role in the risk of omicron infection. High viral load and shedding of live virus for up to nine days may increase transmission in a triple-vaccinated population. | infectious diseases |
10.1101/2022.03.31.22273111 | Predictors of all-cause mortality among patients hospitalized with influenza, respiratory syncytial virus, or SARS-CoV-2 | Background: Shared and divergent predictors of clinical severity across respiratory viruses may support clinical and community responses in the context of a novel respiratory pathogen.
Methods: We conducted a retrospective cohort study to identify predictors of 30-day all-cause mortality following hospitalization with influenza (N=45,749; 2011-09 to 2019-05), respiratory syncytial virus (RSV; N=24,345; 2011-09 to 2019-04), or severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2; N=8,988; 2020-03 to 2020-12; pre-vaccine) using population-based health administrative data from Ontario, Canada. Multivariable modified Poisson regression was used to assess associations between potential predictors and mortality. We compared the direction, magnitude, and confidence intervals of risk ratios to identify shared and divergent predictors of mortality.
Results: 3,186 (7.0%), 697 (2.9%), and 1,880 (20.9%) patients died within 30 days of hospital admission with influenza, RSV, and SARS-CoV-2, respectively. Shared predictors of increased mortality included older age, male sex, residence in a long-term care home, and chronic kidney disease. Positive associations between age and mortality were largest for patients with SARS-CoV-2. Few comorbidities were associated with mortality among patients with SARS-CoV-2 as compared to those with influenza or RSV.
Conclusions: Our findings may help identify patients at greatest risk of illness secondary to a respiratory virus, anticipate hospital resource needs, and prioritize local prevention and therapeutic strategies to communities with higher prevalence of risk factors. | infectious diseases |
10.1101/2022.03.31.22273171 | Unravelling the transcriptome of the human tuberculosis lesion and its clinical implications | Tuberculosis (TB) causes up to 1.5 million deaths every year and represents an important public health problem worldwide. Here, we quantified the gene expression signatures of granuloma biopsies across human TB pulmonary lesions and validated the best gene candidates using NanoString technology, profiling 157 samples from 40 TB patients who underwent surgery.
We characterised the transcriptional profile of the TB granuloma in comparison to healthy tissue, described an 11-gene signature, and measured 7 associated proteins in plasma. We demonstrated a gradient of immune-related transcript abundance across the granuloma substructure and evidenced metabolically active Mycobacterium tuberculosis in the lesions. Patients who converted to sputum negative after two months of treatment showed enriched inflammatory pathways in the lesion several months later, supporting the use of sputum culture conversion (SCC) as a prognostic biomarker during clinical management and as a factor to prioritise patients when considering lung surgery. | infectious diseases |
10.1101/2022.03.31.22272957 | Clinical and Economic Impact of Differential COVID-19 Vaccine Effectiveness in the United States | Background: In the United States (US), three vaccines are currently available for primary vaccination and booster doses to prevent coronavirus disease 2019 (COVID-19): the 2-dose messenger ribonucleic acid (mRNA) BNT162b2 (COMIRNATY®, Pfizer Inc) and mRNA-1273 (SPIKEVAX®, Moderna Inc) vaccines, which are preferred by the Centers for Disease Control and Prevention's (CDC) Advisory Committee on Immunization Practices (ACIP), and the adenovirus vector Ad26.COV2.S (Johnson & Johnson) vaccine. A substantial body of evidence has now been published on the real-world effectiveness and waning of the primary series and booster doses against specific SARS-CoV-2 variants. The study objective was to determine the clinical and economic impact of differences in effectiveness between mRNA-1273 and BNT162b2 booster vaccinations over one year (2022) in US adults ≥18 years.
Methods: A decision-analytic model was used to compare three mRNA booster market share scenarios: (1) Current Scenario, where the booster mix observed in December 2021 continues throughout 2022; (2) mRNA-1273 Scenario, where the only booster administered in 2022 is mRNA-1273; and (3) BNT162b2 Scenario, where the only booster administered in 2022 is BNT162b2. Analyses were performed from the US healthcare system perspective. Sensitivity analyses were performed to explore the impact of COVID-19 incidence in the unvaccinated population and vaccine effectiveness (VE) on model results.
Results: In the Current Scenario, the model predicts 65.2 million outpatient visits, 3.4 million hospitalizations, and 636,100 deaths from COVID-19 in 2022. The mRNA-1273 Scenario reduced each of these outcomes compared to the Current Scenario. Specifically, 684,400 fewer outpatient visits, 48,700 fewer hospitalizations, and 9,500 fewer deaths would be expected. Exclusive of vaccine costs, the mRNA-1273 Scenario is expected to decrease direct medical costs by $1.3 billion. Conversely, the BNT162b2 Scenario increased outcomes compared to the Current Scenario: specifically, 391,500 more outpatient visits, 34,500 more hospitalizations, and 7,200 more deaths would be expected in 2022, costing an additional $946 million in direct medical costs. For both the mRNA-1273 and BNT162b2 booster scenarios, the percent change in direct treatment costs for COVID-19 is similar to the percent change in hospitalizations, as the rate of hospitalizations is the driver of the overall costs.
Changing the number of projected COVID-19 cases in 2022 by varying the incidence rate has a direct effect on model outcomes: higher incidence rates lead to more outpatient visits, hospitalizations, and deaths in all scenarios. Varying VE has an inverse effect on model outcomes: all outcomes increase when VE is lower for all vaccines and decrease when VE is higher. In all cases, additional use of mRNA-1273 leads to fewer infection outcomes, while additional use of BNT162b2 results in more infection outcomes.
Conclusion: As the real-world effectiveness evidence to date indicates that mRNA-1273 may be more effective at preventing COVID-19 infection and hospitalization over time than BNT162b2, increasing the proportion of people receiving it as a booster is expected to reduce COVID-19-related outcomes and costs in 2022, regardless of COVID-19 incidence or variant. | infectious diseases |
10.1101/2022.04.03.22273353 | Does Metformin Decrease Mortality in Patients with Type 2 Diabetes Mellitus Hospitalized for COVID-19? A Multivariable and Propensity Score-adjusted Meta-analysis | Aims: Coronavirus disease 2019 (COVID-19) is a pandemic that the entire world has been facing since December 2019. Increasing evidence has shown that metformin is linked to favorable outcomes in patients with COVID-19. The aim of this study was to address whether outpatient or inpatient metformin therapy is associated with lower in-hospital mortality in patients with type 2 diabetes mellitus hospitalized for COVID-19.
Methods: We searched studies published in PubMed, Embase, Google Scholar, and Cochrane Library up to October 1, 2021. Raw event data extracted from individual studies were pooled using the Mantel-Haenszel approach. Odds ratios (OR) or hazard ratios (HR) adjusted for covariates that potentially confound the association, using multivariable regression or propensity score matching, were pooled by the inverse-variance method. Random-effects models were applied for meta-analysis due to variation among studies.
Results: Nineteen retrospective observational studies were selected. The pooled unadjusted OR for outpatient metformin therapy and in-hospital mortality was 0.54 (95% CI, 0.42-0.68), whereas the pooled OR adjusted with multivariable regression or propensity score matching was 0.72 (95% CI, 0.47-1.12). The pooled unadjusted OR for inpatient metformin therapy and in-hospital mortality was 0.19 (95% CI, 0.10-0.36), whereas the pooled adjusted HR was 1.10 (95% CI, 0.38-3.15).
Conclusions: Our results suggest a significant reduction of in-hospital mortality with metformin therapy in patients with type 2 diabetes mellitus hospitalized for COVID-19 in the unadjusted analysis, but this mortality benefit does not persist after adjustment for confounding bias. | endocrinology |
10.1101/2022.04.02.22273341 | Variation in National COVID-19 Mortality Rates Across Asian Subgroups in the United States, 2020 | Provisional U.S. national COVID-19 mortality data for the year 2020 analyzed by the CDC in March 2021 indicated that non-Hispanic Asians fared markedly better overall than other racial/ethnic minority groups, and marginally better than non-Hispanic Whites, in terms of age-adjusted mortality rates. However, Asians in the United States comprise a diverse array of origin subgroups with highly varying social, economic, and environmental experiences, which influence health outcomes. As such, lumping all Asians together into a single category can mask meaningful health disparities among more vulnerable Asian subgroups. To date, there has not been a national-level analysis of COVID-19 mortality outcomes between Asian subgroups. Utilizing final multiple-cause-of-death data for 2020 and population projections from the U.S. Census Bureau's Current Population Survey Annual Social and Economic Supplement for 2020, crude and age-adjusted national COVID-19 mortality rates, both overall and stratified by sex, were calculated for the six major single-race Asian origin subgroups (Asian Indian, Chinese, Filipino, Japanese, Korean, and Vietnamese) and a catch-all seventh category comprising the remaining Asian subgroups (Other Asians), contrasting them with the corresponding mortality rates of other racial/ethnic groups. A substantially more nuanced picture emerges when disaggregating Asians into these diverse origin subgroups and stratifying by sex, with Filipino males and Asian males outside of the six major Asian subgroups in particular experiencing markedly higher age-adjusted mortality rates than their White male counterparts, whether comparisons were restricted to their non-Hispanic subsets or not. During the COVID-19 pandemic and in the post-pandemic recovery, it is imperative not to overlook the health needs of vulnerable Asian populations.
Public health strategies to mitigate the effects of COVID-19 must avoid viewing Asians as a monolithic entity and recognize the heterogeneous risk profiles within the U.S. Asian population. | public and global health |
10.1101/2022.03.31.22273106 | Safety, Immunologic Effects and Clinical Response in a Phase I Trial of Umbilical Cord Mesenchymal Stromal Cells in Patients with Treatment Refractory Systemic Lupus Erythematosus | Background: Reports of clinical improvement following mesenchymal stromal cell (MSC) infusions in refractory lupus patients at a single center in China led us to perform an explorative Phase I trial of umbilical cord-derived MSCs in patients refractory to six months of immunosuppressive therapy.
Methods: Six women with a SLEDAI >6, having failed standard-of-care therapy, received one IV infusion of 1×10^6 MSCs/kg of body weight. They maintained their current immunosuppressives, but their physician was allowed to adjust corticosteroids initially for symptom management. The clinical endpoint was an SRI of 4 with no new BILAG A scores and no increase in Physician Global Assessment score of >0.3, with tapering of prednisone to 10 mg or less by 20 weeks.
Results: Of 6 patients, 5 (83.3%; 95% CI = 35.9% to 99.6%) achieved the clinical endpoint of an SRI of 4. Adverse events were minimal. Mechanistic studies revealed significant reductions in CD27 IgD-negative B cells, switched memory B cells, and activated naive B cells, with increased transitional B cells, in the 5 patients who met the endpoint. There was a trend towards decreased autoantibody levels in specific patients. One patient had an increase in their Helios+ Treg cells, but no other significant T cell changes were noted. GARP-TGFβ complexes were significantly increased following the MSC infusions. The B cell changes and the GARP-TGFβ increase were significantly correlated with SLEDAI scores.
Conclusion: This pilot trial suggests that UC MSC infusions are safe and may have efficacy in lupus. The B cell and GARP-TGFβ changes provide insight into mechanisms by which MSCs may impact disease.
Trial Registration: NCT03171194
Funding: This study was funded by a grant from the Lupus Foundation of America and NIH UL1 RR029882 | rheumatology |
10.1101/2022.04.02.22272992 | Associations of current and childhood socioeconomic status and health outcomes amongst patients with knee or hip osteoarthritis in a Mexico City family-practice setting | Objectives: To examine the association of current and childhood socioeconomic status (SES) with patient-reported functional status, quality of life, and disability in patients with knee osteoarthritis (OA).
Methods: We conducted a cross-sectional study amongst individuals seeking care for any medical reason in a primary care family-practice clinic in Mexico City. We included individuals with self-reported doctor-diagnosed arthritis and administered a survey using validated Spanish-language versions of the Western Ontario and McMaster Universities Osteoarthritis Index (WOMAC), the Osteoarthritis of Lower Limbs and Quality of Life (AMICAL), and the Stanford Health Assessment Questionnaire-Disability Index (HAQ-DI). To estimate current and childhood SES, we used a validated tool to estimate income quintile, as well as education level and occupation type, for both the patient and their parents.
Results: We recruited 154 patients and excluded 8. Estimated income and education levels were correlated with WOMAC, AMICAL, and HAQ-DI scores, and significant differences were found in all scores by occupation type. The association between estimated income and all scores remained significant independent of age, sex, BMI, and the presence of diabetes or hypertension. Maternal education was best correlated with AMICAL scores, though its effect seemed largely mediated by its association with current SES measures.
Conclusions: Current and, to a lesser extent, childhood socioeconomic status impacts functional status, quality of life, and disability amongst OA patients in Mexico City. Awareness of life-course SES can help identify patients at risk for worse outcomes. | rheumatology |
10.1101/2022.04.01.22273235 | Decoding mitochondrial genes in pediatric AML and development of a novel prognostic mitochondrial gene signature | Background: The gene expression profile of mitochondria-related genes is not well deciphered in pediatric acute myeloid leukaemia (AML). We aimed to identify mitochondria-related differentially expressed genes (DEGs) in pediatric AML and assess their prognostic significance.
Methods: Children with de novo AML were included prospectively between July 2016 and December 2019. Transcriptomic profiling was done for a subset of samples, stratified by mtDNA copy number. Top mitochondria-related DEGs were identified and validated by real-time PCR. A prognostic gene signature risk score was formulated using DEGs independently predictive of overall survival (OS) in multivariable analysis. Predictive ability of the risk score was estimated, along with external validation in The Cancer Genome Atlas (TCGA) AML dataset.
Results: In 143 children with AML, twenty mitochondria-related DEGs were selected for validation, of which 16 were found to be significantly dysregulated. Upregulation of SDHC (p<0.001) and CLIC1 (p=0.013) and downregulation of SLC25A29 (p<0.001) were independently predictive of inferior OS and were included in the prognostic risk score. The risk score model was independently predictive of survival over and above ELN risk categorization (Harrell's c-index: 0.675). High-risk patients (risk score above the median) had significantly inferior OS (p<0.001) and event-free survival (p<0.001); they were associated with poor-risk cytogenetics (p=0.021), the ELN intermediate/poor risk group (p=0.016), absence of RUNX1-RUNX1T1 (p=0.027), and not attaining remission (p=0.016). On external validation, the risk score also predicted OS (p=0.019) in the TCGA dataset.
Conclusion: We identified and validated mitochondria-related DEGs with prognostic impact in pediatric AML and developed a novel, externally validated 3-gene signature predictive of survival. | oncology |
10.1101/2022.03.31.22273144 | Associations between e-cigarette use and e-cigarette flavors with cigarette smoking quit attempts and quit success: Evidence from a US large, nationally representative 2018-2019 survey | Objectives: While many studies have examined the association between e-cigarette use and smoking cessation, fewer have considered the impact of e-cigarette flavors on cessation outcomes. This study extends previous studies by examining the effects of e-cigarette use and e-cigarette flavors on smoking quit attempts and quit success.
Methods: We used data from the 2018-2019 Tobacco Use Supplement-Current Population Survey (TUS-CPS). Multivariable logistic regression analyses were used to investigate the associations of e-cigarette and flavor use with quit attempts among individuals who smoked 12 months ago, and with quit success. Two definitions of current e-cigarette use were considered: use every day or some days vs. use on 20+ days in the past 30 days.
Results: Compared to those not using e-cigarettes, current every-day or some-day e-cigarette use with all non-tobacco flavors had an adjusted odds ratio (AOR) of 2.9 (95% CI: 2.4-3.5) for quit attempts and 1.7 (95% CI: 1.3-2.2) for quit success. E-cigarette use on 20+ days with flavors had stronger associations with quit attempts (AOR=4.2, 95% CI: 3.1-5.5) and quit success (AOR=4.0, 95% CI: 2.9-5.4). E-cigarette users with non-tobacco flavors were more likely to succeed in quitting compared to those exclusively using non-flavored or tobacco-flavored e-cigarettes. Menthol/mint flavor users had slightly higher odds of quit attempts and success than users of other non-tobacco flavors.
Conclusions: E-cigarette use is positively associated with both making a smoking quit attempt and quit success. Those using flavored e-cigarettes, particularly menthol/mint, are more likely to quit successfully.
Implications: E-cigarette use is positively associated with both making a quit attempt and quit success, and those using flavored e-cigarettes are more likely to successfully quit smoking, with no statistically significant differences between use of menthol or mint flavored e-cigarettes versus use of other non-tobacco flavored products. This suggests that the potential for e-cigarettes to help people who currently smoke quit could be maintained with the availability of menthol or mint flavored e-cigarettes, even if other non-tobacco flavored products, which are associated with e-cigarette use among youth, were removed from the market. | epidemiology |
10.1101/2022.04.03.22272610 | Cardiac impairment in Long Covid 1-year post-SARS-CoV-2 infection | Background: Long Covid is associated with multiple symptoms and impairment in multiple organs. Cardiac impairment has been reported to varying degrees by varying methodologies in cross-sectional studies. Using cardiac magnetic resonance (CMR), we investigated the 12-month trajectory of cardiac impairment in individuals with Long Covid.
Methods: 534 individuals with Long Covid underwent baseline CMR (T1 and T2 mapping, cardiac mass, volumes, function, and strain) and multi-organ MRI at 6 months (IQR 4.3, 7.3) since first post-COVID-19 symptoms, and 330 were rescanned at 12.6 months (IQR 11.4, 14.2) if abnormal findings were reported at baseline. Symptoms, standardised questionnaires, and blood samples were collected at both timepoints. Cardiac impairment was defined as one or more of: low left or right ventricular ejection fraction (LVEF and RVEF), high left or right ventricular end-diastolic volume (LVEDV and RVEDV), low 3D left ventricular global longitudinal strain (GLS), or elevated native T1 in ≥3 cardiac segments. A significant change over time was reported by comparison with 92 healthy controls.
Results: The technical success of this multiorgan assessment in non-acute settings was 99.1% at baseline and 98.3% at follow-up, with 99.6% and 98.8% for CMR, respectively. Of individuals with Long Covid, 102/534 (19%) had cardiac impairment at baseline; 71/102 had complete paired data at 12 months. Of those, 58% presented with ongoing cardiac impairment at 12 months. High-sensitivity cardiac troponin I and B-type natriuretic peptide were not predictive of CMR findings, symptoms, or clinical outcomes. At baseline, low LVEF, high RVEDV, and low GLS were associated with cardiac impairment. Low LVEF at baseline was associated with persistent cardiac impairment at 12 months.
Conclusion: Cardiac impairment, other than myocarditis, is present in 1 in 5 individuals with Long Covid at 6 months, persisting in over half of those at 12 months. Cardiac-related blood biomarkers are unable to identify cardiac impairment in Long Covid. Subtypes of disease (based on symptoms, examination, and investigations) and predictive biomarkers are yet to be established. Interventional trials with pre-specified subgroup analyses are required to inform therapeutic options. | cardiovascular medicine |
10.1101/2022.04.04.22272731 | Estimating the number of breakthrough COVID-19 deaths in the United States | While there is compelling evidence of the effectiveness of COVID-19 vaccines, increasing attention has also been paid to the fact that, like all vaccines, they are not 100% effective. Therefore, some fully vaccinated people have developed "breakthrough" cases of COVID-19, and some of these individuals have died as a result. The purpose of this study was to estimate the number of fully vaccinated or "breakthrough" deaths from COVID-19 in the United States. Data were compiled from state COVID-19 dashboards and various other sources for as many states as possible. As of March 27, 2022, based on data from 46 U.S. states and the District of Columbia, an estimated minimum of 57,617 breakthrough COVID-19 deaths had occurred in the United States. Furthermore, based on these incomplete data, a total of 12.8% of all COVID-19 deaths in the included regions and time periods were among fully vaccinated individuals (whether boosted or not). Extrapolating these data to the entire United States implies that the minimum total number of such deaths as of March 27, 2022 was 79,917. Data from an MMWR article, if similarly extrapolated to the entire country, imply a significantly larger number of breakthrough deaths throughout the United States: 99,152. | public and global health |
10.1101/2022.04.04.22273336 | Linking cohort data and Welsh routine health records to investigate children at risk of delayed primary vaccination | Background: Delayed primary vaccination is one of the strongest predictors of subsequent incomplete immunisation. Identifying children at risk of such delay may enable targeting of interventions, thus decreasing vaccine-preventable illness.
Objectives: To explore socio-demographic factors associated with delayed receipt of the Diphtheria, Tetanus and Pertussis (DTP) vaccine.
Methods: We included 1,782 children, born between 2000 and 2001, participating in the Millennium Cohort Study (MCS) and resident in Wales, whose parents gave consent for linkage to National Community Child Health Database records at the age-seven-years contact. We examined child, maternal, family and area characteristics associated with delayed receipt of the first dose of the DTP vaccine.
Results: 98.6% of children received the first dose of DTP. The majority, 79.6% (n=1,429), received it on time (between 8 and 12 weeks of age), 14.2% (n=251) received it early (prior to 8 weeks of age) and 4.8% (n=79) were delayed (after 12 weeks of age); 1.4% (n=23) never received it. Delayed primary vaccination was more likely among children with older natural siblings (risk ratio 3.82, 95% confidence interval (1.97, 7.38)), children admitted to special/intensive care (3.15, (1.65, 5.99)), those whose birth weight was >4 kg (2.02, (1.09, 3.73)) and boys (1.53, (1.01, 2.31)). There was a reduced risk of delayed vaccination with increasing maternal age (0.73, (0.53, 1.00) per 5-year increase) and for babies born to graduate mothers (0.27, (0.08, 0.90)).
Conclusions: Although the majority of infants were vaccinated in a timely manner, identification of infants at increased risk of early or delayed vaccination will enable targeting of interventions to facilitate timely immunisation. This is, to our knowledge, the first study exploring individual-level socio-demographic factors associated with delayed primary vaccination in the UK, and it demonstrates the benefits of linking cohort data to routinely collected child health data. | public and global health |
10.1101/2022.04.01.22273315 | Aberrant neurophysiological signaling underlies speech impairments in Parkinson's disease | Difficulty producing intelligible speech is a common and debilitating symptom of Parkinson's disease (PD). Yet, both the robust evaluation of speech impairments and the identification of the affected brain systems are challenging. We examine the spectral and spatial definitions of the functional neuropathology underlying reduced speech quality in patients with PD using a new approach to characterize speech impairments and a novel brain-imaging marker. We found that the interactive scoring of speech impairments in PD (N=59) is reliable across non-expert raters, and better related to the hallmark motor and cognitive impairments of PD than automatically extracted acoustical features. By relating these speech impairment ratings to neurophysiological deviations from healthy adults (N=65), we show that articulation impairments in patients with PD are robustly predicted from aberrant activity in the left inferior frontal cortex, and that functional connectivity of this region with somatomotor cortices mediates the influence of cognitive decline on speech deficits. | neurology |
10.1101/2022.04.01.22272964 | Multimodal Hypersensitivity Derived from Quantitative Sensory Testing Predicts Long-Term Pelvic Pain Outcome | Multimodal hypersensitivity (MMH), defined as greater sensitivity across multiple sensory modalities (e.g., light, sound, temperature, pressure), is hypothesized to be responsible for the development of chronic pain and pelvic pain. However, previous studies of MMH are restricted given their reliance on biased self-report questionnaires, limited use of multimodal quantitative sensory testing (QST), or limited follow-up. Therefore, we conducted multimodal QST on a cohort of 200 reproductive-age women at elevated risk for developing or maintaining chronic pelvic pain conditions and pain-free controls. Pelvic pain self-report was examined over a four-year follow-up period. Multimodal QST comprised visual, auditory, bodily pressure, pelvic pressure, thermal, and bladder testing. A principal component analysis of QST measures resulted in three orthogonal factors that explained 43% of the variance: MMH, pressure stimulus-response, and bladder hypersensitivity. The MMH and bladder hypersensitivity factors correlated with baseline self-reported menstrual pain, genitourinary symptoms, depression, anxiety, and health. Baseline self-report pain ratings were significant predictors of pelvic pain up to three years after assessment but decreased in their predictive ability of pelvic pain outcome over time. In contrast, MMH increased in its predictive ability of pelvic pain outcome over time and was the only factor to predict outcome up to four years later. These results suggest that a "centralized" component of MMH is an important long-term risk factor for pelvic pain. Further research on the modifiability of MMH could provide options for future treatment avenues for chronic pain. | pain medicine |
10.1101/2022.03.31.22273250 | Assessment of Referral System on Maternal Services in Cagayan De Oro City, Philippines | A retrospective document analysis study aimed to assess the referral system on maternal services based on the referral forms of referred women of reproductive age from the referring facilities to the receiving hospital from January to December 2019. The specific objectives of the study were to describe the use of standard referral forms, the compliance of healthcare workers in using the standard form based on 14 criteria, and the utilization of relevant data related to maternal services based on 16 criteria. There were 3,330 referral forms, in different formats, received by the receiving facility during the study period. A random sample of 384 referral forms was used as the study population. Among the 384 sampled referral forms, only 126 (31.8%) used the standard referral form. The compliance of these referral forms with the 14 criteria showed that 116 (92.06%) referral forms complied with only 51-75% of the criteria, and none of the referral forms complied with all 14 criteria. On assessing the data entries among the 384 referral forms with different formats, there were six data entries consistently used more than 60% of the time by healthcare providers which were not part of the printed form: last menstrual period (67.87%), expected date of confinement (64.84%), fundic height (63.04%), fetal heartbeat (60.76%), birthweight (62.59%), and age of gestation (60%). Based on the 16 criteria, the majority of referral forms (210) utilized 51-75% of the 16 criteria, 122 referral forms utilized 76-99%, 51 forms utilized 26-50%, and 1 form utilized less than 25% of the data entries. Several studies have documented that referral forms and functional referral systems are vital to improved maternal mortality rates (MMR) and infant mortality rates (IMR).
Therefore, as part of a continued quality referral system, it is highly recommended that the required referral forms be re-assessed, revised, and regularly monitored on its form compliance and utilization. | health systems and quality improvement |
10.1101/2022.04.03.22273355 | Clinical Variables Correlate with Serum Neutralizing Antibody Titers after COVID-19 mRNA Vaccination in an Adult, US-based Population | Background: We studied whether comorbid conditions impact strength and duration of immune responses after SARS-CoV-2 mRNA vaccination in a US-based, adult population.
Methods: Sera (pre- and post-BNT162b2 vaccination) were tested serially up to 12 months after two doses of vaccine for SARS-CoV-2 anti-Spike neutralizing capacity by pseudotyping assay in 124 individuals; neutralizing titers were correlated to clinical variables with multivariate regression. Post-booster (third dose) effect was measured at 1 and 3 months in 72 and 88 subjects, respectively.
Results: After completion of the primary vaccine series, neutralizing antibody IC50 values were high at one month (14-fold increase from pre-vaccination), declined at six months (3.3-fold increase), and increased at one month post-booster (41.5-fold increase). Three months post-booster, IC50 decreased in COVID-naive individuals (to an 18-fold increase over pre-vaccination) and increased in prior COVID-19+ individuals (to a 132-fold increase). Age >65 years (β=-0.94, p=0.001) and malignancy (β=-0.88, p=0.002) reduced strength of response at 1 month. Both strength and durability of response at 6 months, respectively, were negatively impacted by end-stage renal disease [(β=-1.10, p=0.004); (β=-0.66, p=0.014)], diabetes mellitus [(β=-0.57, p=0.032); (β=-0.44, p=0.028)], and systemic steroid use [(β=-0.066, p=0.032); (β=-0.55, p=0.037)]. Post-booster IC50 was robust against WA-1 and B.1.617.2, but the immune response decreased with malignancy (β=-0.68, p=0.03) and increased with prior COVID-19 (p < 0.0001).
Conclusion: Multiple clinical factors impact the strength and duration of the neutralization response after the primary vaccine series, but not the strength of the post-booster response. Prior COVID-19 infection enhances the booster-dose response except in individuals with malignancy, suggesting a need for clinically guided vaccine dosing regimens.
Summary: Multiple clinical factors impact the strength and duration of the neutralization response after the primary vaccine series. All subjects, irrespective of prior COVID-19 infection, benefited from a third dose. Malignancy decreased the response following the third dose, suggesting the importance of clinically guided vaccine regimens. | infectious diseases |
10.1101/2022.04.04.22273384 | A simple algorithm based on initial Ct values predicts the duration to SARS-CoV-2 negativity and allows more efficient test-to-release and return-to-work schedules | Especially during global pandemics but also in the context of epidemic waves, the capacity for diagnostic qRT-PCRs rapidly becomes a limiting factor. Furthermore, excessive testing incurs high costs and can result in an overstrained work force in diagnostics departments. Obviously, people aim to shorten their isolation periods, hospitals need to discharge convalescent people, and re-employ staff members after infection. The aim of the study was to optimize retesting regimens for test-to-release from isolation and return-to-work applications. For this purpose, we investigated the association between Ct values at the first diagnosis of SARS-CoV-2 infection and the period until test negativity was reached, or at least until the Ct value exceeded 30, which is considered to indicate the transition to a non-infectious state. We included results from the testing of respiratory material samples for the detection of SARS-CoV-2 RNA, tested from 01 March 2020 to 31 January 2022.
Lower initial Ct values were associated with longer periods of SARS-CoV-2 RNA positivity. Starting with Ct values of <20, 20-25, 25-30, 30-35, and >35, it took median intervals of 20 (interval: 14-25), 16 (interval: 10-21), 12 (interval: 7-16), 7 (interval: 5-14), and 5 (interval: 2-7) days, respectively, until the person tested negative. Accordingly, a Ct threshold of 30 was surpassed after 13 (interval: 8-19), 9 (interval: 6-14), 7 (interval: 6-11), 6 (interval: 4-10), and 3 (interval: 1-6) days, respectively, in individuals with the aforementioned starting Ct values. Furthermore, the time to negativity was longer for adults versus children, for the wild-type SARS-CoV-2 variant versus other variants of concern, and for patients who were treated in intensive care units.
Based on these data, we propose an adjusted retesting strategy according to the initial Ct value in order to optimize available PCR resources. | infectious diseases |
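The Ct-bin-to-median-duration mapping reported above lends itself to a simple lookup table. A minimal illustrative sketch follows (not the authors' code; the exact handling of boundary values at Ct 20, 25, 30, and 35 is an assumption, since the abstract reports the bins as <20, 20-25, 25-30, 30-35, and >35):

```python
# Illustrative lookup of median days from an initial SARS-CoV-2 qRT-PCR Ct
# value to (a) a negative test and (b) surpassing Ct 30, using the medians
# reported in the abstract. Boundary handling (strict upper bounds) is assumed.

def days_until(ct_initial: float) -> dict:
    """Return median days to a negative test and to Ct > 30 for a starting Ct."""
    # (exclusive upper Ct bound of bin, median days to negative, median days to Ct > 30)
    bins = [
        (20, 20, 13),           # Ct < 20
        (25, 16, 9),            # Ct 20-25
        (30, 12, 7),            # Ct 25-30
        (35, 7, 6),             # Ct 30-35
        (float("inf"), 5, 3),   # Ct > 35
    ]
    for upper, to_negative, to_ct30 in bins:
        if ct_initial < upper:
            return {"days_to_negative": to_negative, "days_to_ct_over_30": to_ct30}
```

For example, `days_until(18)` would suggest scheduling the first retest no earlier than about 13 days after diagnosis if the goal is Ct > 30, or about 20 days if a negative result is required.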
10.1101/2022.04.04.22273381 | Priorities setting in mental health research: a scoping review | Objective: Research processes are opening to stakeholders beyond the scientific community. We analyse user involvement in the definition of research priorities in the field of mental health. Mental disorders represent a significant disease burden at a global scale, and their identification and treatment involve caregivers, patients and related social groups such as family and friends. Therefore, it is an area conducive to the application of participatory methods in priority setting. We present a scoping review of participatory methods in mental health priority setting for the period 2010-2020 to shed light on their spread and characteristics, the types of groups involved and the link with the priorities identified.
Methods: First, we describe the eligibility criteria for the scoping review. We selected peer-reviewed documents published between 2010 and 2020 using MEDLINE/PubMed, PsycINFO, the Core Collection of the Web of Science and Scopus, applying controlled search terms. We initially identified 330 documents, from which we selected seventy-four after discarding studies that were not specifically addressing priority setting in mental disorders research. We noted and classified the interest groups participating in every study.
Results: Priority setting partnerships are becoming the most frequent participatory instruments for priority setting in mental health. We identify regional differences in the extent to which such methods are being applied. When research beneficiaries participate in priority setting, prioritised research focuses on therapy, standards, education and psychology of mental disorders. When participation is limited to scientists, therapy, diagnosis, methods and standards receive more attention. | psychiatry and clinical psychology |
10.1101/2022.04.03.22273352 | Thirteen-fold variation between states in clozapine prescriptions to United States Medicaid patients | Background: Clozapine was the first atypical antipsychotic for treating schizophrenia, with a long history of controversy over its usage. Guidelines currently recommend clozapine for patients diagnosed with refractory schizophrenia. However, this agent may be underutilized because of the costs associated with close monitoring of its adverse effects, particularly agranulocytosis. This is unfortunate because clozapine has demonstrated greater effectiveness compared with other antipsychotics. It is essential to examine clozapine usage to determine if it is being adequately utilized among United States (US) Medicaid patients.
Methods: Medicaid data, including the number of quarterly clozapine prescriptions and the number of Medicaid enrollees in each state from 2015-2019, were collected and used to evaluate clozapine use over time. Data analysis and figures were prepared with Excel and GraphPad Prism. Exploratory correlations were completed between prescriptions per enrollee and other factors.
Results: The number of prescriptions, corrected for the number of enrollees in Medicaid, was generally consistent over time. However, average prescriptions per quarter were markedly lower in 2017 compared with other years, decreasing by 44.4% from the 2016 average prescriptions per quarter. From 2015 to 2019, states from the upper Midwest and Northeast regions of the country had the highest average clozapine prescriptions per 10,000 Medicaid enrollees (ND: 190.0, SD: 176.6, CT: 166.2). States from the Southeast and Southwest had much lower average rates (NV: 17.9, KY: 19.3, MS: 19.7). There was an over ten-fold difference in clozapine prescriptions between states from 2015-2019 (2015 = 19.9-fold, 2016 = 11.4-fold, 2017 = 11.6-fold, 2018 = 13.3-fold, and 2019 = 13.0-fold). There was a moderate correlation (r(48) = 0.49, p < .05) between prescriptions per 10,000 enrollees and Medicaid spending per enrollee in each state in 2019. There was a small but significant correlation between prescriptions per enrollee and percent white (r(48) = 0.30, p < .05).
Conclusion: Clozapine is an important pharmacotherapy for refractory schizophrenia. Overall, clozapine use tends to be highest among the upper Midwest and Northeast states. Further research is ongoing to better understand the origins of the thirteen-fold regional disparities in clozapine use in 2019 and the state-level variation in Medicaid spending. | psychiatry and clinical psychology |
10.1101/2022.04.04.22273385 | Significant impacts of the COVID-19 pandemic on race/ethnic differences in USA mortality | The COVID-19 pandemic triggered declines in life expectancy at birth around the world. The United States of America (USA) was hit particularly hard among high income countries. Early data from the USA showed that these losses varied greatly by race/ethnicity in 2020, with Hispanic and Black Americans suffering much larger losses in life expectancy compared to white people. We add to this research by examining trends in lifespan inequality, average years of life lost, and the contribution of specific causes of death and ages to race/ethnic life expectancy disparities in the USA from 2010 to 2020. We find that life expectancy in 2020 fell more for Hispanic and Black males (4.5 years and 3.6 years, respectively) compared to white males (1.5 years). These drops nearly eliminated the previous life expectancy advantage for the Hispanic compared to white population, while dramatically increasing the already large gap in life expectancy between Black and white people. While the drops in life expectancy for the Hispanic population were largely attributable to official COVID-19 deaths, Black Americans additionally saw increases in cardiovascular disease and "deaths of despair" over this period. In 2020, lifespan inequality increased slightly for Hispanic and white populations, but decreased for Black people, reflecting the younger age pattern of COVID-19 deaths for Hispanic people. Overall, the mortality burden of the COVID-19 pandemic hit race/ethnic minorities particularly hard in the USA, underscoring the importance of the social determinants of health during a public health crisis.
Significance statement: Public interest in social and health inequalities is increasing. We examine the impact of COVID-19 on mortality in the USA across racial/ethnic groups and present four key findings. First, all groups suffered sizable life-expectancy losses and increases in years of life lost. Mortality from cardiovascular diseases, "deaths of despair", and COVID-19 explained most of these losses. Second, working-age mortality accounted for substantial life-expectancy losses, especially among Hispanic males. Third, lifespan inequality increased for Hispanic and white people, but decreased slightly for Black people. Fourth, the pandemic shifted racial/ethnic mortality differentials in favor of white people: narrowing the Hispanic advantage and widening the Black disadvantage. Our results provide a comprehensive assessment of mortality trends to inform policies targeting inequalities. | public and global health |
10.1101/2022.04.04.22273378 | Self-supervised neural network improves tri-exponential intravoxel incoherent motion model fitting in non-alcoholic fatty liver disease. | Neural network generated intravoxel incoherent motion (IVIM) parameter maps are visually more detailed, less noisy and have a higher test-retest repeatability than conventional least-squares (LSQ) generated images. Recent literature suggests that tri-exponential models may fit liver IVIM data more accurately than conventional bi-exponential models, warranting investigation into the use of a neural network (NN) to estimate tri-exponential IVIM parameters. Here, we developed a tri-exponential IVIM unsupervised physics-informed deep neural network (IVIM3-NET), assessed its performance in non-alcoholic fatty liver disease (NAFLD) and compared outcomes with bi-exponential LSQ and NN fits and tri-exponential LSQ fits. Scanning was performed using a 3.0T free-breathing multi-slice diffusion-weighted single-shot echo-planar imaging sequence with 18 b-values. Images were analysed for visual quality, comparing the bi- and tri-exponential IVIM models for LSQ fits and NN fits using parameter-map signal-to-noise ratios (SNR) and adjusted R2. IVIM parameters were compared to histological fibrosis, disease activity and steatosis grades. Parameter map quality improved with bi- and tri-exponential NN approaches, with average parameter-map SNR increasing from 3.38 to 5.59 and 2.45 to 4.01 for bi- and tri-exponential LSQ and NN models respectively. In 33 out of 36 patients, the tri-exponential model exhibited higher adjusted R2 values than the bi-exponential model. Correlating IVIM data to liver histology showed that the bi- and tri-exponential NN outperformed both LSQ models for the majority of IVIM parameters (10 out of 15 significant correlations). Overall, our results support the use of a tri-exponential IVIM model in NAFLD. 
We show that the IVIM3-NET can be used to improve image quality compared to a tri-exponential LSQ fit and provides promising correlations with histopathology. | radiology and imaging |
10.1101/2022.03.31.22273188 | Performance in Fundamentals of Laparoscopic Surgery (FLS): Does it Reflect Global Rating Scales in Objective Structured Assessment of Technical Skills (OSATS) in Porcine Laparoscopic Surgery? | Background: To correlate the utility of the Fundamentals of Laparoscopic Surgery (FLS) manual skills program with the Objective Structured Assessment of Technical Skills (OSATS) global rating scale in evaluating operative performance.
Materials and Methods: The Asian Urological Surgery Training and Educational Group (AUSTEG) Laparoscopic Upper Tract Surgery Course (LUTSC) implemented and validated the FLS program for use in laparoscopic surgical training. Delegates' basic laparoscopic skills were assessed using three different training models (Peg Transfer, Precision Cutting and Intra-corporeal Suturing). They also performed live porcine laparoscopic surgery at the same workshop. Live surgery skills were assessed by blinded faculty using the OSATS rating scale.
Results: From 2016 to 2019, a total of 81 certified urologists participated in the course, with a median of 5 years' experience post-residency. Although differences in task timings did not reach statistical significance, those with more surgical experience were visibly faster at completing the peg transfer and intra-corporeal suturing FLS tasks. However, they took longer to complete the precision cutting task than participants with less experience. Overall OSATS scores correlated weakly with all three FLS tasks (Peg Transfer Time: R = -0.331, R2 = 0.110; Precision Cutting Time: R = -0.240, R2 = 0.058; Suturing with Intra-corporeal Knot Time: R = -0.451, R2 = 0.203).
Conclusion: FLS task parameters did not correlate strongly with OSATS global rating scale performance. Although the FLS task models demonstrated strong validity, it is important to account for these inconsistencies when benchmarking technical proficiency against real-life operative competence, as evaluated by FLS and OSATS respectively. | medical education |
10.1101/2022.03.31.22273212 | Formative usability assessment of a new connector design for peritoneal dialysis | Background: This paper presents a first, formative study to explore the usability of a new peritoneal dialysis connector design intended for use by patients. The study was conducted with a user population of both naive users and experienced peritoneal dialysis patients across a range of ages. The goals of the study were to evaluate the usability of the key user interfaces of this connector design by test participants representative of new and experienced peritoneal dialysis patients, as well as to evaluate the use of the connector as it interacts with other components of the peritoneal dialysis system, including peritoneal dialysis fluid bags and tubing. Further objectives were to capture any usability issues and obtain participants' feedback on the design.
Methods: A total of 7 patient and non-patient participants received brief training and performed simulated connection and disconnection of peritoneal catheter extension sets for therapy with the new design.
Results: All 7 participants completed the simulated connection and disconnection tasks successfully, with only one use error (0.22%), 18 close calls (4.0%) and 6 use difficulties (1.3%) observed across the total of 449 use steps performed by all participants. Other findings include usability improvement with repeated use and participants' feedback and suggestions for the protective enclosure, a novel feature of the touchless connector design.
Conclusion: The studied connector design showed minimal use errors or difficulties and, based on participant feedback, its usability can be significantly improved with minimal modifications in future prototypes. | nephrology |
10.1101/2022.03.31.22273255 | Diagnostic Accuracy of Simple Postoperative AKI Risk (SPARK) Classification and General Surgery AKI (GS AKI) Index in Predicting Postoperative Acute Kidney Injury among Patients Undergoing Non-Cardiac Surgery at a Tertiary Hospital in the Philippines | Background: Postoperative AKI is a significant postoperative complication. Clinical risk prediction models are lacking for patients undergoing non-cardiac surgery. The SPARK Classification and the GS AKI Index are tools that have shown fair discriminative ability to predict postoperative AKI in non-cardiac surgery and have been externally validated in their original cohorts. No study has compared the diagnostic accuracy of the two tools.
Objectives: This study aims to compare the diagnostic accuracy of the SPARK Classification and the GS AKI Index in predicting postoperative AKI among patients undergoing non-cardiac surgery at a tertiary hospital in the Philippines.
Methods: This is a cross-sectional study including adult patients who underwent non-cardiac surgeries from January 2019 to July 2021. The individual risk of postoperative AKI for both models was determined. Descriptive data were analyzed using t-tests and logistic regression. Measures of accuracy were described using sensitivity, specificity, positive and negative predictive values, positive and negative likelihood ratios, and discriminative ability using the concordance (c) statistic.
Results: Of the 340 patients in this study, 77 (22.65%) developed postoperative AKI and 24 (7.06%) developed critical AKI. Based on demographic data, older age, pre-existing renal disease, longer duration of surgery, anemia, hypoalbuminemia, and hyponatremia were associated with a higher incidence of postoperative AKI. SPARK had a sensitivity ranging from 17% to 43% and a specificity ranging from 58% to 93% for Classes B to C. GS AKI had a sensitivity ranging from 10% to 26% and a specificity ranging from 61% to 97% for Classes I to V. SPARK had a discriminative power (c statistic) ranging from 0.46 to 0.61, while GS AKI had a discriminative power ranging from 0.41 to 0.54.
Conclusion: Based on this study, there is an association between higher risk classification in both SPARK and GS AKI and postoperative AKI. However, both clinical prediction models demonstrate poor discriminative power to predict postoperative AKI. | nephrology |
10.1101/2022.04.04.22273380 | Neuromodulation through Brain Stimulation-assisted Cognitive Training in Patients with Post-Chemotherapy Cognitive Impairment (Neuromod-PCCI): Study Protocol of a Randomized Controlled Trial | Introduction: Breast cancer is the most common form of cancer in women. A considerable number of women with breast cancer who have been treated with chemotherapy subsequently develop neurological symptoms such as concentration and memory difficulties (also known as "chemobrain"). Currently, there are no validated therapeutic approaches available to treat these symptoms. Cognitive training holds the potential to counteract cognitive impairment. Combining cognitive training with concurrent transcranial direct current stimulation (tDCS) could enhance and maintain the effects of this training, potentially providing a new approach to treat post-chemotherapy cognitive impairment (PCCI). With this study, we aim to investigate the effects of multi-session tDCS over the left dorsolateral prefrontal cortex in combination with cognitive training on cognition and quality of life in women with PCCI.
Methods and analysisThe Neuromod-PCCI trial is a monocentric, randomized, double-blind, placebo-controlled study. Fifty-two women with PCCI after breast cancer therapy will receive a 3-week tDCS-assisted cognitive training with anodal tDCS over the left dorsolateral prefrontal cortex (target intervention), compared to cognitive training plus sham tDCS (control intervention). Cognitive training will consist of a letter updating task. Primary outcome will be the performance in an untrained task (n-back task) after training. In addition, feasibility, safety and tolerability, as well as quality of life and performance in additional untrained tasks will be investigated. A follow-up visit will be performed one month after intervention to assess possible long-term effects. In an exploratory approach, structural and functional magnetic resonance imaging (MRI) will be acquired before the intervention and at post-intervention to identify possible neural predictors for successful intervention.
Ethics and disseminationEthical approval was granted by the ethics committee of the University Medicine Greifswald (BB236/20). Results will be available through publications in peer-reviewed journals and presentations at national and international conferences.
Trial registrationThis trial was prospectively registered at ClinicalTrials.gov (Identifier: NCT04817566, registered March 26, 2021).
Strength and limitations of this study- This is the first randomized controlled trial to investigate the feasibility and effects of combined cognitive training and tDCS on cognitive outcomes and quality of life in patients with post-chemotherapy cognitive impairment
- Results will help the development of treatment options for breast cancer patients with post-chemotherapy cognitive impairment
- Results may not be generalizable to male cancer patients
- Monocentric trial design may increase risk of bias | neurology |
10.1101/2022.04.03.22273375 | Retinal aging transcriptome and cellular landscape in association with the progression of age-related macular degeneration | Age is the main risk factor for age-related macular degeneration (AMD), a leading cause of blindness in the elderly, with limited therapeutic options. Here we systematically analyzed the transcriptomic characteristics and cellular landscape of the aging retina from controls and patients with AMD. We identify the aging genes in the retina that are associated with innate immune response and inflammation. Deconvolution analysis reveals that the estimated proportions of M2 and M0 macrophages are increased and decreased, respectively, with both age and AMD severity. Moreover, we find that Müller glia are increased with age but not with disease severity. Several genes associated with both age and disease severity in AMD, particularly C1s and MR1, are strongly positively correlated with the proportions of Müller glia. Our studies expand the genetic and cellular landscape of AMD and provide avenues for further studies on the relationship between age and AMD. | genetic and genomic medicine
10.1101/2022.04.01.22273328 | WikiProject Clinical Trials for Wikidata | WikiProject Clinical Trials is a Wikidata community project to integrate clinical trials metadata with the Wikipedia ecosystem. Using Wikidata methods for data modeling, import, querying, curating, and profiling, the project brought ClinicalTrials.gov records into Wikidata and enriched them. The motivation for the project was gaining the benefits of hosting in Wikidata, which include distribution to new audiences and staging the content for the Wikimedia editor community to develop it further. Project pages present options for engaging with the content in the Wikidata environment. Example applications include generation of web-based profiles of clinical trials by medical condition, research intervention, research site, principal investigator, and funder.
The project's curation workflows, including entity disambiguation and language translation, could be expanded when there is a need to make subsets of clinical trial information more accessible to a given community. This project's methods could be adapted for other clinical trial registries, or as a model for using Wikidata to enrich other metadata collections. | health informatics
10.1101/2022.03.31.22272986 | Frequent Cocaine Use is Associated with Larger HIV Latent Reservoir Size | BackgroundWith the success of combination antiretroviral therapy, HIV is now treated as a chronic disease, including among drug users. Cocaine, one of the most frequently abused illicit drugs among persons living with HIV (PLWH), slows the decline of viral production after ART, and is associated with higher HIV viral load, more rapid HIV progression, and increased mortality. We examined the impact of cocaine use on the CD4+ T-cell HIV Latent Reservoir (HLR) in virally suppressed PLWH.
MethodsCD4+ T-cell genomic DNA was isolated from peripheral blood mononuclear cells collected from 434 women of diverse ancestry (i.e., 75% Black, 14% Hispanic, 12% White) who self-reported cocaine use (i.e., 160 cocaine users, 59 prior users, 215 non-users). Participants had to have an undetectable HIV RNA viral load measured by commercial assay for at least 6 months. The Intact Proviral HIV DNA Assay (IPDA) provided estimates of intact provirus per 106 CD4+ T-cells.
ResultsThe HLR size differed by cocaine use (i.e., median [interquartile range]: 72 [14, 193] for never users, 165 [63, 387] for prior users, 184 [28, 502] for current users) and was statistically significantly larger in both prior (p=0.023) and current (p=0.001) cocaine users compared with never users.
ConclusionOur study is the first to provide evidence that cocaine use may contribute to a larger replication-competent HLR in CD4+ T-cells among virologically suppressed women living with HIV. Our findings are important, because women are under-represented in HIV reservoir studies and in studies of the impact of cocaine use on outcomes among PLWH. | hiv aids
10.1101/2022.04.03.22273290 | The mobilome associated with Gram-negative bloodstream infections: A large-scale observational hybrid sequencing based study | Plasmids carry genes conferring antimicrobial resistance (AMR), and other clinically important traits; their ability to move within and between species may provide the machinery for rapid dissemination of such genes. Existing studies using complete plasmid assemblies, which are essential for reliable inference, have been small and/or limited to those carrying particularly antimicrobial resistance genes (ARGs). In this study, we sequenced 1,880 complete plasmids from 738 isolates from bloodstream infections (BSI) in 2009 (194 isolates) and 2018 (368 isolates) in Oxfordshire, UK, plus a stratified selection from intervening years (176 isolates). We demonstrate that plasmids are largely, but not entirely, constrained to host species, although there is substantial overlap between species of plasmid gene-repertoire. Most ARGs are carried by a relatively small number of plasmid groups with biological features that are predictable. Plasmids carrying ARGs (including those encoding carbapenemases) share a putative backbone of core genes with those carrying no such genes. These findings suggest that future surveillance should, in addition to tracking plasmids currently associated with clinically important genes, focus on identifying and monitoring the dissemination of high-risk plasmid groups with the potential to rapidly acquire and disseminate these genes. | infectious diseases |
10.1101/2022.03.31.22273219 | A mixed cross-sectional and case-control study approach to investigate the risk-factors of Noma/Cancrum oris in Ethiopia | IntroductionNoma is a polymicrobial gangrenous facial disease affecting people living in the most impoverished areas of low- and middle-income countries. If left untreated, the disease is fatal or severely disfigures those affected. A compromised immune system, poor oral hygiene, measles infection, diarrheal disease, inaccessibility to health education and proper medical care, and lack of a balanced diet and good sanitary facilities are among the predisposing factors for the development and progression of the disease. Furthermore, debilitating diseases like malaria and measles were considered significant precursors to Noma.
Materials and MethodsA mix of cross-sectional and case-control study approaches was used to assess the risk factors of Noma in Ethiopia. The raw data of the cases were obtained from Yekatik 12 Hospital, Facing Africa, and the Harar project Ethiopia. Three controls were selected per single case. Odds ratios (ORs) and Chi-square tests were calculated to assess the statistical significance of the associations observed between the factors and the disease.
ResultsA total of 64 cases were selected for the case-control study. Considering the 1:3 case-to-control ratio, 192 matching controls were identified. Malaria, helminths, measles, diarrheal diseases, and living with domestic animals were found to be risk factors for Noma, with respective p-values < 0.01. Contrarily, the analysis identified vaccination (p < 0.01) as a protective factor.
DiscussionNoma/face of poverty is mostly preventable by providing proper nutrition, sanitary and water facilities, awareness about the disease, oral health education, and vaccinations. Poverty-related diseases such as malaria, helminths infection, measles, diarrheal diseases, and unfavorable living conditions were identified as risk factors for Noma. As such the disease is truly preventable. Prevention of the disease can be achieved through promoting overall awareness of the disease, poverty reduction, improved nutrition, and promotion of exclusive breastfeeding in the first 3-6 months of life. Furthermore, optimum prenatal care, timely immunizations against common childhood diseases, initiating vaccination, and improving the social living conditions are the other preventive mechanisms. Moreover, long-lasting economic development should be considered to effectively and sustainably prevent the disease. | epidemiology
10.1101/2022.04.04.22273372 | The trustworthiness and impact of trial preprints for COVID-19 decision-making: A methodological study | PurposeTo assess the trustworthiness and impact of preprint trial reports during the COVID-19 pandemic.
Data sourcesWHO COVID-19 database and the L-OVE COVID-19 platform by the Epistemonikos Foundation (up to August 3rd, 2021)
DesignWe compare the characteristics of COVID-19 trials with and without preprints, estimate time to publication of COVID-19 preprint reports, describe discrepancies in key methods and results between preprint and published trial reports, report the number of retracted preprints and publications, and assess whether including versus excluding preprint reports affects meta-analytic estimates and the certainty of evidence. For the effects of eight therapies on mortality and mechanical ventilation, we performed meta-analyses including preprints and excluding preprints at 1 month, 3 months, and 6 months after the first trial addressing the therapy became available either as a preprint or publication (120 meta-analyses in total).
ResultsWe included 356 trials, 101 of which are only available as preprints, 181 as journal publications, and 74 as preprints first and subsequently published in journals. Half of all preprints remain unpublished at six months and a third at one year. There were few important differences in key methods and results between trial preprints and their subsequent published reports. We identified four retracted trials, three of which were published in peer-reviewed journals. With two exceptions (2/60; 3.3%), point estimates were consistent between meta-analyses including versus excluding preprints as to whether they indicated benefit, no appreciable effect, or harm. There were nine comparisons (9/60; 15%) for which the rating of the certainty of evidence differed when preprints were included versus excluded; for four of these comparisons the certainty of evidence including preprints was higher, and for five it was lower.
LimitationsThe generalizability of our results is limited to COVID-19. Preprints that are subsequently published in journals may be the most rigorous and may not represent all trial preprints.
ConclusionWe found no compelling evidence that preprints provide less trustworthy results than published papers. We show that preprints remain the only source of findings of many trials for several months, a length of time that is unacceptable in a health emergency. We show that including preprints may affect the results of meta-analyses and the certainty of evidence. We encourage evidence users to consider data from preprints in contexts in which decisions are being made rapidly and evidence is being produced faster than can be peer-reviewed.
Summary Box 1

What is already known on this topic
- Clinicians and decision-makers need rapidly available and credible data addressing the comparative effectiveness of treatments and prophylaxis for COVID-19.
- Investigators have adopted preprint servers, which allow the rapid dissemination of research findings before publication in peer-reviewed journals.

What this study adds
- We found no compelling evidence that preprints provide less trustworthy results than published papers.
- We show that including preprints may affect the results of meta-analyses and the certainty of evidence, and we encourage evidence users to consider data from preprints in contexts in which decisions are being made rapidly and evidence is being produced faster than can be peer-reviewed. | epidemiology
10.1101/2022.03.27.22273016 | Haemolysis detection in microRNA-seq from clinical plasma samples | The abundance of cell-free microRNA (miRNA) has been measured in many body fluids, including blood plasma, which has been proposed as a source with novel, minimally invasive biomarker potential for several diseases. Despite improvements in quantification methods for plasma miRNAs, there is no consensus on optimal reference miRNAs or to what extent haemolysis may affect plasma miRNA content. Here we propose a new method for the detection of haemolysis in miRNA high-throughput sequencing (HTS) data from libraries prepared using human plasma. To establish a miRNA haemolysis signature in plasma, we first identified differentially expressed miRNAs between samples with known haemolysis status and selected miRNAs with statistically significantly higher abundance in our haemolysed group. Given there may be both technical and biological reasons for differential abundance of signature miRNAs, and to ensure the method developed here was relevant outside of our specific context (women of reproductive age), we tested for significant differences between pregnant and non-pregnant groups. Here we report a novel 20-miRNA signature (miR-106b-3p, miR-140-3p, miR-142-5p, miR-532-5p, miR-17-5p, miR-19b-3p, miR-30c-5p, miR-324-5p, miR-192-5p, miR-660-5p, miR-186-5p, miR-425-5p, miR-25-3p, miR-363-3p, miR-183-5p, miR-451a, miR-182-5p, miR-191-5p, miR-194-5p, miR-20b-5p) that can be used to identify the presence of haemolysis, in silico, in high-throughput miRNA sequencing data. Given the potential for haemolysis contamination, we recommend that assays for haemolysis detection become standard pre-analytical practice and provide here a simple method for haemolysis detection. | genetic and genomic medicine
10.1101/2022.03.28.22272969 | Using Geographically Weighted Linear Regression for County-Level Breast Cancer Modeling in the United States | Due to the continued disparities in breast cancer, improved models are needed to inform policy addressing existing social disparities related to cancer. First, ordinary least squares regression was used to determine the relationship of sociodemographic measures (i.e., poverty rate and social inequity) to breast cancer incidence in the United States. The Gini coefficient was used as a measure of income inequality. Next, Geographically Weighted Regression (GWR), a local spatial model, was used to explore the impact location has on the relationship between sociodemographic measures and breast cancer. Mappings of the results are presented, which can assist policymakers to address inequities and social determinants when funding cancer interventions. The GWR model is then compared to linear regression models that do not take location into consideration, highlighting the benefits of spatial models in cancer policy research. More studies applying spatial regression techniques are needed in order to accurately inform policy. | public and global health
10.1101/2022.03.28.22272122 | Identification of novel microcephaly-linked protein ABBA that mediates cortical progenitor cell division and corticogenesis through NEDD9-RhoA | The cerebral cortex is responsible for higher cognitive functions. Its correct development depends on coordinated asymmetric division cycles by polarized radial glial progenitor cells. Mitotic defects in neuronal stem cells are linked to the aetiology of microcephaly, but the underlying mechanisms remain incompletely understood. Here we describe a novel role of the membrane deforming cytoskeletal regulator protein Abba (MTSS1L/MTSS2) in cortical development. Loss of Abba in the developing brain resulted in radial glia proliferation arrest, disorganized radial fibers and abnormal migration of neuronal progenitors. During cell division, Abba localized to the cleavage furrow where it recruited the scaffolding protein Nedd9 and positively regulated RhoA activity. Importantly, we identified a human variant Abba (R671W) in a patient with microcephaly and intellectual disability. Ectopic expression of this mutant protein in mice phenocopied Abba knockdown. Taken together, these data provide novel mechanistic insight into human microcephaly and cortical development by identifying ABBA as a novel regulator of faithful mitotic progression of neuronal progenitor cells. | neurology |
10.1101/2022.03.28.22272848 | Biochemical, Biophysical, and Immunological Characterization of Respiratory Secretions in Severe SARS-CoV-2 (COVID-19) Infections | Thick, viscous respiratory secretions are a major pathogenic feature of COVID-19 disease, but the composition and physical properties of these secretions are poorly understood. We characterized the composition and rheological properties (i.e. resistance to flow) of respiratory secretions collected from intubated COVID-19 patients. We find the percent solids and protein content are greatly elevated in COVID-19 compared to healthy control samples and closely resemble levels seen in cystic fibrosis, a genetic disease known for thick, tenacious respiratory secretions. DNA and hyaluronan (HA) are major components of respiratory secretions in COVID-19 and are likewise abundant in cadaveric lung tissues from these patients. COVID-19 secretions exhibit heterogeneous rheological behaviors with thicker samples showing increased sensitivity to DNase and hyaluronidase treatment. In histologic sections from these same patients, we observe increased accumulation of HA and the hyaladherin versican but reduced tumor necrosis factor-stimulated gene-6 (TSG6) staining, consistent with the inflammatory nature of these secretions. Finally, we observed diminished type I interferon and enhanced inflammatory cytokines in these secretions. Overall, our studies indicate that increases in HA and DNA in COVID-19 respiratory secretion samples correlate with enhanced inflammatory burden and suggest that DNA and HA may be viable therapeutic targets in COVID-19 infection. | infectious diseases
10.1101/2022.03.28.22272907 | Identification and Genomic Assessment of Daptomycin-, Linezolid-, Vancomycin-resistant Enterococcus faecium During Protracted Infection | Linezolid and daptomycin resistance among Enterococcus faecium (Efm) isolates, while rare, is a major challenge for clinicians who are often limited to broad-spectrum or combination antibiotic therapies for management. Combination therapy with a beta-lactam has been reported, but limited clinical evidence exists to support its use. We describe the clinical management of a prolonged Efm intraabdominal (IA) infection and subsequent bacteremia, along with observed multidrug resistance development and use of serial whole genome sequencing to better understand resistance mechanisms. Combination antimicrobial therapy with daptomycin (DAP) and ceftaroline (CPT) was used to treat the patient's catheter-associated daptomycin-, linezolid-, vancomycin-resistant Efm (DLVRE) bloodstream infection. In vitro antimicrobial testing of this DLVRE revealed only minor synergy between daptomycin and ceftaroline; however, the patient's bacteremia cleared following initiation of combination therapy in conjunction with catheter removal. Sequencing of the patient's DLVRE revealed multiple genomic mutations which explain both linezolid and daptomycin resistance and the presence of a plasmid containing known resistance determinants for vancomycin. Daptomycin resistance was attributed to the presence of chromosomal mutations in liaS (Thr120Ala), liaR (Trp73Cys), and cls (Asp13Ile), while linezolid resistance was attributed to the presence of the G2576T variant allele in some of the 23S rRNA gene copies. Sequential whole genome sequencing of two additional bacterial isolates from the same patient revealed protracted colonization with a single DLVRE clone and suggested the development of bacterial subpopulations.
Pairing clinical isolate susceptibilities with whole genome sequencing should be encouraged in clinical practice to better inform antimicrobial management in cases of multidrug resistance. | infectious diseases |
10.1101/2022.03.28.22272975 | Vaccine-induced immune thrombotic thrombocytopenia (VITT) is mediated by a stereotyped clonotypic antibody | Vaccine-induced immune thrombotic thrombocytopenia (VITT) is a rare thromboembolic complication of adenoviral-vectored SARS-CoV-2 vaccines, mediated by antibodies directed against platelet factor 4 (PF4). Given their causal role in VITT, identification of the molecular composition of anti-PF4 antibodies is crucial for developing better diagnostics and treatments. Here, we utilised a novel proteomic workflow to analyse the immunoglobulin variable (IgV) region composition of anti-PF4 antibodies at the level of the secreted proteome. Serum anti-PF4 IgG antibodies from five patients with VITT triggered by ChAdOx1 nCoV-19 vaccination were affinity purified by PF4-coupled magnetic beads and sequenced by mass spectrometry. We revealed a single IgG heavy (H)-chain species paired with a single lambda light (L)-chain species in all five unrelated patients. Remarkably, all L-chains were encoded by the identical IGLV3-21*02 gene subfamily with identical L-chain third complementarity determining region (LCDR3) lengths. Moreover, striking stereotypic features were also identified in the heavy chains of anti-PF4 antibodies, characterised by identical HCDR3 lengths and homologous sequences. In summary, we unravelled the molecular signature of highly stereotyped clonotypic anti-PF4 antibodies, indicating shared pathways of antibody production in VITT patients. These discoveries are critical to understand the molecular basis of this serious condition and develop novel therapies aimed at removing pathogenic clones.
KEY POINTS
- Anti-PF4 antibodies in VITT comprise a highly stereotyped clonotype
- A single IGLV3-21*02-encoded light chain is found in unrelated patients | allergy and immunology
10.1101/2022.03.29.22272997 | Glasses and risk of COVID-19 transmission - analysis of the Virus Watch Community Cohort study. | BackgroundRespiratory viruses, including SARS-CoV-2, can infect the eyes or pass into the nose via the nasolacrimal duct. The importance of transmission via the eyes is unknown but might plausibly be reduced in those who wear glasses. Previous studies have mainly focussed on protective eyewear in healthcare settings.
MethodsParticipants from the Virus Watch prospective community cohort study in England and Wales responded to a questionnaire on the use of glasses and contact lenses. This included frequency of use, purpose, and likelihood of wearing a mask with glasses. Infection was confirmed through data linkage with Second Generation Surveillance System (Pillar 1 and Pillar 2), weekly questionnaires to self-report positive polymerase chain reaction or lateral flow results, and, for a subgroup, monthly capillary blood testing for antibodies (nucleocapsid and spike). A multivariable logistic regression model, controlling for age, sex, income and occupation, was used to identify odds of infection depending on the frequency and purpose of using glasses or contact lenses.
Findings19,166 Virus Watch participants responded to the questionnaire, with 13,681 (71.3%, CI 70.7-72.0) reporting they wore glasses. A multivariable logistic regression model showed a 15% lower odds of infection for those who reported using glasses always for general use (OR 0.85, 95% CI 0.77-0.95, p = 0.002) compared to those who never wore glasses. The protective effect was reduced in those who said that wearing glasses interfered with mask wearing. No protective effect was seen for contact lens wearers.
InterpretationPeople who wear glasses have a moderate reduction in risk of COVID-19 infection highlighting the importance of the eye as a route of infection. Eye protection may make a valuable contribution to the reduction of transmission in community and healthcare settings.
FundingThe research costs for the study have been supported by the MRC Grant Ref: MC_PC 19070 awarded to UCL on 30 March 2020 and MRC Grant Ref: MR/V028375/1 awarded on 17 August 2020. The study also received $15,000 of Facebook advertising credit to support a pilot social media recruitment campaign on 18th August 2020. The study also received funding from the UK Government Department of Health and Social Care's Vaccine Evaluation Programme to provide monthly Thriva antibody tests to adult participants. This study was supported by the Wellcome Trust through a Wellcome Clinical Research Career Development Fellowship to RA [206602]. Funding from the HSE Protect study, GOSH Children's Charity and the Great Ormond Street Hospital BRC supported the involvement of CO in the project.
Research in contextEvidence before the studyDespite the risk of SARS-CoV-2 transmission via the eyes, very few countries have advocated eye protection to reduce transmission amongst the public; except when providing close care for those known or suspected to be infected, its use is variable and based on case-by-case assessment of exposure risk. The mechanism, but not the extent, of the transmission route through the eyes is well described in the literature, with several studies reporting detection of SARS-CoV-2 RNA in the tear film, conjunctiva and conjunctival sac. There have been a small number of hospital-based observational studies suggesting that eye protection may help prevent COVID-19 infection. A literature search was carried out on 23rd February 2022 across Medline and Embase using the search terms "eyewear", "glasses", "SARS-CoV-2", "COVID-19", "SARS", "transmission" and "infectivity", providing 105 manuscripts. Of these, only eight investigated the risk of infection associated with eye protection, all in hospital settings or among cohorts of healthcare workers. Among the studies was a systematic review that identified 5 observational studies from 898 articles that were screened. The cohort study with the largest sample size, 345 healthcare professionals, demonstrated a relative risk of 10.25 (95% CI 1.28-82.39; P = 0.009) for infection when not using eye protection. No studies of the potential protective effect of glasses wearing, for visual correction, in community settings were identified.
Added value of this studyThe Virus Watch study is a prospective community household study across England and Wales. 19,166 participants responded to the monthly questionnaire on glasses and contact lens use, assessing reported frequency, the purpose of use and how likely they were to wear a mask with glasses. Infections were identified in data linked to the Second Generation Surveillance System (Pillar 1 and Pillar 2 testing), weekly surveys seeking self-reports of polymerase chain reaction or lateral flow device results and, in a subset of 11,701, self-collected capillary blood testing for antibodies (nucleocapsid and spike; nucleocapsid antibodies were taken as evidence of prior infection as these are unaffected by vaccination). Our multivariable logistic regression model, controlling for age, sex, household income and occupation, demonstrated 15% lower odds of infection for those who reported always using glasses for general use compared to those who never wear glasses. The protective effect was not observed in those who strongly agreed with the statement "I am less likely to wear a face covering when I have my glasses on because my glasses steam up". Counterfactual analysis of contact lenses did not suggest a protective effect regardless of frequency of use.
Implications of all the available evidenceThe findings of this study demonstrate a moderate reduction in risk of SARS-CoV-2 infection in those who always wear glasses compared to those who never do. Unlike other studies, our results are representative of a community setting, adjust for potential confounders and provide a counterfactual analysis with contact lenses. This extends the current evidence to community settings and validates proposed biological mechanisms by which eye protection may reduce the risk of SARS-CoV-2 transmission. | epidemiology |
10.1101/2022.03.29.22273042 | The new normal? Dynamics and scale of the SARS-CoV-2 variant Omicron epidemic in England | The SARS-CoV-2 pandemic has been characterised by the regular emergence of genomic variants which have led to substantial changes in the epidemiology of the virus. With natural and vaccine-induced population immunity at high levels, evolutionary pressure favours variants better able to evade SARS-CoV-2 neutralising antibodies. The Omicron variant was first detected in late November 2021 and exhibited a high degree of immune evasion, leading to increased infection rates in many countries. However, estimates of the magnitude of the Omicron wave have relied mainly on routine testing data, which are prone to several biases. Here we infer the dynamics of the Omicron wave in England using PCR testing and genomic sequencing obtained by the REal-time Assessment of Community Transmission-1 (REACT-1) study, a series of cross-sectional surveys testing random samples of the population of England. We estimate an initial peak in national Omicron prevalence of 6.89% (5.34%, 10.61%) during January 2022, followed by a resurgence in SARS-CoV-2 infections in England during February-March 2022 as the more transmissible Omicron sub-lineage, BA.2 replaced BA.1 and BA.1.1. Assuming the emergence of further distinct genomic variants, intermittent epidemics of similar magnitude as the Omicron wave may become the new normal. | infectious diseases |
10.1101/2022.03.30.22273135 | Effect of xenon and argon inhalation on erythropoiesis and steroidogenesis: a systematic review | BackgroundXenon and argon inhalation were included in the WADA Prohibited List in 2014 due to their reported positive effects on erythropoiesis. There have also been suggestions that these substances stimulate steroidogenesis. Currently, Xenon is on the WADA Prohibited List as a hypoxia-inducible factor (HIF) activating agent notably affecting erythropoiesis. Thus, a systematic review of the studies supporting these notions is of great interest.
MethodsA thorough search was conducted for articles on the effects of Xenon and argon inhalation on erythropoiesis and steroidogenesis, as well as their negative effects on human health and methods of their detection in body fluids. The Pubmed and Google Scholar databases were searched, as well as the special research section of the WADA website. The search was conducted in accordance with PRISMA guidelines. All articles in English published between 2000 and 2021 were analyzed, as well as reference studies meeting the search criteria.
ResultsOnly two publications in healthy human subjects evaluating the effects of Xenon inhalation on erythropoiesis were found with no conclusive evidence of a positive effect on erythropoiesis. Both articles were published after 2014 when the gases were included on the WADA Prohibited List. Three more studies were conducted in animal models and one further study was conducted with human patients who had undergone Xenon anesthesia. There were no studies on the effect of argon inhalation on erythropoiesis. No studies were found on the effect of Xenon or argon inhalation on steroidogenesis in healthy subjects. No studies related to the effects of Xenon or argon inhalation on erythropoiesis and steroidogenesis were found on the WADA website.
ConclusionThere is still no conclusive evidence that Xenon and argon inhalation affect erythropoiesis or steroidogenesis, or that they have positive effects on health. Further research is needed to establish the effects of these gases. Additionally, improved communication between the anti-doping authorities and all key stakeholders is required to support the inclusion of various substances on the Prohibited List. | sports medicine |
10.1101/2022.03.29.22272996 | A non-muscle myosin heavy chain 9 genetic variant is associated with graft failure following kidney transplantation. | BackgroundDespite current matching efforts to identify optimal donor-recipient pairs in kidney transplantation, alloimmunity remains a major driver of late transplant failure. While kidney allocation based on human leukocyte antigen (HLA) matching has markedly prolonged short-term graft survival, new data suggest that additional genetic parameters in donor-recipient matching could help improve long-term outcomes. Here, we studied the impact of a recently discovered non-muscle myosin heavy chain 9 gene (MYH9) polymorphism on kidney allograft failure.
MethodsWe conducted a prospective observational cohort study, analyzing the DNA of 1,271 kidney donor-recipient transplant pairs from a single academic hospital for the MYH9 rs11089788 C>A polymorphism. The association of the MYH9 genotype with the risk of graft failure (primary outcome), biopsy-proven acute rejection (BPAR), and delayed graft function (DGF) (secondary outcomes) were determined.
ResultsThe MYH9 polymorphism in the donor was not associated with 15-year death-censored kidney graft survival, whereas a trend was seen for the association between the MYH9 polymorphism in the recipient and graft failure (recessive model, P=0.056). Having the AA-genotype of the MYH9 polymorphism in recipients was associated with a higher risk of DGF (P=0.031) and BPAR (P=0.021), although the significance was lost after adjustment for potential confounders (P=0.15 and P=0.10, respectively). The combined presence of the MYH9 polymorphism in donor-recipient pairs was significantly associated with long-term kidney allograft survival (P=0.036), in which recipients with an AA-genotype receiving a graft with an AA-genotype had the worst outcome. After adjustment for covariates, this combined genotype remained significantly associated with 15-year death-censored kidney graft survival (HR 1.68, 95%-CI: 1.05 - 2.70, P=0.031).
ConclusionsOur results reveal that recipients with an AA-genotype MYH9 polymorphism receiving a donor kidney with an AA-genotype, have a significantly elevated risk of graft failure after kidney transplantation.
Key points:
- In recipients, the MYH9 SNP was associated with delayed graft function and biopsy-proven acute rejection after kidney transplantation, although the significance was lost in multivariable analysis.
- Presence of the MYH9 variant in both the donor and recipient was significantly associated with long-term kidney allograft survival in multivariable analysis.
- Our present findings suggest that matching donor-recipient transplant pairs based on the MYH9 polymorphism may attenuate the risk of graft loss. | transplantation |
10.1101/2022.03.29.22273035 | Estimating the impact of the wMel release program on dengue and chikungunya incidence in Rio de Janeiro, Brazil | BackgroundIntrogression of the insect bacterium Wolbachia into Aedes aegypti mosquito populations has been shown in randomised and non-randomised trials to reduce the incidence of dengue in treated communities; however, evidence for the real-world effectiveness of large-scale Wolbachia mosquito deployments for arboviral disease control in endemic settings is still limited, and no effectiveness studies have been conducted for chikungunya virus. A large Wolbachia (wMel strain) program was implemented in 2017 in Rio de Janeiro, Brazil. Here we assess the impact of the release program on dengue and chikungunya incidence.
Methods and findingsThe program released 67 million wMel-infected mosquitoes across 28,489 release locations over an 86.8 km2 area in Rio de Janeiro between August 2017 and the end of 2019. Following releases, mosquitoes were trapped and the presence of wMel determined. To assess the impact of the release program on dengue and chikungunya incidence, we used spatiotemporally explicit models applied to geocoded dengue (N=194,330) and chikungunya cases (N=58,364) from 2010 (2016 for chikungunya) to 2019 from across the city. On average, 32% of mosquitoes collected from the release zones between 1 and 29 months after releases were positive for wMel. Reduced wMel introgression occurred in locations and seasonal periods when dengue and chikungunya cases were historically high. Despite incomplete introgression, we found that the releases were associated with a 38% (95%CI: 32-44%) reduction in dengue incidence and a 10% (95%CI: 4-16%) reduction in chikungunya incidence.
ConclusionsStable establishment of wMel in this diverse, urban setting appears more complicated than has been observed elsewhere. However, even intermediate levels of wMel appear to reduce the incidence of two different arboviruses. | infectious diseases |
10.1101/2022.03.29.22273047 | Cerebrovascular risk factors impact brain phenotypes and cognitive function in healthy population | Cognitive decline is a major characteristic of ageing. Studies show that cardiovascular risk factors (CVR) are associated with cognitive decline and brain phenotypes, but the causality between CVR and cognitive function needs further understanding. In this study, we seek to investigate the causalities between CVR, brain phenotypes and cognitive function. We first generate a general factor (gCVR) representing common CVR and a score representing the polygenic risk (PRS). We then identify phenotypes of brain and cognitive functions associated with gCVR and PRS. Moreover, we conduct causal mediation analysis to evaluate the indirect effect of PRS through CVR, which infers the causality of gCVR on brain phenotypes and cognition. Further, we test the mediation effect of gCVR on the total effect of brain phenotypes on cognitive function. Finally, the causality between CVR and brain phenotypes is cross-validated using Mendelian randomization (MR) with genetic instruments. The results show that CVR mediates the effect of PRS on brain phenotypes and cognitive function, and CVR also mediates the effect of brain phenotypes on cognitive changes. Additionally, we validate that the variation in a few brain phenotypes, e.g., volume of grey matter, is caused by CVR. | epidemiology |
10.1101/2022.03.28.22272733 | Deep learning applied to 4-electrode EEG resting-state data detects depression in an untrained external population | In this study we trained and tested several deep learning algorithms to classify depressive individuals and controls based on their electroencephalography data. Traditionally, classification methods based on resting-state electroencephalography rely primarily on linear features or a combination of linear and non-linear features. Based on different theoretical grounds, some authors claim that the more electrodes, the more accurate the classifiers, while others consider that working on a selection of electrodes is a better approach. In this study, a data-driven approach was initially applied on a selection of electrodes to classify 25 depressive and 24 control participants. Using a classifier with just four electrodes, based on non-linear features with high temporo-spatial complexity, proved accurate enough to classify depressive and control participants. After the classifier was internally trained and tested, it was applied to electroencephalography resting-state data of control and depressive individuals available from a public database, obtaining a classifier accuracy of 93% in the depressive and 100% in the control group. This validates the generalizability of the classifier to untrained data from different teams, populations and settings. We conclude that time-window span analysis is a promising approach to understand the neural dynamics of depression and to develop an independent biomarker. | psychiatry and clinical psychology |
10.1101/2022.04.01.22273321 | Framework for prioritizing variants of unknown significance from clinical genetic testing in kidney disease: utility of multidisciplinary approach to gather evidence of pathogenicity for Hepatocyte Nuclear Factor-1 (HNF1B) p.Arg303His | Monogenic causes in over 300 kidney-associated genes account for roughly 12% of end stage kidney disease (ESKD) cases. Advances in next generation sequencing and large customized panels enable the diagnosis of monogenic kidney disease noninvasively at relatively low cost, allowing for more precise management for patients and their families. A major challenge is interpreting rare variants, many of which are classified as variants of unknown significance (VUS). We present a framework in which we thoroughly evaluated and provided evidence of pathogenicity for HNF1B-p.Arg303His, a VUS returned from clinical genetic testing for a kidney transplant candidate. This blueprint, designed by a multi-disciplinary team of clinicians, molecular biologists, and diagnostic geneticists, includes using a health system-based cohort with genetic and clinical information to perform deep phenotyping of VUS carriers, examination of existing genetic databases, as well as functional testing. With our approach, we demonstrate evidence for pathogenicity for HNF1B-p.Arg303His by showing a similar burden of kidney manifestations in this variant to known HNF1B pathogenic variants, and a greater burden compared to non-carriers. Determination of a molecular diagnosis for the example family allows for proper surveillance and management of HNF1B-related manifestations such as kidney disease, diabetes, and hypomagnesemia, with important implications for safe living-related kidney donation. The candidate gene-variant pair also allows for clinical biomarker testing for aberrations of linked pathways. This working model may be applicable to other diseases of genetic etiology. | nephrology |
10.1101/2022.03.31.22273233 | AI-Driven Longitudinal Characterization of Neonatal Health and Morbidity | While prematurity is the single largest cause of death in children under 5 years of age, the current definition of prematurity, based on gestational age, lacks the precision needed for guiding care decisions. Here we propose a longitudinal risk assessment for adverse neonatal outcomes in newborns based on a multi-task deep learning model that uses electronic health records (EHRs) to predict a wide range of outcomes over a period starting shortly after the time of conception and ending months after birth. By linking the EHRs of the Lucile Packard Children's Hospital and the Stanford Healthcare Adult Hospital, we developed a cohort of 22,104 mother-newborn dyads delivered between 2014 and 2018. This enabled a unique linkage between long-term maternal information and newborn outcomes. Maternal and newborn EHRs were extracted and used to train a multi-input multi-task deep learning model, featuring a long short-term memory neural network, to predict 24 different neonatal outcomes. An additional set of 10,250 mother-newborn dyads delivered at the same Stanford Hospitals from 2019 to September 2020 was used to independently validate the model, followed by a separate analysis of 12,256 mother-newborn dyads at the University of California, San Francisco. Moreover, comprehensive association analysis identified multiple known and new associations between various maternal and neonatal features and specific neonatal outcomes. To date, this is the largest study utilizing linked EHRs from mother-newborn dyads and would serve as an important resource for the investigation and prediction of neonatal outcomes. An interactive website is available for independent investigators to leverage this unique dataset: https://maternal-child-health-associations.shinyapps.io/shiny_app/. | pediatrics |
10.1101/2022.04.03.22273360 | Real-World Evidence of the Neutralizing Monoclonal Antibody Sotrovimab for Preventing Hospitalization and Mortality in COVID-19 Outpatients | BackgroundIt is not known whether sotrovimab, a neutralizing monoclonal antibody (mAb) treatment authorized for early symptomatic COVID-19 patients, is effective against the SARS-CoV-2 Delta variant to prevent progression to severe disease and mortality.
MethodsObservational cohort study of non-hospitalized adult patients with SARS-CoV-2 infection from October 1st 2021 - December 11th 2021, using electronic health records from a statewide health system plus state-level vaccine and mortality data. We used propensity matching to select 3 patients not receiving mAbs for each patient who received outpatient sotrovimab treatment. The primary outcome was 28-day hospitalization; secondary outcomes included mortality and severity of hospitalization.
ResultsOf 10,036 patients with SARS-CoV-2 infection, 522 receiving sotrovimab were matched to 1,563 not receiving mAbs. Compared to mAb-untreated patients, sotrovimab treatment was associated with a 63% decrease in the odds of all-cause hospitalization (raw rate 2.1% versus 5.7%; adjusted OR 0.37, 95% CI 0.19-0.66) and an 89% decrease in the odds of all-cause 28-day mortality (raw rate 0% versus 1.0%; adjusted OR 0.11, 95% CI 0.0-0.79), and may reduce respiratory disease severity among those hospitalized.
ConclusionReal-world evidence demonstrated sotrovimab effectiveness in reducing hospitalization and all-cause 28-day mortality among COVID-19 outpatients during the Delta variant phase. | infectious diseases |
10.1101/2022.03.30.22273194 | Safety and Efficacy of Dupilumab for the Treatment of Hospitalized Patients with Moderate to Severe COVID 19: A Phase IIa Trial | BackgroundA profound need remains to develop further therapeutics for treatment of those hospitalized with COVID-19 to prevent respiratory decline or death. Based on data implicating the type 2 cytokine interleukin (IL)-13 as a significant factor leading to critical COVID-19, this trial was designed to assess the safety and efficacy of dupilumab, a monoclonal antibody that blocks IL-13 and IL-4 signaling, in those hospitalized with COVID-19.
MethodsWe conducted a phase IIa randomized double-blind placebo-controlled trial to assess the safety and efficacy of dupilumab plus standard of care versus placebo plus standard of care in mitigating respiratory failure and death in those hospitalized with COVID-19. Subjects were followed prospectively for 60 days, with collection of clinical outcomes, adverse events and immunologic biomarkers at multiple time points throughout the study period. The primary endpoint was the proportion of patients alive and free of invasive mechanical ventilation at 28 days, analyzed by logistic regression.
ResultsForty eligible subjects were enrolled from June 23, 2021 through November 11, 2021. There was no difference in adverse events or in the proportion of patients alive and free of mechanical ventilation at day 28 between study treatment groups. However, for the secondary endpoint of mortality at day 60, subjects randomized to dupilumab had a higher survival rate: 89.5% of subjects in the dupilumab group were alive compared to 76.2% in the placebo group (adjusted HR 0.05, 95% CI: 0.0-0.72, p=0.03). There was a trend toward reduction in ICU admission in the dupilumab group compared to the placebo group (33.3% vs 66.7%; adjusted HR 0.44, 95% CI: 0.09-2.09, p=0.30). Lastly, we saw downstream evidence of IL-4 and IL-13 signaling blockade through analysis of immune biomarkers at multiple study time points.
ConclusionsDupilumab was well tolerated and improved 60-day survival in patients hospitalized with moderate to severe COVID-19. Blockade of type 2 immunity offers promise as a novel treatment for COVID-19. | infectious diseases |
10.1101/2022.04.04.22273314 | SARS-CoV-2 hyperimmune globulin for severely immunocompromised patients with COVID-19: a randomised, controlled, double-blind, phase 3 trial | BackgroundSeverely immunocompromised patients are at risk for severe COVID-19. Benefit from convalescent plasma in these patients is suggested but data from randomised trials are lacking. The aim of this study is to determine efficacy of SARS-CoV-2 hyperimmune globulin ("COVIG") in treatment of severely immunocompromised, hospitalised COVID-19 patients.
MethodsIn this randomised, controlled, double-blind, multicentre, phase 3 trial, severely immunocompromised patients who were hospitalised with symptomatic COVID-19 were randomly assigned (1:1) to receive 15 grams of COVIG or 15 grams of intravenous immunoglobulin without SARS-CoV-2 antibodies (IVIG, control). Patients included were solid organ transplant patients on three drugs from different immunosuppressive classes, or patients with a disease or treatment severely affecting B-cell function. Patients who required mechanical ventilation or high-flow nasal oxygen were excluded. All investigators, research staff, and participants were masked to group allocation. The primary endpoint was occurrence of severe COVID-19 evaluated up until day 28 after treatment, defined as the need for mechanical ventilation, high-flow nasal oxygen, readmission for COVID-19 after hospital discharge or lack of clinical improvement on day seven or later. This trial is registered with the Netherlands Trial Register (NL9436).
FindingsFrom April, 2021, to July, 2021, 18 participants were enrolled at three sites in the Netherlands; 18 patients were analysed. Recruitment was halted prematurely when casirivimab/imdevimab became the recommended therapy in the Dutch COVID-19 treatment guideline for seronegative, hospitalised COVID-19 patients. Median age was 58 years and all but two were negative for SARS-CoV-2 spike IgG at baseline. Severe COVID-19 was observed in two out of ten (20%) patients treated with COVIG compared to seven of eight (88%) in the IVIG control group (p = 0.015, Fisher's exact test).
InterpretationCOVIG reduced the incidence of severe COVID-19 in severely immunocompromised patients, hospitalised with COVID-19. COVIG may be a valuable treatment in this patient group and can be used when no monoclonal antibody therapies are available.
FundingThe Netherlands Organisation for Health Research and Development, Sanquin Blood Supply Foundation. | infectious diseases |
10.1101/2022.03.30.22273026 | A comparative analysis of serial measurements of Soluble Urokinase-type Plasminogen Activator Receptor (suPAR) and C-reactive protein in patients with moderate COVID-19: a single center study from India | Soluble urokinase plasminogen-activator receptor (suPAR) is a secreted protein associated with inflammation that has proven useful in triage and risk stratification. This prospective study aimed to evaluate the utility of suPAR in comparison to C-reactive protein (CRP) in hospitalized moderate COVID-19 patients.
This is a prospective comparative study conducted during the second pandemic wave. Serum suPAR and CRP levels were measured serially in 31 confirmed COVID-19 hospitalized patients (20 males, 11 females) on day 1 (within 24 hours of admission), day 3 and day 5 using the suPARnostic AUTO flex ELISA and nephelometry (Thermo Fisher), respectively. The Shapiro-Wilk test verified the data distribution; the Wilcoxon signed-rank test compared CRP and suPAR between deceased and surviving subjects and identified links between co-morbidity and COVID-19 severity.
In our study, the mean age was 61.8 years (range 28-82), with a 9.7% (n=3/31) mortality rate. Deceased patients showed significantly higher suPAR levels, correlating with increasing severity from day 1 to day 5 (p<0.016-0.006), whereas CRP did not differ (p=0.717). Patients with pre-existing co-morbidities showed significantly elevated suPAR levels on days 1-5, especially those with hypertension (HTN; p<0.03) and chronic kidney disease (CKD; p<0.001).
In conclusion, levels of suPAR were higher in deceased patients with severe symptoms of COVID-19 during hospitalization and in patients with the pre-existing co-morbid conditions HTN and CKD. This preliminary study provides evidence suggesting that circulating suPAR can be a potential biomarker for assessing the severity of COVID-19, compared to CRP. | infectious diseases |
10.1101/2022.04.04.22273382 | Accurate diagnosis of atopic dermatitis by applying random forest and neural networks with transcriptomic data | Atopic dermatitis (AD) is one of the most common inflammatory skin diseases, but its great heterogeneity makes it difficult to design an accurate diagnostic pipeline based on traditional diagnostic methods. In other words, AD diagnosis has suffered from an accuracy bottleneck. Thus, it is necessary to develop a novel and accurate diagnostic model to supplement existing methods. The recent development of advanced gene sequencing technologies offers the potential for accurate AD diagnosis. Inspired by this, we developed an accurate AD diagnostic model based on transcriptomic data from skin tissue. Using these data from 149 subjects, including AD patients and healthy controls, from the Gene Expression Omnibus (GEO) database, we screened differentially expressed genes (DEGs) of AD and identified six critical genes (PPP4R1, SERPINB4, S100A7, S100A9, BTC, and GALNT6) with a random forest classifier. In a follow-up study of these genes, we constructed a neural network model (average AUC=0.943) to automatically distinguish subjects with AD from healthy controls. Among these critical genes, we found that PPP4R1 and GALNT6 had never been reported to be associated with AD. Although further replication in other cohorts is needed, our findings suggest that these genes may be developed into useful biomarkers for AD diagnosis and may provide invaluable clues or perspectives for further research on the pathogenesis of AD. | dermatology |
10.1101/2022.04.04.22273011 | A prospective cohort study of presenteeism and increased risk of unemployment among Japanese workers | ObjectiveWe examined the association between presenteeism and risk of job resignations and unemployment among Japanese workers during the COVID-19 pandemic.
MethodsA prospective study of 27,036 Internet monitors was conducted, starting in December 2020, with 18,560 (68.7%) participating in the follow-up by December 2021. The Work Functioning Impairment Scale (WFun) was used to measure the degree of work function impairment.
ResultsThe group with the highest WFun scores had higher odds ratios (ORs) for both retirement and unemployment for health reasons than the group with the lowest WFun scores. ORs were 2.97 (95%CI: 2.46-3.59, p<0.001) and 1.80 (95%CI: 1.64-1.98, p<0.001), respectively.
ConclusionsWorkers with work functioning impairment were at increased risk of resignation or unemployment. Management strategies for workers with work functioning impairment are needed to reduce their disadvantages in employment. | epidemiology |
10.1101/2022.04.01.22273244 | Why is leptospirosis hard to avoid for the impoverished? Deconstructing leptospirosis transmission risk and the drivers of knowledge, attitudes, and practices in a disadvantaged community in Salvador, Brazil | Several studies have identified socioeconomic and environmental risk factors for infectious disease, but the relationship between these and knowledge, attitudes, and practices (KAP), and more importantly their web of effects on individual infection risk, has not previously been evaluated. We conducted a cross-sectional KAP survey in an urban disadvantaged community in Salvador, Brazil, leveraging simultaneously collected fine-scale environmental and epidemiological data on leptospirosis transmission. Residents' knowledge influenced their attitudes, which in turn influenced their practices. However, different KAP variables were driven by different socioeconomic and environmental factors; and while improved KAP variables reduced risk, there were additional effects of socioeconomic and environmental factors on risk. For example, males and those of lower socioeconomic status were at greater risk, but once we controlled for KAP, male gender and lower socioeconomic status themselves were not direct drivers of seropositivity. Employment was linked to better knowledge and a less contaminated environment, and hence lower risk, but being employed was independently associated with a higher, not lower, risk of leptospirosis transmission, suggesting travel to work as a high-risk activity. Our results show how such complex webs of influence can be disentangled. They indicate that public health messaging and interventions should take into account this complexity and prioritize factors that limit exposure and support appropriate prevention practices. | epidemiology |
10.1101/2022.04.01.22273280 | SEN support from the start of school and its impact on unplanned hospital utilisation in children with cleft lip and palate: a demonstration target trial emulation protocol using ECHILD | Special Educational Needs (SEN) provision for school children provides extra support and reasonable adjustments for children and young people with additional educational, behavioural or health needs to ensure equal education opportunities; for example, those born with a healthcare need such as cleft lip and palate may be provided SEN to aid with challenges in communication. However, there is limited knowledge of whether SEN provisions impact academic or health outcomes in this population, and conducting a randomised controlled trial to establish this evidence is not feasible. In lieu of randomised controlled trials, target trial emulation methods can be used in an attempt to answer causal questions using observational data whilst reducing confounding and other biases likely to arise with such data. The Education and Child Health Insights from Linked Data (ECHILD) dataset could be used as part of trial emulation methods to understand the impact of SEN provisions on academic and healthcare outcomes. ECHILD is the first dataset to hold longitudinal school, health and social care data on all pupils in England, obtained by linking the National Pupil Database (NPD) with Hospital Episode Statistics (HES). In this protocol, we describe how the ECHILD dataset could be used to conduct a target trial emulation to evaluate whether children who were born with cleft lip and palate would have different unplanned hospital utilisation if they received or did not receive SEN provision by Year 1 (specifically by January in their second year of school), when they are aged 5 or 6.
MethodsFocussing on the population of children who are identified as having been born with cleft lip and palate, an intervention of varying levels of SEN provision (including no SEN provision) by January of the second year of school, and an outcome of unplanned hospital utilisation, we apply a trial emulation design to reduce confounding when using observational data to investigate the causal impact of SEN on unplanned hospital admissions. Our target population is children born 2001-2014 who had a recording of cleft lip and palate in HES and who started their second year of primary school (Year 1) in a state school between 2006 and 2019; children with a first recording of cleft lip and palate after Year 1 were excluded (these were pupils who likely immigrated to England after birth). We intend to use a time window of SEN provision assignment between the start of school (reception) and by the January school census in Year 1. Using target trial emulation, we aim to estimate the average treatment effect of SEN provision on the number of unplanned hospital admissions (including admissions to accident and emergency) between the January school census in Year 1 and Year 6 (the end of primary school, when children are 10-11 years old).
Ethics and disseminationPermissions to use linked, de-identified data from Hospital Episode Statistics and the National Pupil Database were granted by DfE (DR200604.02B) and NHS Digital (DARS-NIC-381972). Ethical approval for the ECHILD project was granted by the National Research Ethics Service (17/LO/1494), NHS Health Research Authority Research Ethics Committee (20/EE/0180) and UCL Great Ormond Street Institute of Child Health's Joint Research and Development Office (20PE06). Stakeholders (academics, clinicians, educators and child/young people advocacy groups) will consistently be consulted to refine populations, interventions and outcomes of studies that use the ECHILD dataset to conduct target trial emulation. Scientific, lay and policy briefings will be produced to inform public health policy through partners in the Department for Education and the Department of Health and Social Care. | pediatrics |
10.1101/2022.04.01.22269254 | Clinicians' perspective on use of immune checkpoint inhibitors and related biomarkers for solid tumors | BackgroundImmune checkpoint inhibitors (ICIs) are a valuable treatment option for patients with malignant tumors, but only selected patients respond to ICIs. Available biomarkers are of limited use in guiding ICI therapy.
ObjectiveTo examine clinicians' perspective on the use of ICIs and biomarkers for treatment of malignant tumors and to identify unmet needs related to their use.
MethodsWe conducted in-depth telephone interviews with eight oncologists, and 100 oncologists completed online surveys.
ResultsOncologists have a positive attitude toward the use of ICIs, and 98% of them prescribe ICIs in all approved indications. Clinicians reported that only about half of their patients with solid tumors responded to treatment, yet they overestimated the response rate to ICIs across most types of tumors they treat compared with data in the literature. They ranked the lack of reliable biomarkers to guide treatment (rating of 4.4 out of 7) as the top challenge with use of ICIs, followed by lack of overall efficacy and toxicity or occurrence of immune-related adverse events. The biomarkers most often used by survey participants were a comprehensive panel including driver mutations and tumor mutational burden (69% of respondents), programmed cell death ligand-1 (PD-L1) expression (62%), and microsatellite instability (MSI) (56%). Oncologists indicated that they ordered biomarkers for each type of cancer according to their perceived usefulness in predicting the outcomes of ICI therapy, being more likely to use those perceived as useful or very useful.
ConclusionClinicians indicate that more reliable therapy-response prediction biomarkers would have a great impact on treatment decisions for patients with solid tumors, reducing unnecessary treatments, side effects, and health care expenditures. | pharmacology and therapeutics |
10.1101/2022.04.04.22273398 | Health conditions associated with sexual assault in a large hospital population | ObjectiveTo develop a clinical informatics approach to identify patients with a history of sexual assault and to characterize the clinical risk factors and comorbidities of this population in a sex-stratified manner.
MethodsWe developed and applied a keyword-based approach to clinical notes to identify patients with a history of sexual assault in the Vanderbilt University Medical Center (VUMC) electronic health record from 1989 to 2021. Using a phenome-wide association study (PheWAS), we then examined diagnoses that co-occurred with evidence of sexual assault. We also examined whether sex assigned at birth modified any of these associations.
ResultsOur keyword-based algorithm achieved a positive predictive value of 90.4%, as confirmed by manual patient chart review. Out of 1,703 diagnoses tested across all subgroup analyses, we identified a total of 465 associated with sexual assault, many of which have been previously observed in the literature. Interaction analysis revealed 55 sex-differential phenotypic associations.
ConclusionsIn a large hospital setting, disclosures of sexual assault were associated with increased rates of hundreds of health conditions. | public and global health |
10.1101/2022.04.04.22273330 | Effectiveness of a nation-wide COVID-19 vaccination program in Mexico | BACKGROUNDVaccination has been effective in ameliorating the impact of COVID-19. However, estimation of vaccine effectiveness (VE) is still unavailable for some widely used vaccines and underrepresented groups. Here, we report on the effectiveness of a nation-wide COVID-19 vaccination program in Mexico.
METHODSWe used a test-negative design within a national COVID-19 surveillance system to assess VE of the BNT162b2, mRNA-1273, Gam-COVID-Vac, Ad5-nCoV, Ad26.COV2.S, ChAdOx1 and CoronaVac vaccines against SARS-CoV-2 infection, COVID-19 related hospitalization and death for adults [≥]18 years in Mexico. VE was estimated using Cox proportional hazard models considering time-varying vaccination status in partially and fully vaccinated individuals compared to unvaccinated adults, adjusted by age, sex, comorbidities and municipality. We also estimated VE for adults [≥]60 years, for cases with diabetes, and comparing periods with predominance of variants B.1.1.519 and B.1.617.2.
RESULTSWe assessed 793,487 vaccinated compared to 4,792,338 unvaccinated adults between December 24th, 2020, and September 27th, 2021. VE against SARS-CoV-2 infection was highest for fully vaccinated individuals with mRNA-1273 (91.5%, 95%CI 90.3-92.4) and Ad26.COV2.S (82.2%, 95%CI 81.4-82.9), whereas VE against COVID-19 related hospitalization was highest for BNT162b2 (84.3%, 95%CI 83.6-84.9) and Gam-COVID-Vac (81.4%, 95%CI 79.5-83.1), and against mortality for BNT162b2 (89.8%, 95%CI 89.2-90.2) and mRNA-1273 (93.5%, 95%CI 86.0-97.0). VE for all evaluated vaccines was reduced for adults [≥]60 years, people with diabetes, and in periods of Delta variant predominance.
CONCLUSIONSAll evaluated vaccines were effective against SARS-CoV-2 infection and COVID-19 related hospitalization and death. Mass vaccination campaigns with multiple vaccine products are feasible and effective to maximize vaccination coverage. | public and global health |
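As context for the VE figures reported above: under a Cox model, vaccine effectiveness is conventionally derived from the fitted hazard ratio as VE = (1 − HR) × 100%, with the confidence bounds swapping places. A minimal sketch of this conversion (the hazard-ratio inputs below are illustrative, not values taken from the study):

```python
def hr_to_ve(hr):
    """Convert a Cox-model hazard ratio to vaccine effectiveness (%)."""
    return (1.0 - hr) * 100.0

def ve_with_ci(hr, hr_lower, hr_upper):
    """Return VE and its 95% CI; note the HR bounds swap:
    the lower HR bound yields the upper VE bound."""
    return hr_to_ve(hr), hr_to_ve(hr_upper), hr_to_ve(hr_lower)

# e.g. an (illustrative) HR of 0.20 (95% CI 0.10-0.30)
ve, ve_lo, ve_hi = ve_with_ci(0.20, 0.10, 0.30)  # VE 80%, 95% CI (70%, 90%)
```

The bound swap is easy to get wrong when reading VE tables: a *smaller* hazard ratio means *greater* effectiveness.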
10.1101/2022.03.29.22271489 | Linking private health sector to public COVID-19 response in Kisumu, Kenya: Lessons Learnt | BackgroundCOVID-19 is overwhelming health systems universally. Increased capacity to combat the epidemic is important, while continuing regular healthcare services. This paper describes an innovative Public Private Partnership (PPP) against COVID-19 that was established in Kisumu County, Western Kenya, from the onset of the epidemic.
MethodsAn explanatory research design was used. Qualitative in-depth interviews (n=49) were conducted with purposively selected participants including patients, health workers, and policy makers. Thematic analysis was undertaken on interview transcripts and triangulation was performed.
ResultsThe PPP hinged on the provision of centralized COVID-19 diagnostic services through a parastatal institute (KEMRI). Complementary tasks were divided between the Kisumu Department of Health and public and private healthcare providers, supported by an NGO. Facilitators of this PPP included implementation of MoH Guidelines, digitalization of data, strengthening of counseling services and free access to COVID-19 testing services in private facilities. Barriers included data accessibility and suboptimal financial management.
ConclusionCoordinated PPP can rapidly enhance capacity and quality of COVID-19 epidemic management in African settings. Our PPP model appears scalable, as proven by current developments. Lessons learnt from this initial PPP in Kisumu County will be beneficial to expanding epidemic preparedness to other Counties in Kenya and beyond. | public and global health |
10.1101/2022.03.30.22273051 | A novel immersive virtual reality environment for the motor rehabilitation of stroke patients: A feasibility study | We designed and implemented an immersive virtual reality environment for upper limb rehabilitation, which possesses several notable features. First, by exploiting modern computer graphics it can present a variety of scenarios that make the rehabilitation routines challenging yet enjoyable for patients, thus enhancing their adherence to the therapy. Second, immersion in a virtual 3D space allows the patients to execute tasks that are closely related to everyday gestures, thus enhancing the transfer of the acquired motor skills to real-life routines. Third, in addition to the VR environment, we also developed a client app running on a PC that allows clinicians to monitor the patients' routines remotely and in real time, thus opening the door to telerehabilitation scenarios.
Here, we report the results of a feasibility study in a cohort of 16 stroke patients. All our patients showed a high degree of comfort in our immersive VR system, and they reported very high scores of ownership and agency in embodiment and satisfaction questionnaires. Furthermore, and notably, we found that behavioral performances in our VR tasks correlated with the patients' clinical scores (Fugl-Meyer scale) and can thus be used to assess improvements during the rehabilitation program. While further studies are needed, our results clearly support the feasibility and effectiveness of VR-based motor rehabilitation processes.
Significance statementApproximately 80% of stroke patients suffer from a hemiparesis of the contralateral upper limb. Motor rehabilitation has been proven to be of key importance to regain, partially or totally, the impaired motor skills. Rehabilitation techniques are based on the repetitive and intense execution of simple motor behaviors. As such they can become taxing and cumbersome for the patients. This often produces non-adherence issues with an obvious negative impact on motor recovery.
Here we describe a novel immersive virtual environment for upper limb motor rehabilitation and we report the results that we obtained in a cohort of 16 stroke patients. Our system was designed to turn rehabilitation routines into engaging games and to allow the remote monitoring of the patients' exercises, thus allowing telerehabilitation.
All our patients showed a high degree of comfort in our immersive VR system, and they reported very high scores of ownership and agency in embodiment and satisfaction questionnaires. Furthermore, and notably, we found that behavioral performances in our VR tasks correlated with the patients' clinical scores (Fugl-Meyer scale) and can thus be used to assess improvements during the rehabilitation program. | rehabilitation medicine and physical therapy |
10.1101/2022.04.04.22273424 | Hematopoietic and Lung Platelet Biogenesis as a Prognostic Indicator in Idiopathic Pulmonary Fibrosis (IPF) | Rationale and objectivesThe role of human lung megakaryocytes in homeostasis and their dynamics in disease states remain unknown. We sought to investigate whether megakaryocyte/platelet gene signatures are altered in IPF.
MethodsWe analyzed publicly available transcriptome datasets of lung tissue, bronchoalveolar lavage (BAL) cells, and peripheral whole blood from IPF patients and healthy controls. Enrichment of megakaryocyte and platelet gene signatures in those datasets was estimated using xCell, a novel computational method. Furthermore, we analyzed whether mean platelet volume (MPV) and platelet counts in peripheral blood are associated with lung transplant-free survival in our IPF cohort.
ResultsIn lung tissue, megakaryocyte scores were significantly lower in IPF than in controls. In BAL cells, platelet scores were significantly lower in IPF than in controls, and lower platelet scores were associated with lower lung transplant-free survival in IPF. In contrast, in blood, megakaryocyte scores were significantly higher in IPF than in controls, and higher megakaryocyte scores were associated with lower disease progression-free survival in IPF. Furthermore, higher MPV was associated with lower transplant-free survival in our IPF cohort, independent of age, sex, forced vital capacity (FVC), and diffusing capacity of the lung for carbon monoxide (DLCO).
ConclusionsIn IPF, megakaryocyte/platelet gene signatures were altered in a compartment-specific manner. Moreover, those signatures and MPV in blood were associated with important clinical outcomes such as transplant-free survival. These findings provide new insights into altered megakaryocyte/platelet biogenesis in IPF and suggest the potential utility of megakaryocyte/platelet-based biomarkers in IPF. | respiratory medicine |
10.1101/2022.04.01.22273155 | Patient Perceptions by Race of Educational Animations About Living Kidney Donation Made for a Diverse Population | IntroductionThis qualitative study sought to identify potential design and delivery alterations to inform cultural adaptation of educational animations about living donor kidney transplantation (LDKT) - previously developed for a diverse population - to better fit Black Americans' needs.
MethodsWe conducted a secondary analysis of 88 transcripts derived from interviews and focus groups conducted with diverse target users (62 kidney failure patients, 36 prior/potential donors, and 11 care partners) to develop 12 animations about LDKT, named KidneyTIME. Statements were abstracted and coded pertaining to cognitive and communication barriers to LDKT, and the perceived value of using the videos to learn and share the information with social network members using content analysis. Incidence counts of each content code were also calculated to assess differences between Black and non-Black patients.
ResultsCognitive barrier codes included lack of knowledge, ambivalence, and concern for donor. Communication barrier codes included reluctance and difficulty talking about LDKT. Cognitive facilitating codes included attention-getting, efficient learning, manageable content, emotional impact, and new knowledge. Communication facilitating codes included delivery through many dissemination channels and broadly shareable. Compared to non-Black patients (n=33), Black patients (n=29) more often stated concern for donor and reluctance/difficulty talking about LDKT as barriers, and less often stated efficient learning and manageable content as facilitators.
ConclusionFindings highlight the value of LDKT informational content that is visually appealing, digestible, non-threatening, and highly shareable. Heterogeneity may exist in access and intervention preferences for using KidneyTIME videos, highlighting a potential for further cultural targeting or tailoring. | transplantation |
10.1101/2022.04.01.22273253 | Effect of reduction in brain amyloid levels on change in cognitive and functional decline in randomized clinical trials: an updated instrumental variable meta-analysis | ObjectiveTo update a recently published analysis exploring the causal association between positron emission tomography (PET)-measured change in brain {beta}-amyloid plaque and cognitive decline in patients with Alzheimer's disease (AD) enrolled in randomized clinical trials (RCTs).
DesignUpdated instrumental variable meta-analysis.
SettingSixteen RCTs were included in this updated meta-analysis versus 14 in the original publication by Ackley et al.1 Data sources were ClinicalTrials.gov, Alzheimer Research Forum (alzforum.org), PubMed and clinical study reports from 2015 to March 1, 2022. Three researchers extracted data from the data sources independently and subsequently resolved any discrepancies.
PopulationRCTs that evaluated {beta}-amyloid targeting therapies and enrolled adult patients with AD dementia or mild cognitive impairment due to AD with data on {beta}-amyloid as measured by PET and clinical outcome measures.
Main outcome measuresAn instrumental variable meta-analysis was performed to compute trial and drug-specific estimates and pooled estimates of the effect of change in PET {beta}-amyloid standard uptake value ratio (SUVR) on cognitive and functional decline with 95% confidence intervals (CIs) and associated p-values. This analysis updated and expanded a prior meta-analysis by Ackley et al.1 using the same methodology and clinical outcome measures: Clinical Dementia Rating-Sum of Boxes (CDR-SB), Alzheimer's Disease Assessment Scale-Cognitive Subscale (ADAS-Cog), and Mini-Mental State Examination (MMSE).
ResultsThe reduction of PET-measured {beta}-amyloid induced a statistically significant reduction in cognitive and functional decline. The effect size was characterized by an estimated change (95% CI) of 0.09 (0.034, 0.15) on the CDR-SB; 0.33 (0.12, 0.55) on the ADAS-Cog; and 0.13 (0.017, 0.24) on the MMSE for each decrease of 0.1-unit in PET {beta}-amyloid SUVR.
ConclusionThis updated instrumental variable meta-analysis of 16 RCTs provides statistically significant evidence of a causal relationship between the reduction in brain {beta}-amyloid plaque and the reduction in cognitive and functional decline in patients with Alzheimer's disease. | neurology |
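The pooled estimates above combine trial-specific effects; a standard way to perform this step is inverse-variance (fixed-effect) pooling. The sketch below is a generic illustration of that calculation, not the authors' code, and the example numbers in the test are hypothetical:

```python
import math

def pooled_fixed_effect(estimates, std_errs):
    """Inverse-variance fixed-effect pooling of per-trial estimates.

    Each trial is weighted by 1/SE^2; returns the pooled estimate,
    its standard error, and a 95% confidence interval.
    """
    weights = [1.0 / se ** 2 for se in std_errs]
    total = sum(weights)
    pooled = sum(w * e for w, e in zip(weights, estimates)) / total
    se_pooled = math.sqrt(1.0 / total)
    return pooled, se_pooled, (pooled - 1.96 * se_pooled,
                               pooled + 1.96 * se_pooled)
```

Weighting by inverse variance means precise trials (small standard errors) dominate the pooled effect, which is why adding two RCTs to the original 14 can shift both the point estimate and the width of the CI.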
10.1101/2022.04.03.22272935 | Exploring the relationship between women's experience of postnatal care and reported staffing measures: an observational study | BackgroundWomen have reported dissatisfaction with care received on postnatal wards and this area has been highlighted for improvement. Studies have shown an association between midwifery staffing levels and postnatal care experiences, but so far, the influence of registered and support staff deployed in postnatal wards has not been studied. This work is timely as the number of support workers has increased in the workforce and there has been little research on skill mix to date.
MethodsCross-sectional secondary analysis including 13,264 women from 123 postnatal wards within 93 hospital Trusts. Staffing was measured at organisational level as Full Time Equivalent staff, and at ward level using Care Hours Per Patient Day. Women's experiences were assessed using four items from the 2019 national maternity survey. Multilevel logistic regression models were used to examine relationships and adjust for maternal age, parity, type of birth, medical staff and number of births per year in the Trust.
ResultsTrusts with higher levels of midwifery staffing had higher rates of women reporting positive experiences of postnatal care. However, when staffing was measured at a ward level, there was no evidence of an association between registered staffing and patient experience. Wards with higher levels of support worker staffing were associated with higher rates of women reporting they had help when they needed it and were treated with kindness and understanding.
ConclusionThe relationship between reported registered staffing levels on postnatal wards and women's experience is uncertain. Further work should be carried out to examine why relationships observed at an organisational level were not replicated closer to the patient, at ward level. It is possible that reported staffing levels do not reflect staff as deployed if midwives are floated to cover delivery units. This study highlights the potential contribution of support workers in providing quality care on postnatal wards. | nursing |
10.1101/2022.04.03.22273366 | Deep learning-based prognosis prediction among preeclamptic pregnancies using electronic health record data | BackgroundPreeclampsia (PE) is one of the leading factors in maternal and perinatal mortality and morbidity worldwide. The only cure for PE to date is to deliver the placenta and stop gestation. However, the timing of delivery among PE patients is essential to minimize the risk of severe maternal morbidities, and at the same time ensure the survival of the baby.
MethodsIn this study, we constructed a series of deep learning-based models to predict the prognosis, or the time to delivery, since the initial diagnosis of PE using electronic health record (EHR) data. We extracted and processed 1578 pregnancies in Michigan Medicine at the University of Michigan in 2015-2021 as the discovery cohort. Using the Cox-nnet v2 algorithm, we built the baseline model with EHR information prior to diagnosis, as well as the full model including baseline information and lab testing results and vital signs at the time of diagnosis. We evaluated the models using the C-index and log-rank p-values in KM survival curves, using both 20% testing data of the Michigan cohort, as well as 1177 PE pregnancy EHR data from the Medical Center of the University of Florida.
ResultsThe baseline prognosis model for time to delivery since PE diagnosis achieved C-index values of 0.75 and 0.72 on the training and testing sets, respectively, while the full model reached C-indices of 0.77 and 0.74 on the same sets. Both models performed better than their Cox-PH model counterparts. The seven most important features in the baseline model in descending order were diagnosis gestational age, severe PE, past PE, age, parity, gravidity, and uncomplicated diabetes. Meanwhile, the 14 most important features were selected and interpreted in the full model, including diagnosis gestational age, parity, severe PE, past PE, features in lab tests (white blood cell, platelet, and red blood cell counts, AST value), minimum respiratory rate, and features measuring blood pressure (minimum, mean and standard deviation of systolic blood pressure, and maximum and standard deviation of diastolic blood pressure).
ConclusionThe time to delivery predicting models provide clinicians valuable tools and options to quantify the delivery risks and make better decisions on the optimal delivery time of PE patients at the time of diagnosis. Implementation of these actionable models into PE clinical care practice is expected to significantly improve the management of PE patients. | obstetrics and gynecology |
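The C-index used above to evaluate the models is Harrell's concordance index: across all comparable patient pairs, the fraction in which the model assigns the higher risk to the patient whose event occurs first. A minimal pure-Python sketch of the metric (illustrative only; not the Cox-nnet implementation):

```python
def concordance_index(times, events, risk_scores):
    """Harrell's C-index for right-censored survival data.

    A pair (i, j) is comparable when subject i has the earlier
    observed time and that time is an actual event (not censored).
    Concordant pairs score 1, tied risk scores score 0.5.
    """
    num, den = 0.0, 0
    n = len(times)
    for i in range(n):
        for j in range(n):
            if events[i] and times[i] < times[j]:
                den += 1
                if risk_scores[i] > risk_scores[j]:
                    num += 1.0
                elif risk_scores[i] == risk_scores[j]:
                    num += 0.5
    return num / den
```

A C-index of 0.5 corresponds to random ranking and 1.0 to perfect ranking, so the reported 0.72-0.77 indicates moderately strong discrimination.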
10.1101/2022.04.04.22271935 | Workplace ventilation improvement to address coronavirus disease 2019 cluster occurrence in a manufacturing factory | Aim and MethodsA coronavirus disease 2019 (COVID-19) cluster emerged in a manufacturing factory in early August 2021. In November 2021, a ventilation survey using tracer gas and data analysis was performed to reproduce the situation at the time of cluster emergence and verify that ventilation in the office increased the risk of aerosol transmission; verify the effectiveness of measures implemented immediately in August; and verify the effectiveness of additional measures when previously enforced measures proved inadequate.
ResultsAt the time of cluster emergence, the average ventilation frequency was 0.73 times/h, less than the 2 times/h recommended by the Ministry of Health, Labour, and Welfare; as such, the factory's situation was deemed to have increased the risk of aerosol transmission. Due to the measures already taken at the time of the survey, the ventilation frequency had increased to 3.41 times/h on average. It was confirmed that the ventilation frequency increased to 8.33 times/h on average when additional measures were taken.
ConclusionTo prevent the re-emergence of COVID-19 clusters, it is necessary to continue the measures that have already been implemented. Additionally, the introduction of real-time monitoring that visualizes CO2 concentrations is recommended. Furthermore, we believe it is helpful for external researchers from multiple fields to work together with internal health and safety and occupational health personnel to confirm the effectiveness of implemented measures, as in this case. | occupational and environmental health |
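Ventilation frequency (air changes per hour, ACH) in a tracer-gas survey such as this is conventionally estimated from the exponential decay of tracer concentration in a well-mixed room, C(t) − C_bg = (C_0 − C_bg)·e^(−ACH·t). The sketch below illustrates that standard decay calculation only; the survey's actual measurement protocol is not described in the abstract, and the concentrations in the example are hypothetical:

```python
import math

def air_changes_per_hour(c_start, c_end, c_background, hours):
    """Estimate ACH from tracer-gas decay, assuming a well-mixed room:
    C(t) - C_bg = (C_0 - C_bg) * exp(-ACH * t)."""
    return math.log((c_start - c_background) /
                    (c_end - c_background)) / hours
```

For example, a tracer concentration falling from 1400 ppm to about 535 ppm against a 400 ppm background within one hour corresponds to roughly 2 air changes per hour.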
10.1101/2022.03.31.22273116 | Accurate genome-wide germline profiling from decade-old archival tissue DNA reveals the contribution of common variants to precancer disease outcome | BackgroundInherited variants have been shown to contribute to cancer risk, disease progression, and response to treatment. Such studies are, however, arduous to conduct, requiring large sample sizes, cohorts or families, and more importantly, a long follow-up to measure a relevant outcome such as disease onset or progression. Unless collected for a dedicated study, germline DNA from blood or saliva are typically not available retrospectively, in contrast to surgical tissue specimens which are systematically archived.
ResultsWe evaluated the feasibility of using DNA extracted from low amounts of formalin-fixed, paraffin-embedded (FFPE) tumor tissue to obtain accurate germline genetic profiles. Using matching blood and archival tissue DNA from 10 individuals, we benchmarked low-coverage whole-genome sequencing (lc-WGS) combined with genotype imputation and measured genome-wide concordance of genotypes, polygenic risk scores (PRS), and HLA haplotypes. Concordance between blood and tissue was high (r2>0.94) for common genome-wide single nucleotide polymorphisms (SNPs) and across 22 disease-related PRS (mean r=0.93). HLA haplotypes imputed from tissue DNA were 96.7% (Class I genes) and 82.5% (Class II genes) concordant with deep targeted sequencing of HLA from blood DNA. Using the validated methodology, we estimated breast cancer PRS in 36 patients diagnosed with breast ductal carcinoma in situ (11.7 years median follow-up time), including 22 who were diagnosed with a breast cancer subsequent event (BCSE). PRS was significantly associated with BCSE (HR=2.5, 95%CI: 1.4-4.5), and the top decile of patients were modeled to have a 24% chance of BCSE at 10 years, suggesting that the addition of PRS could improve prognostic models, which are currently inadequate.
ConclusionsThe abundance and broad availability of archival tissue specimens in oncology clinics, paired with the effectiveness of germline profiling using lc-WGS and imputation, represents a cost- and resource-effective alternative in the design of long-term disease progression studies. | oncology |
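The concordance figures above (r2>0.94 for genotypes, mean r=0.93 for PRS) are Pearson correlations between blood- and tissue-derived values. A minimal sketch of the underlying computation (the vectors in the test are illustrative, not study data):

```python
import math

def pearson_r(x, y):
    """Pearson correlation between two equal-length vectors,
    e.g. imputed genotype dosages from tissue vs. blood DNA."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def r_squared(x, y):
    """Squared correlation, as reported for genotype concordance."""
    return pearson_r(x, y) ** 2
```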
10.1101/2022.04.04.22273159 | Equally low blood metal ion levels at 10-years follow up of total hip arthroplasties with Oxinium, CoCrMo and stainless steel femoral heads. Data from a randomized clinical trial | Background and purposeThe use of inert head materials such as ceramic heads has been proposed as a method of reducing wear and corrosion products from the articulating surfaces in total hip arthroplasty (THA), as well as from the stem-head taper connection. In the present study, we wanted to evaluate the blood metal ion levels of Oxinium modular femoral heads at 10 years postoperatively and compare with the CoCrMo modular femoral head counterpart, and the monoblock stainless steel Charnley prosthesis.
Patients and methods150 patients with osteoarthritis of the hip joint previously included in a randomized clinical trial that had the primary aim of examining polyethylene wear in cemented THAs were re-grouped according to femoral head material. One group (n=30) had received the Charnley monoblock stainless steel stem with a 22.2 mm head (DePuy, UK). The other groups (n=120) received a Spectron EF CoCrMo stem with either a 28 mm CoCrMo or Oxinium modular head (Smith & Nephew, USA). After 10 years, 19 patients had died and 38 had withdrawn from the study. Five patients were revised due to infection in their hip and 7 were revised due to aseptic loosening of either the cup or stem or both. 81 patients with a median age of 79 years (70-91) were available for whole blood metal ion analysis.
ResultsThe levels of Co, Cr, Ni and Zr in the blood were generally low with all head materials (medians <0.3 micrograms/L). We found no statistically significant difference in the concentration of metal ions between the group with Charnley prosthesis (n=17) and those with Spectron EF stems with either CoCrMo (n=32) or Oxinium heads (n=32) (p=0.18-0.81).
InterpretationIn this study of patients with non-revised THAs, no indication of severe trunnion corrosion was found. The blood ion levels with the use of ceramic Oxinium modular femoral heads were similar to CoCrMo or stainless steel heads at 10 years follow-up. | orthopedics |
10.1101/2022.04.04.22272893 | Age acquired skewed X Chromosome Inactivation is associated with adverse health outcomes in humans | BackgroundAgeing is a heterogeneous process characterised by cellular and molecular hallmarks, including changes to haematopoietic stem cells, and is a primary risk factor for chronic diseases. X chromosome inactivation (XCI) randomly transcriptionally silences either the maternal or paternal X in each cell of 46,XX females to balance the gene expression with 46,XY males. Age acquired XCI-skew describes the preferential inactivation of one X chromosome across a tissue, which is particularly prevalent in blood tissues of ageing females, and yet its clinical consequences are unknown.
MethodsWe assayed XCI in 1,575 females from the TwinsUK population cohort and employed prospective, cross-sectional, and intra-twin designs to characterise the relationship of XCI-skew with molecular, cellular, and organismal measures of ageing, and cardiovascular disease risk and cancer diagnosis.
ResultsWe demonstrate that XCI-skew is independent of traditional markers of biological ageing and is associated with a haematopoietic bias towards the myeloid lineage. Using an atherosclerotic cardiovascular disease risk score, which captures traditional risk factors, XCI-skew is associated with an increased cardiovascular disease risk both cross-sectionally and within XCI-skew discordant twin pairs. In a prospective 10-year follow-up study, XCI-skew is predictive of future cancer incidence.
ConclusionsOur study demonstrates that age acquired XCI-skew captures changes to the haematopoietic stem cell population and has clinical potential as a unique biomarker of chronic disease risk.
FundingKSS acknowledges funding from the Medical Research Council (MR/M004422/1 and MR/R023131/1). JTB acknowledges funding from the ESRC (ES/N000404/1). MM acknowledges funding from the National Institute for Health Research (NIHR)-funded BioResource, Clinical Research Facility and Biomedical Research Centre based at Guy's and St Thomas' NHS Foundation Trust in partnership with King's College London. TwinsUK is funded by the Wellcome Trust, Medical Research Council, European Union, Chronic Disease Research Foundation (CDRF), Zoe Global Ltd and the National Institute for Health Research (NIHR)-funded BioResource, Clinical Research Facility and Biomedical Research Centre based at Guy's and St Thomas' NHS Foundation Trust in partnership with King's College London. | genetic and genomic medicine |
10.1101/2022.04.05.22273167 | Low-dose IL-2 reduces IL-21+ T cells and induces a long-lived anti-inflammatory gene expression signature inversely modulated in COVID-19 patients | Despite early clinical success, the mechanisms of action of low-dose interleukin-2 (LD-IL-2) immunotherapy remain only partly understood. Here, we examine the effects of interval administration of low-dose recombinant IL-2 (iLD-IL-2) using high-resolution, single-cell multiomics. We confirmed that iLD-IL-2 selectively expands thymic-derived FOXP3+HELIOS+ Tregs and CD56br NK cells, and provide new evidence for an IL-2-induced reduction of highly differentiated IL-21-producing CD4+ T cells. We also discovered that iLD-IL-2 induces an anti-inflammatory gene expression signature, which was detected in all T and NK cell subsets even one month after treatment. The same signature was present in COVID-19 patients, but in the opposite direction. These findings indicate that the sustained Treg and CD56br NK cell increases induced by our 4-week iLD-IL-2 treatment create a long-lasting and global anti-inflammatory environment, warranting further investigations of the potential clinical benefits of iLD-IL-2 in immunotherapy, including the possibility of reversing the pro-inflammatory environment in COVID-19 patients. | genetic and genomic medicine |
10.1101/2022.04.01.22273008 | Digital health and quality of home-based primary care for older adults: A scoping review protocol | IntroductionDigital health strategies have expanded in home care, particularly in home-based primary care (HBPC), following the growth of the older adult population and the need to respond to the greater burden of chronic conditions and health frailties in this population. Demand grew even further with COVID-19 and the subsequent isolation/social distancing measures for this at-risk group. The objective of this study is to map digital health strategies and analyze their impacts on the quality of home-based primary care for older adults around the world.
Methods and analysisThis is a scoping review protocol which will enable a rigorous, transparent and reliable synthesis of knowledge. The review will be developed from the theoretical perspective of Arksey and O'Malley, with updates by Levac et al. and Peters et al., based on the Joanna Briggs Institute manual, and guided by the Preferred Reporting Items for Systematic Reviews and Meta-Analyses Extension for Scoping Reviews (PRISMA-ScR). Data from white literature will be extracted from multidisciplinary health databases such as the Virtual Health Library, LILACS, PubMed, Scopus, Web of Science, CINAHL and Embase, while Google Scholar will be used for gray literature. The quantitative data will be analyzed through descriptive statistics and qualitative data through thematic analysis. The results will be submitted to stakeholder consultation for preliminary sharing of study findings, identification of gaps, effective dissemination strategies and suggestions for future studies.
Ethics and disclosureStakeholder consultation was approved by the Research Ethics Committee of Hospital Onofre Lopes, Federal University of Rio Grande do Norte, CAEE 54853921.0.0000.5292, and will not involve direct patient participation. It is registered in the Open Science Framework (OSF) (https://osf.io/vgkhy/). The results will be disseminated through publication in open access scientific journals, in scientific events and academic and community journals.
Scoping review registration: OSF https://osf.io/vgkhy/. DOI 10.17605/OSF.IO/VGKHY.
Strengths and limitations of this study
- This scoping review will be the first to explore the types, uses and impacts of digital health strategies on home-based primary care (HBPC) for older adults around the world.
- The geographic mapping of the studies will provide a global overview of the use of digital health strategies in providing home care.
- The search strategy for studies will be broad, including the main multidisciplinary health databases and gray literature.
- The expansion of the use of digital health strategies during COVID-19 may be a limiting factor to the scientific quality of studies.
- A large number of publications in the gray literature may imply an important limitation of the study, but may also mean that digital home care still lacks maturity. | health informatics |
10.1101/2022.03.29.22273078 | Extraction of Sleep Information from Clinical Notes of Alzheimer's Disease Patients Using Natural Language Processing | Alzheimer's Disease (AD) is the most common form of dementia in the United States. Sleep is one of the lifestyle-related factors that has been shown critical for optimal cognitive function in old age. However, there is a lack of research studying the association between sleep and AD incidence. A major bottleneck for conducting such research is that the traditional way to acquire sleep information is time-consuming, inefficient, non-scalable, and limited to patients' subjective experience. In this study, we developed a rule-based NLP algorithm and machine learning models to automate the extraction of sleep-related concepts, including snoring, napping, sleep problem, bad sleep quality, daytime sleepiness, night wakings, and sleep duration, from the clinical notes of patients diagnosed with AD. We trained and validated the proposed models on the clinical notes retrieved from the University of Pittsburgh Medical Center (UPMC). The results show that the rule-based NLP algorithm consistently achieved the best performance for all sleep concepts. | health informatics |
10.1101/2022.04.03.22273268 | Classification of Omicron BA.1, BA.1.1 and BA.2 sublineages by TaqMan assay consistent with whole genome analysis data | ObjectiveRecently, the Omicron strain of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) has spread and replaced the previously dominant Delta strain. Several Omicron sublineages (BA.1, BA.1.1 and BA.2) have been identified, with in vitro and preclinical reports showing that the pathogenicity and therapeutic efficacy differs between BA.1 and BA.2. We sought to develop a TaqMan assay to identify these subvariants.
MethodsA TaqMan assay was constructed for rapid identification and genotyping of Omicron sublineages. We analyzed three characteristic mutations of the spike gene, Δ69-70, G339D and Q493R, by TaqMan assay. The accuracy of the TaqMan assay was examined by comparing its results with the results of whole genome sequencing (WGS) analysis.
ResultsA total of 169 SARS-CoV-2 positive samples were analyzed by WGS and TaqMan assay. The 127 samples determined as BA.1/BA.1.1 by WGS were all positive for Δ69-70, G339D and Q493R by TaqMan assay. Forty-two samples determined as BA.2 by WGS were negative for Δ69-70 but positive for G339D and Q493R by TaqMan. The concordance rate between WGS and the TaqMan assay was 100% (169/169).
ConclusionTaqMan assays targeting characteristic mutations are useful for identification and discrimination of Omicron sublineages. | infectious diseases |
10.1101/2022.03.29.22273028 | Dengue-2 Cosmopolitan genotype detection and emergence in South America. | We report the first confirmed case of DENV-2 Cosmopolitan genotype isolated from a male patient from Aparecida de Goiânia, Goiás state, Midwest Brazil. Using nanopore sequencing and phylogenetic analyses, our findings provide the first preliminary insight regarding the introduction of this emerging genotype in Brazil and South America.
Article Summary LineMolecular detection of Dengue virus type 2 Cosmopolitan genotype in South America | infectious diseases |
10.1101/2022.04.04.22273429 | Viral dynamics of Omicron and Delta SARS-CoV-2 variants with implications for timing of release from isolation: a longitudinal cohort study | BackgroundIn January 2022, United States guidelines shifted to recommend isolation for 5 days from symptom onset, followed by 5 days of mask wearing. However, viral dynamics, and the impact of variant and vaccination status on culture conversion, are largely unknown.
MethodsWe conducted a longitudinal study on a university campus, collecting daily anterior nasal swabs for at least 10 days for RT-PCR and culture, with antigen rapid diagnostic testing (RDT) on a subset. We compared culture positivity beyond day 5, time to culture conversion, and cycle threshold trend when calculated from diagnostic test, from symptom onset, by SARS-CoV-2 variant, and by vaccination status. We evaluated sensitivity and specificity of RDT on days 4-6 compared to culture.
ResultsAmong 92 SARS-CoV-2 RT-PCR positive participants, all completed the initial vaccine series, 17 (18.5%) were infected with Delta and 75 (81.5%) with Omicron. Seventeen percent of participants had positive cultures beyond day 5 from symptom onset with the latest on day 12. There was no difference in time to culture conversion by variant or vaccination status. For the 14 sub-study participants, sensitivity and specificity of RDT were 100% and 86% respectively.
ConclusionsThe majority of our Delta- and Omicron-infected cohort culture-converted by day 6, with no further impact of booster vaccination on sterilization or cycle threshold decay. We found that rapid antigen testing may provide reassurance of lack of infectiousness, though masking for a full 10 days is necessary to prevent transmission from the 17% of individuals who remain culture positive after isolation.
Main PointBeyond day 5, 17% of our Delta and Omicron-infected cohort were culture positive. We saw no significant impact of booster vaccination on within-host Omicron viral dynamics. Additionally, we found that rapid antigen testing may provide reassurance of lack of infectiousness. | infectious diseases |
10.1101/2022.04.01.22271763 | Preliminary diagnostic performance of the VIDAS® TB-IGRA for the detection of Mycobacterium tuberculosis infection and disease | ImportanceAccurate diagnosis of tuberculosis (TB) infection can be achieved with interferon-γ release assays. Their performance can be improved by utilizing fully automated, single-patient formats.
ObjectiveEstablish clinical thresholds for a new interferon-γ release assay, the VIDAS® TB-IGRA, and compare diagnostic performance in detecting tuberculosis infection and disease with the established QuantiFERON-TB Gold Plus (QFT-Plus).
DesignPreliminary diagnostic performance study (October 2nd, 2019-February 20th, 2020).
SettingMulticenter.
ParticipantsParticipants were divided into TB disease, high-risk, and low-risk populations. The confirmed TB disease population included 107 patients. The high-risk population included 162 individuals with flagged risk factors on a questionnaire but without objective clinical confirmation of TB. The low-risk population included 117 healthy blood donors from the French National Blood Bank.
ExposuresTuberculosis.
Main Outcomes and MeasuresPositive and negative percent agreement (PPA, NPA) were determined between the VIDAS® TB-IGRA and QFT-Plus. In the TB disease and low-risk populations, sensitivity was also measured against bacterial culture and PCR.
ResultsThe VIDAS® TB-IGRA produced fewer indeterminate results than the QFT-Plus (1/107 vs. 23/107) in the TB disease population. One analysis included indeterminate results as false negatives (94 positives and 10 false negatives vs. 56 positives and 48 false negatives), and the VIDAS® TB-IGRA exhibited higher sensitivity than the QFT-Plus (90.4% vs. 53.8%) (p<0.0001). Another analysis excluded indeterminate results (76 positives and 4 false negatives vs. 55 positives and 25 false negatives), and the VIDAS® TB-IGRA again exhibited higher sensitivity than the QFT-Plus (95.0% vs. 68.8%) (p<0.0001). A 98.2% PPA was calculated between the two tests with this dataset.
In the high-risk population, the VIDAS® TB-IGRA exhibited a strong PPA (94.4%) with the QFT-Plus. However, a lower than expected NPA was observed (85.2%). In the low-risk population, the VIDAS® TB-IGRA demonstrated high specificity (94.9%) and a strong NPA (98.2%) with the QFT-Plus.
Conclusions and RelevanceThe fully automated VIDAS® TB-IGRA is a promising diagnostic test for both TB infection and disease. It exhibits higher sensitivity while maintaining specificity and produces fewer indeterminate interpretations. Its easy-to-use, single-patient format may lead to increased TB testing to help with the worldwide eradication of the disease.
KEY POINTSQuestionWhat is the diagnostic performance of the VIDAS® TB-IGRA in detecting tuberculosis infection and disease?
FindingsThe VIDAS® TB-IGRA exhibited high sensitivity in individuals with tuberculosis disease (90.4-95.0%), high specificity in healthy blood donors (94.9%), a high positive percent agreement (PPA) in individuals with a high risk of tuberculosis infection (94.4%), and it produced a low number of indeterminate results (1/386).
MeaningThe VIDAS® TB-IGRA is a promising diagnostic test for both tuberculosis infection and disease. | infectious diseases |
10.1101/2022.04.04.22273344 | Determinants of mortality among COVID-19 patients with diabetes mellitus in Addis Ababa, Ethiopia, 2022: An unmatched case-control study | IntroductionCOVID-19 remains one of the leading causes of death, demanding global public health attention. Diabetes mellitus is associated with severity and lethal outcomes up to death independent of other comorbidities. Nevertheless, information regarding the determinant factors that contributed to the increased mortality among diabetic COVID-19 patients is limited. Thus, this study aimed at identifying the determinants of mortality among COVID-19 infected diabetic patients.
MethodsAn unmatched case-control study was conducted on 340 randomly selected patients by reviewing patient records. Data were collected using a structured extraction checklist, entered into Epi data V-4.4.2.2, and analyzed using SPSS V-25. Then, binary logistic regression was used for bivariate and multivariable analysis. Finally, an adjusted odds ratio with 95% CI and a p-value of less than 0.05 were used to determine the strength of association and statistical significance, respectively.
ResultsThe study was conducted on 340 COVID-19 patients (114 cases and 226 controls). Patient age (AOR=4.90; 95% CI: 2.13, 11.50), severity of COVID-19 disease (AOR=4.95; 95% CI: 2.20, 11.30), obesity (AOR=7.78; 95% CI: 4.05, 14.90), hypertension (AOR=5.01; 95% CI: 2.40, 10.60), anemia at presentation (AOR=2.93; 95% CI: 1.29, 6.65), and AKI after hospital admission (AOR=2.80; 95% CI: 1.39, 5.64) had statistically significant associations with increased mortality of diabetic patients with COVID-19 infection. Conversely, presence of RVI co-infection was found to be protective against mortality (AOR=0.35; 95% CI: 0.13, 0.90).
ConclusionPatient age (<65 years), COVID-19 disease severity (mild and moderate illness), presence of hypertension, obesity, anemia at admission, and AKI on admission were independently associated with increased mortality of diabetic COVID-19 patients. Conversely, the presence of RVI co-infection was found to be protective against patient death. Consequently, COVID-19 patients with diabetes demand untiring efforts, and focused management of the identified factors will substantially improve the survival of diabetic patients infected with COVID-19.
What is already known on this topic?Diabetes mellitus is associated with severity and lethal outcomes up to death independent of other comorbidities. Previous studies indicated that diabetic patients have an almost four times increased risk of severe disease and death due to COVID-19 infection. Consequently, given this increased mortality and other public health impacts, numerous reports have emerged worldwide on the link between COVID-19 and DM, and on diabetes management during the COVID-19 pandemic. However, the determinant factors that lead to increased mortality among diabetic COVID-19 patients have not been well studied yet.
What this study adds?
- Patient age (<65 years), COVID-19 disease severity (mild and moderate illness), presence of hypertension, obesity, anemia at admission, and AKI on hospital admission were independently associated with increased mortality of COVID-19 patients with DM.
- In addition, RVI co-infection was found to be protective against patient death. | infectious diseases |
10.1101/2022.04.04.22273425 | Combination of Baricitinib plus Remdesivir and Dexamethasone improves time to recovery and mortality among hospitalized patients with severe COVID-19 infection | BackgroundThere seems to be a gap in the therapeutic options for severe COVID-19 pneumonia. Though the beneficial effect of combination treatment with baricitinib and remdesivir in accelerating clinical status improvement is described, the impact of the triple therapy with baricitinib + remdesivir/dexamethasone is not known.
MethodsA retrospective observational study comparing the effect of baricitinib plus standard treatment (remdesivir and dexamethasone) with standard therapy in patients requiring ≥5 L/min O2 was conducted. The primary outcome was to compare time to recovery in both groups, and the secondary outcome was to determine the mortality rate at discharge.
ResultsOf 457 patients hospitalized during the study period, 51 patients received standard treatment while 88 patients received baricitinib plus standard treatment. In the baricitinib group, the rate ratio of recovery was 1.28 (95%CI 0.84-1.94, p=0.24) with a reduction in median time to recovery of 3 days compared to the standard treatment group. Subgroup analysis based on Ordinal Scale showed reductions in median time to recovery of 4 and 2 days, with rate ratios of recovery of 2.95 (1.03-8.42, p=0.04) and 1.80 (1.09-2.98, p=0.02) in Ordinal Scale 5 and 6, respectively. No benefit was found in the Ordinal Scale 7 subgroup. An overall decrease in the rate (15.9% vs 31.4%, p=0.03) and likelihood (OR 0.41, 95%CI 0.18-0.94, p=0.03) of mortality was observed in the baricitinib group. Bacteremia and thrombosis were noted in the baricitinib group, but rates were comparable with the standard of care group.
ConclusionBaricitinib with standard therapy reduced time to recovery and offered a mortality benefit in patients with severe COVID-19 pneumonia. | intensive care and critical care medicine |
10.1101/2022.03.31.22272425 | Hyperinflammatory ARDS is characterized by interferon-stimulated gene expression, T-cell activation, and an altered metatranscriptome in tracheal aspirates | Two molecular phenotypes of the acute respiratory distress syndrome (ARDS) with substantially different clinical trajectories have been identified. Classification as "hyperinflammatory" or "hypoinflammatory" depends on plasma biomarker profiling. Differences in the biology underlying these phenotypes at the site of injury, the lung, are unknown. We analyze tracheal aspirate (TA) transcriptomes from 46 mechanically ventilated subjects to assess differences in lung inflammation and repair between ARDS phenotypes. We then integrate these results with metatranscriptomic sequencing, single-cell RNA sequencing, and plasma proteomics to identify distinct features of each ARDS phenotype. We also compare phenotype-specific differences in gene expression to experimental models of acute lung injury and use an in silico analysis to identify candidate treatments for each phenotype. We find that hyperinflammatory ARDS is associated with increased integrated stress response and interferon gamma signaling, distinct immune cell polarization, and differences in microbial community composition in TA. These findings demonstrate that each phenotype has distinct respiratory tract biology that may be relevant to developing effective therapies for ARDS. | intensive care and critical care medicine |
10.1101/2022.04.04.22273407 | Effectiveness of Oral Health Education Interventions on Oral Health Literacy Levels in Adults; A Systematic Review | BackgroundOral health literacy within the construct of health literacy may be instrumental in decreasing oral health disparities and promoting oral health. Even though current research links oral health literacy to oral health knowledge and education, the impact of educational intervention on oral health literacy remains controversial. We aimed to identify effective health education interventions delivered with a focus on oral health literacy.
MethodsAn electronic systematic search in PubMed, Scopus, Web of Science, the Cochrane library and gray literature was performed for relevant studies (1995-2021). Experimental study designs of randomized controlled trials, non-randomized controlled trials, and quasi-experimental studies in which adults aged 18 years or older, male or female (participants), trained under a health education intervention (intervention) were compared with those with no health education or within the usual care parameters (comparison), with an assessment of oral health literacy levels (outcome), were included according to the PICO question. The search was conducted by applying filters for title, abstract, methodological quality of the data, and English language. Study screening, extraction and critical appraisal were performed by two independent reviewers. Data were extracted from the included studies, whereas a meta-analysis was not possible since findings were mostly presented in a narrative format.
ResultsEight studies out of the 2783 potentially eligible articles met the selection criteria for this systematic review. The aim of interventions in these studies was 1) improving oral health literacy as the first outcome or 2) improving oral health behavior and oral health skills as the first outcome and assessing oral health literacy as the second outcome. The strength of evidence from the reviewed articles was high, and there was considerable heterogeneity in the study designs, OHL measurement instruments and outcome measures. Interventions were considerably effective in improving oral health literacy.
ConclusionHealth education that is tailored to patients' needs and addresses their barriers to care can improve their oral health literacy level. | dentistry and oral medicine |
10.1101/2022.03.31.22273198 | What (if anything) is going on? Examining longitudinal associations between social media use and mental ill-health among young people | ObjectivesIn mid-adolescence, to 1) examine cyclical associations between social media use and mental ill health by investigating longitudinal and bidirectional associations, dose response relationships, and changes in social media use and in mental health; 2) assess potential interaction effects between social media use and mental health with pre-existing early life vulnerabilities.
MethodsLongitudinal data on 12,114 participants from the Millennium Cohort Study on social media use, depressive symptoms, self-harm and early life risk factors were used.
ResultsWe found little support for the existence of cyclical relationships between social media use and mental health. Where detected, effect sizes were small. Dose response associations were seen in the direction of mental health to social media (depressive symptoms 1 time OR=1.22, 2 times OR=1.71; self-harm 1 time OR=1.17, 2 times OR=1.53), but not for social media to mental health. Changes in social media use and changes in mental health were not associated with each other. We found no evidence to suggest that either social media use or mental health interacted with pre-existing risk for mental ill health.
ConclusionsFindings highlight the possibility that observed longitudinal associations between social media use and mental health might reflect other risk processes or vulnerabilities and/or the precision of measures used. More detailed and frequently collected prospective data about online experiences, including those from social media platforms themselves will help to provide a clearer picture of the relationship between social media use and mental health. | epidemiology |
10.1101/2022.03.31.22273265 | Cohort profile: The Pregnancy, Arsenic, and Immune Response (PAIR) Study, a longitudinal pregnancy and birth cohort in rural northern Bangladesh | PurposeArsenic exposure and micronutrient deficiencies may alter immune reactivity to influenza vaccination in pregnant women, transplacental transfer of maternal antibodies to the fetus, and maternal and infant acute morbidity. The Pregnancy, Arsenic, and Immune Response (PAIR) Study is a longitudinal pregnancy and birth cohort designed to assess whether arsenic exposure and micronutrient deficiencies alter maternal or newborn immunity and acute morbidity following maternal seasonal influenza vaccination during pregnancy.
ParticipantsWe enrolled 784 pregnant women in rural Gaibandha District in northern Bangladesh between October 2018 and March 2019. Women received a quadrivalent seasonal inactivated influenza vaccine at enrollment in the late first or early second trimester between 11 and 17 weeks of gestational age. Follow-up included up to 13 visits between enrollment and three months postpartum as well as weekly telephone surveillance to ascertain influenza-like illness and other acute morbidity symptoms in women and infants. Tube well drinking water and urine specimens were collected to assess arsenic exposure. Of 784 women who enrolled, 736 (93.9%) delivered live births and 551 (70.3%) completed follow-up visits to three months postpartum.
Findings to DateArsenic was ≥0.02 μg/L in 97.9% of water specimens collected from participants at enrollment. The medians (interquartile ranges) of water and urinary arsenic were 5.1 (0.5-25.1) μg/L and 33.1 (19.6-56.5) μg/L, respectively. Water and urinary arsenic were strongly correlated (Spearman's ρ=0.72) among women with water arsenic ≥ median but weakly correlated (ρ=0.18) among women with water arsenic < median.
Future PlansThe PAIR Study is well positioned to examine the effects of low-moderate arsenic exposure and micronutrient deficiencies on immune outcomes in women and infants.
RegistrationNCT03930017 | epidemiology |
10.1101/2022.03.31.22273157 | Reconstructing long-term dengue virus immunity in French Polynesia | BackgroundUnderstanding the underlying risk of infection by dengue virus from surveillance systems is complicated due to the complex nature of the disease. In particular, the probability of becoming severely sick is driven by serotype-specific infection histories as well as age; however, this has rarely been quantified. Island communities that have periodic outbreaks dominated by single serotypes provide an opportunity to disentangle the competing role of serotype, age and changes in surveillance systems in characterising disease risk.
MethodologyWe develop mathematical models to analyse 35 years of dengue surveillance (1979-2014) and seroprevalence studies from French Polynesia. We estimate the annual force of infection, serotype-specific reporting probabilities and changes in surveillance capabilities using the annual age and serotype-specific distribution of dengue.
Principal FindingsEight dengue epidemics occurred between 1979 and 2014, with reporting probabilities for DENV-1 primary infections increasing from 3% to 5%. The reporting probability for DENV-1 secondary infections was 3.6 times that for primary infections. Reporting probabilities for DENV-2-DENV-4 were 0.1-2.6 and 0.7-2.3 times that for DENV-1, for primary and secondary infections, respectively. Reporting probabilities declined with age after 14 years of age. Between 1979 and 2014, the proportion never infected declined from 70% to 23% while the proportion infected at least twice increased from 4.5% to 45%. By 2014, almost half of the population had acquired heterotypic immunity. The probability of an epidemic increased sharply with the estimated fraction of susceptibles among children.
Conclusion / SignificanceBy analysing 35 years of dengue data in French Polynesia, we characterised key factors affecting the dissemination profile and reporting of dengue cases in an epidemiological context simplified by mono-serotypic circulation. Our analysis provides key estimates that can inform the study of dengue in more complex settings where the co-circulation of multiple serotypes can greatly complicate inference.
Author summaryCharacterising the true extent of dengue circulation and the level of population immunity is essential to assess the burden of disease, evaluate epidemic risk and organise prevention strategies against future epidemics. However, this is difficult in a context where most people who are infected by dengue virus (DENV) only have mild symptoms which may not be reported to surveillance systems. In this article, we develop a mathematical model to evaluate the fraction of unreported dengue infections from case data. The key idea is to introduce reporting probabilities that depend on the infecting serotype and the infection history of patients. These factors are known to contribute to variations in the severity of symptoms and hence the reporting probabilities, but have rarely been taken into account in model frameworks to study population immunity from the case data. Using the developed model, we study long-term dengue virus immunity in French Polynesia. | epidemiology |
10.1101/2022.04.04.22272484 | Estimating the allele frequency threshold of the pathogenic mtDNA variant m.3243A>G tolerated by human myofibres | Cells are thought to tolerate the presence of heteroplasmic pathogenic mitochondrial (mt)DNA variants, up to a threshold proportion with little or no functional consequence, only developing a detectable oxidative phosphorylation (OXPHOS) defect once this threshold is exceeded. We hypothesised that, given the highly variable phenotypic presentation and severity of disease caused by the most common heteroplasmic pathogenic mtDNA variant m.3243A>G, this threshold may vary between patients. To estimate this threshold we examined the relationship between m.3243A>G level and deficiency of respiratory complexes I and IV in hundreds of single skeletal muscle fibres captured by laser microdissection from 17 patients carrying the m.3243A>G variant. We confirmed that tissue homogenate m.3243A>G level is a poor surrogate for the broad and complex distributions of m.3243A>G level in single cells from individual patients. We used unsupervised machine learning to characterise the biochemical status of muscle fibres, based on immunocytochemical measurements and estimate that the m.3243A>G threshold proportion above which myofibres develop a respiratory complex defect is 82.8% (IQR:2.7%). We do not find any significant difference in threshold between individuals, however we demonstrate that the pattern of respiratory complex defect and the distribution of m.3243A>G varies considerably between patients. This observation suggests that the pathogenesis of m.3243A>G is more diverse than previously thought. | pathology |
10.1101/2022.04.04.22273356 | EARLY OUTPATIENT TREATMENT OF COVID-19: A RETROSPECTIVE ANALYSIS OF 392 CASES IN ITALY | IntroductionThe pandemic of severe acute respiratory syndrome (SARS)-coronavirus-2 (CoV-2) disease 2019 (COVID-19) was declared in March 2020. Knowledge of COVID-19 pathophysiology soon provided a strong rationale for the early use of anti-inflammatory, antiplatelet and anticoagulant drugs; however, the evidence was only slowly and partially incorporated into institutional guidelines. Unmet needs of COVID-19 outpatients were soon taken care of by networks of physicians and researchers, using pharmacotherapeutic approaches based on the best available experience.
MethodsObservational retrospective study investigating characteristics, management and outcomes of COVID-19 patients cared for in Italy by physicians volunteering within the IppocrateOrg Association, one of the main international assistance networks, between 1 November 2020 and 31 March 2021.
ResultsTen doctors took part in the study and provided data on 392 consecutive COVID-19 patients. Patients' mean age was 48.5 years (range: 0.5-97). They were 51.3% female and were taken into care when in COVID-19 stage 0 (15.6%), 1 (50.0%), 2a (28.8%) or 2b (5.6%). Many patients were overweight (26%) or obese (11.5%), with chronic comorbidities (34.9%), mainly cardiovascular (23%) and metabolic (13.3%). Drugs most frequently prescribed included: vitamins and supplements (98.7%), aspirin (66.1%), antibiotics (62%), glucocorticoids (41.8%), hydroxychloroquine (29.6%), enoxaparin (28.6%), colchicine (8.9%), oxygen therapy (6.9%), and ivermectin (2.8%). Hospitalization occurred in 5.8% of all cases, mainly in patients taken into care when in stage 2b (27.3%). Altogether, 390 patients (99.6%) recovered, one patient (0.2%) was lost to follow-up, and one patient (0.2%) died after hospitalization. One doctor reported one grade 1 adverse drug reaction (ADR) (transient or mild discomfort), and 3 doctors reported a total of 8 grade 2 ADRs (mild to moderate limitation in activity).
ConclusionsThis is the first study describing the attitudes and behaviors of physicians caring for COVID-19 outpatients, and the effectiveness and safety of COVID-19 early treatment in the real world. COVID-19 lethality in our cohort was 0.2%, while the overall COVID-19 lethality in Italy in the same period was between 3% and 3.8%. The use of the individual drugs and drug combinations described in this study therefore appears effective and safe, as indicated by the few and mild ADRs reported. The present evidence should be carefully considered by physicians caring for COVID-19 patients as well as by political decision-makers managing the current global crisis. | pharmacology and therapeutics |
10.1101/2022.03.31.22273181 | Cannabis potential effects to prevent or attenuate SARS-COV2 contagion | Medical cannabis has attracted exponentially growing interest in recent years. Therapeutic targets have broadened from specific applications in pain control, chemotherapy side effects, treatment-resistant epilepsies and multiple sclerosis, among others. Several in vitro and animal studies, along with a few controlled human studies, suggest cannabinoids have a potential therapeutic role in medical conditions involving inflammatory mechanisms. Given the tremendous worldwide impact of the COVID-19 pandemic, research efforts are converging towards the use of cannabinoids to attenuate severe or fatal forms of the disease. The present survey aims to explore possible correlations between cannabis use, either recreational or medical, and the occurrence of SARS-COV-2 contagion, along with symptom severity. 4026 surveys were collected via electronic form. Results suggest a relation between any type of cannabis use and a lower risk of SARS-COV-2 contagion (p=0.004; OR=0.689, 95% CI 0.534-0.889). Despite several methodological limitations, the present survey underscores the urgent need to expand our understanding of the potential use of cannabinoids in controlled human studies, which can better arm us in the fight against the current COVID-19 pandemic. | pharmacology and therapeutics |
10.1101/2022.04.03.22273354 | Voice patterns as markers of schizophrenia: building a cumulative generalizable approach via cross-linguistic and meta-analysis based investigation | Background and HypothesisVoice atypicalities are potential markers of clinical features of schizophrenia (e.g., negative symptoms). A recent meta-analysis identified an acoustic profile associated with schizophrenia (reduced pitch variability and increased pauses), but also highlighted shortcomings in the field: small sample sizes, little attention to the heterogeneity of the disorder, and to generalizing findings to diverse samples and languages.
Study DesignWe provide a critical cumulative approach to vocal atypicalities in schizophrenia, where we conceptually and statistically build on previous studies. We aim at identifying a cross-linguistically reliable acoustic profile of schizophrenia and assessing sources of heterogeneity (symptomatology, pharmacotherapy, clinical and social characteristics). We relied on the previous meta-analysis to build and analyze a large cross-linguistic dataset of audio recordings of 231 patients with schizophrenia and 238 matched controls (>4,000 recordings in Danish, German, Mandarin and Japanese). We used multilevel Bayesian modeling contrasting meta-analytically informed and skeptical inferences.
Study ResultsWe found only a minimal generalizable acoustic profile of schizophrenia (reduced pitch variability), while duration atypicalities replicated only in some languages. We identified reliable associations between acoustic profile and individual differences in clinical ratings of negative symptoms, medication, age and gender. However, these associations vary across languages.
ConclusionsThe findings indicate that a strong cross-linguistically reliable acoustic profile of schizophrenia is unlikely. Rather, if we are to devise effective clinical applications able to target different ranges of patients, we need first to establish larger and more diverse cross-linguistic datasets, focus on individual differences, and build self-critical cumulative approaches. | psychiatry and clinical psychology |