id (stringlengths 16-27) | title (stringlengths 18-339) | abstract (stringlengths 95-38.7k) | category (stringlengths 7-44) |
---|---|---|---|
10.1101/2022.04.08.22272471 | Automated Deep Brain Stimulation programming based on electrode location - a randomized, crossover trial using a data-driven algorithm | BackgroundDeep brain stimulation (DBS) of the subthalamic nucleus (STN) is highly effective in controlling motor symptoms in patients with Parkinson's Disease (PD). However, correct selection of stimulation parameters is pivotal to treatment success and currently follows a time-consuming and demanding trial-and-error process. We conducted a double-blind, randomized, cross-over, non-inferiority trial to assess treatment effects of stimulation parameters suggested by a recently published algorithm (StimFit) based on neuroimaging data.
MethodsThe trial was carried out at Charité - Universitätsmedizin Berlin, Germany and enrolled 35 PD patients treated with directional octopolar electrodes targeted at the STN. All patients had undergone DBS programming according to our center's standard of care (SoC) treatment before study recruitment. Based on perioperative imaging data, DBS electrodes were reconstructed and StimFit was applied to suggest optimal stimulation settings. Patients underwent motor assessments using MDS-UPDRS-III during OFF-medication and in OFF- and ON-stimulation states under both conditions, StimFit and SoC parameter settings, which were double-blinded and randomized in a 1:1 ratio. The primary endpoint of this study was the absolute mean difference between MDS-UPDRS-III scores under StimFit and SoC stimulation, with a non-inferiority margin of five points.
FindingsSTN DBS resulted in mean MDS-UPDRS-III improvements of 48% for SoC and 43% with StimFit as compared to the OFF-stimulation condition. The mean difference between MDS-UPDRS-III scores under StimFit and SoC stimulation was not significant (1.6 points), and non-inferiority was established. In six patients (17%), initial programming of StimFit settings resulted in acute side-effects and amplitudes were reduced until side-effects disappeared.
InterpretationAutomated data-driven algorithms can predict stimulation parameters which lead to motor symptom control comparable to standard of care treatment. This approach could significantly decrease the time necessary to obtain optimal treatment parameters thereby fostering the design of more complex DBS electrodes. Long-term data including effects on quality of life require further investigation. | neurology |
10.1101/2022.04.06.22273493 | From months to minutes: creating Hyperion, a novel data management system expediting data insights for oncology research and patient care | ObjectiveDescribe the design and implementation of a novel data management platform for an academic cancer center which meets the needs of multiple stakeholders.
Materials and MethodsA small, cross-functional technical team identified key challenges to creating a broad data management and access software solution: lowering the technical skill floor, reducing cost, enhancing user autonomy, optimizing data governance, and reimagining technical team structures in academia. The Hyperion data management platform was designed to meet these challenges in addition to usual considerations of data quality, security, access, stability, and scalability.
ResultsImplemented between May 2019 and December 2020 at the Wilmot Cancer Institute, Hyperion includes a sophisticated custom validation and interface engine to process data from multiple sources, storing it in a database. Graphical user interfaces and custom wizards permit users to directly interact with data across operational, clinical, research, and administrative contexts. The use of multi-threaded processing, open-source programming languages, and automated system tasks (normally requiring technical expertise) minimizes costs. An integrated ticketing system and active stakeholder committee support data governance and project management. A co-directed, cross-functional team with flattened hierarchy and integration of industry software management practices enhances problem solving and responsiveness to user needs.
DiscussionAccess to validated, organized, and current data is critical to the functioning of multiple domains in medicine. Although there are downsides to developing in-house customized software, we describe a successful implementation of custom data management software in an academic cancer center. | health informatics |
10.1101/2022.04.06.22273509 | A Voice App Design for Heart Failure Self-Management: A Pilot Study | There is growing interest in investigating the feasibility of using voice user interfaces as a platform for digital therapeutics in chronic disease management. While mostly deployed as smartphone applications, some demographics struggle when using touch screens and often cannot complete tasks independently. This research aimed to evaluate how heart failure patients interacted with a voice app version of an already existing digital therapeutic, Medly, using a mixed-methods concurrent triangulation approach. The objective was to determine the acceptability and feasibility of the voice app by better understanding who this platform is best suited for. Quantitative data included engagement levels and accuracy rates. Participants (n=20) used the voice app over a four-week period and completed questionnaires and semi-structured interviews relating to acceptability, ease of use, and workload. The average engagement level was 73%, with a 14% decline between week one and four. The difference in engagement levels between the oldest and youngest demographic was the most significant, 84% and 43% respectively. The Medly voice app had an overall accuracy rate of 97.8% and was successful in sending data to the clinic. Users were accepting of the technology (ranking it in the 80th percentile) and felt it did not require a lot of work (2.1 on a 7-point Likert scale). However, 13% of users were less inclined to use the voice app at the end of the study. The following themes and subthemes emerged: (1) feasibility of clinical integration: user adaptation to the voice app's conversational style, device unreliability, and (2) voice app acceptability: good device integration within the household, users blamed themselves for voice app problems, and voice app missing desirable user features. The voice app proved to be most beneficial to those who: are older, have flexible schedules, are confident with using technology, and are experiencing other medical conditions. | health informatics |
10.1101/2022.04.09.22273656 | THEMIS: A Framework for Cost-Benefit Analysis of COVID-19 Non-Pharmaceutical Interventions | Since December 2019, the world has been ravaged by the COVID-19 pandemic, with over 150 million confirmed cases and 3 million confirmed deaths worldwide. To combat the spread of COVID-19, governments have issued unprecedented non-pharmaceutical interventions (NPIs), ranging from mass gathering restrictions to complete lockdowns. Despite their proven effectiveness in reducing virus transmission, the policies often carry significant economic and humanitarian cost, ranging from unemployment to depression, PTSD, and anxiety. In this paper, we create a data-driven system dynamics framework, THEMIS, that allows us to compare the costs and benefits of a large class of NPIs in any geographical region across different cost dimensions. As a demonstration, we analyzed thousands of alternative policies across 5 countries (United States, Germany, Brazil, Singapore, Spain) and compared with the actual implemented policy.
Our results show that moderate NPIs (such as restrictions on mass gatherings) usually produce the worst results, incurring significant cost while being unable to sufficiently slow down the pandemic to prevent the virus from becoming endemic. Short but severe restrictions (complete lockdown for 4-5 weeks) generally produced the best results for developed countries, but only if the speed of reopening is slow enough to prevent a resurgence. Developing countries exhibited very different trade-off profiles from developed countries, suggesting that severe NPIs such as lockdowns might not be as suitable for developing countries in general. | health policy |
10.1101/2022.04.05.22273463 | An observational analysis of patient recruitment in clinical trials in France using real-world database PMSI. French patient recruitment in clinical trials using PMSI | BackgroundFrance has significant clinical research and development potential; however, it struggles in comparison to neighbouring countries. A significant reason is the difficulty of recruiting patients, which causes delays in the availability of new therapies to market. IQVIA uses Health Insurance Claims Data, among other data assets, to better locate patients for trials based on the potential of hospitals.
ObjectiveThe aim of the study was to monitor whether an increased number of patients enrolled in clinical trials in France was observed when PMSI data supported patient recruitment, as well as to describe the clinical trial landscape worldwide and in Europe.
MethodsWe used data from ClinicalTrials.gov and Citeline to describe the clinical trial landscape in Europe between 2010 and 2019. We also looked at the IQVIA internal clinical trial tracker, Clinical Trial Management System (CTMS) to describe IQVIA-run trials and their performance after matching trials supported with PMSI data in France. We compared the average number of enrolled patients per site in PMSI and non-PMSI supported trials according to the study phase, using a Student t-test.
ResultsResults suggest that PMSI support shows a positive trend in the average number of enrolled patients per site when comparing at a similar trial phase level, especially for phase 4 studies (11.0 with PMSI vs 9.3 without PMSI, p=0.67), and for phases 3b, 3 and 1, when compared to non-PMSI supported studies.
ConclusionsThe findings of this study suggest that PMSI use has the potential to increase patient recruitment into clinical trials run in France, rendering France more attractive by better exploiting its clinical research potential. Optimising patient recruitment has a direct impact on the availability and timeliness of innovative therapies to market for French patients. | health systems and quality improvement |
10.1101/2022.04.09.22273655 | Protocol for mixed-method study by LOng COvid Multidisciplinary consortium: Optimising Treatments and servIces acrOss the NHS (LOCOMOTION) | IntroductionLong COVID, a new condition whose origins and natural history are not yet fully established, currently affects 1.5 million people in the UK. Most do not have access to specialist long COVID services. We seek to optimise long COVID care both within and outside specialist clinics, including improving access, reducing inequalities, helping patients manage their symptoms effectively at home, and providing guidance and decision support for primary care. We aim to establish a gold standard of care by systematically analysing symptom clusters and current practices, iteratively improving pathways and systems of care, and working to disseminate better practices.
Methods and analysisThis mixed-method, multi-site study is informed by the principles of applied health services research, quality improvement, co-design, and learning health systems. It was developed in close partnership with patients (whose stated priorities are prompt clinical assessment; evidence-based advice and treatment; and help with returning to work and other roles) and with front-line clinicians. Workstreams and tasks to optimise assessment, treatment and monitoring are based in three contrasting settings: [1] specialist management in 10 long COVID clinics across the UK, via a quality improvement collaborative, experience-based co-design and targeted efforts to reduce inequalities of access; [2] patient self-management at home, with technology-supported monitoring; and [3] generalist management in primary care, harnessing electronic record data to study population phenotypes and develop evidence-based decision support, referral pathways and prioritisation criteria across the primary-secondary care interface, along with analysis of costs. Study governance includes an active patient advisory group.
Ethics and disseminationLOCOMOTION is sponsored by the University of Leeds and approved by Yorkshire & The Humber - Bradford Leeds Research Ethics Committee (ref: 21/YH/0276). Dissemination plans include academic and lay publications, and partnerships with national and regional policymakers to influence service specifications and targeted funding streams.
Study registrationClinicalTrials.gov: NCT05057260; ISRCTN15022307. | health systems and quality improvement |
10.1101/2022.04.06.22273542 | Effectiveness of convalescent plasma therapy in COVID-19 patients with haematological malignancies | BackgroundImmunocompromised patients, including those with haematological malignancies, are among the high-risk groups for developing severe coronavirus disease 2019 (COVID-19) complications. The effectiveness of passive immunotherapy with convalescent plasma (CP) in such patients diagnosed with COVID-19 has not been reviewed. Therefore, the aim of this review was to systematically appraise the current evidence for the efficacy of this therapy in patients with haematological malignancies and COVID-19 infection.
MethodsA comprehensive search was conducted up to October 2021, using four databases: PubMed, Web of Science, Science Direct, and Scopus. Two reviewers independently assessed the quality of the included studies. Data collection and analysis were performed using Microsoft Excel 365 and GraphPad Prism software.
Results17 studies met the inclusion criteria; these records included 258 COVID-19 patients with haematological malignancies treated with CP therapy (CPT). The main findings from the reviewed data suggest that CPT may be associated with improved clinical outcomes, including (a) higher survival rate, (b) improved SARS-CoV-2 clearance and presence of detectable anti-SARS-CoV-2 antibodies post CP transfusion, and (c) improved hospital discharge time and recovery after 1 month of CP therapy. Furthermore, treatment with convalescent plasma was not associated with the development of adverse events.
ConclusionOwing to its safety and beneficial effects in improving clinical outcomes, CPT appears to be an effective supportive therapeutic option for haematological malignancy patients infected with COVID-19. | hematology |
10.1101/2022.04.07.22273584 | Syndemic violence victimization, alcohol and drug use, and HIV transmission risk behavior among transgender women in India: A cross-sectional, population-based study | IntroductionTransgender women are disproportionately burdened by HIV. Co-occurring epidemics of adverse psychosocial exposures accelerate HIV sexual risk, including among transgender women; however, studies using additive models fail to examine synergies among psychosocial conditions that define a syndemic. We examined the impact of synergistic interactions among 4 psychosocial exposures on condomless anal sex (CAS) among a national probability sample of transgender women in India.
MethodsA probability-based sample of 4,607 HIV-negative transgender women completed the Indian Integrated Bio-behavioral Surveillance survey, 2014-2015. We used linear probability regression and logistic regression to assess 2-, 3-, and 4-way interactions among 4 exposures (physical and sexual violence, drug and alcohol use) on CAS.
ResultsOverall, 27.3% reported physical and 22.3% sexual violence victimization (39.2% either physical or sexual violence), one-third (33.9%) reported frequent alcohol use and 11.5% illicit drug use. Physical violence was associated with twofold higher odds of CAS in the main effects model. Significant two- and three-way interactions were identified on both multiplicative and additive scales between physical violence and drug use; physical and sexual violence; physical violence, sexual violence, and alcohol use; and physical violence, alcohol and drug use.
ConclusionsPhysical and sexual violence victimization, and alcohol and drug use are highly prevalent and synergistically interact to increase CAS among transgender women in India. Targeted and integrated initiatives to improve assessment of psychosocial comorbidities, to combat transphobic violence, and to provide tailored, trauma-informed alcohol and substance use treatment services may reduce HIV risk among transgender women. | hiv aids |
10.1101/2022.04.07.22273581 | Time series cross-correlation between home range and number of infected people during the medium term of COVID-19 Pandemic in a suburban city | Control of human mobility is among the most effective measures to prevent the spread of coronavirus disease 2019 (COVID-19). This study aims to clarify the correlation between home range and the number of people infected with SARS-CoV-2 during the medium term of the COVID-19 pandemic in Ibaraki City. Home ranges are analyzed by the Minimum Convex Polygon method using mobile phone GPS location history data. We analyzed the time series cross-correlation between home range lengths and the number of infected people. Results reveal a slight positive correlation between home range and the number of infected people after one week during the medium term of the COVID-19 pandemic. Regarding home range length, the cross-correlation coefficient is largest at a lag of six weeks (0.4030). Thus, a decrease in home range is only one of the indirect factors contributing toward a reduction in the number of infected people. This study makes a significant contribution to the literature by evaluating key public health challenges from the perspective of controlling the spread of COVID-19 infection. Its findings have implications for policy makers, practitioners, and urban scientists seeking to promote urban sustainability. | infectious diseases |
10.1101/2022.04.09.22273638 | Population-level hypertension control rate in India: A systematic review and meta-analysis of community based non-interventional studies, 2001-2020 | BackgroundHypertension is a significant contributor to mortality in India. Adequate control of hypertension is important to prevent cardiovascular morbidity and mortality.
MethodsWe conducted a systematic review and meta-analysis of community-based, non-interventional studies published between 2001 and 2020. We screened records from PubMed, Embase, and Web of Science databases, extracted data, and assessed risk of bias. We conducted random-effects meta-analysis to provide overall summary estimates and subgroup estimates, and mixed-effects meta-regression with sex, region, and study period as covariates. The risk of bias was assessed using modified Newcastle-Ottawa scales. This study is registered with PROSPERO, CRD42021267973.
ResultsThe systematic review included 37 studies (n=170,631 hypertensive patients). Twelve studies (32%) reported poorer control rates among males than females, four studies (11%) reported poorer control rates among rural patients, while very few studies reported differences across socioeconomic variables. The overall control rate was 33.2% (n=84,485, 95% CI=27.9,38.6) with substantial heterogeneity (I2=99.1%, χ2=3003.91, 95% CI=98.9,99.2; p<0.001). Unadjusted sub-group analysis showed significantly different hypertension control rates across regions (n=12,938, p=0.003) but not across study periods (n=84,485, p=0.22), or sex (n=81,197, p=0.22). Meta-regression showed that control rates increased by 14.7% during 2011-2020 compared to 2001-2010 (95% CI=5.8,23.5, p=0.0021), and were 26.3% higher in the south (95% CI=12.6,39.9, p=0.0005) and 15.9% higher in the west (95% CI=3.4,31.4, p=0.0456) compared to the east. The control rates did not differ by sex.
ConclusionHypertension is adequately controlled in only one-third of patients in India. The control rate has improved during 2011-2020 compared to 2001-2010, but substantial differences exist across regions. Very few studies examined socioeconomic factors relevant to hypertension control. India needs more studies at the community level to understand the health system and socioeconomic factors that determine uncontrolled hypertension. | epidemiology |
10.1101/2022.04.07.22273555 | Longitudinal associations between physical activity and other health behaviours during the COVID-19 pandemic: A fixed effects analysis | BackgroundGovernment enforced restrictions on movement during the COVID-19 pandemic are likely to have had profound impacts on the daily behaviours of many individuals, including physical activity (PA). Given the pre-pandemic evidence for associations between PA and other health behaviours, changes in PA during the pandemic may have been detrimental for other health behaviours. This study aimed to evaluate whether changes in PA during and after the first national lockdown in the United Kingdom (UK) were associated with concurrent changes in other health behaviours, namely alcohol consumption, sleep, nutrition quality, diet quantity and sedentary time.
MethodsData were derived from the UCL COVID-19 Social Study. The analytical sample consisted of 52,784 adults followed weekly across 22 weeks of the pandemic from 23rd March to 23rd August 2020. Data were analysed using fixed effects regression.
ResultsThere was significant within-individual variation in both PA and other health behaviours throughout the study period. Increased PA was positively associated with improved sleep and nutrition quality. However, increases in PA also showed modest associations with increased alcohol consumption and sedentary time.
ConclusionOur findings indicate that, whilst the first wave of COVID-19 restrictions was in place, increases in PA were associated with improved sleep and better diet. Encouraging people to engage in PA may therefore lead to positive change in other health behaviours in times of adversity. However, increases in PA were also associated with more engagement in the negative health behaviours of alcohol consumption and sedentary time. These associations could be a result of increases in available leisure time for many people during COVID-19 restrictions and require further investigation to inform future public health guidance. | epidemiology |
10.1101/2022.04.08.22273585 | Genetic and environmental factors underlying parallel changes in body mass index and alcohol consumption: a 36-year longitudinal study of adult twins | BackgroundThe genetic and environmental underpinnings of simultaneous changes in weight and alcohol consumption are poorly known.
ObjectiveWe sought to quantify the environmental and genetic components underlying parallel changes in weight and alcohol consumption, and to investigate potential covariations between them.
MethodsThe analysis comprised 4461 adult participants (58% women) from the Finnish Twin Cohort. Four measures of alcohol consumption and body mass index (BMI) were available over a 36-year follow-up. Trajectories of each trait were described by growth factors, defined as intercepts (i.e., baseline) and slopes (i.e., change over follow-up), using Latent Growth Curve Modeling. Growth values were used for male (190 MZ pairs, 293 DZ pairs) and female (316 MZ pairs, 487 DZ pairs) same-sex complete twin pairs in multivariate twin modeling. The variance and covariance of growth factors were then decomposed into genetic and environmental components.
ResultsThe baseline heritabilities were similar in men (BMI: h2=79%; alcohol consumption: h2=49%) and women (h2=77%; h2=45%). Heritabilities of BMI change were similar in men (h2=52%) and women (h2=57%), but higher in men for change in alcohol consumption (h2=45%) than in women (h2=31%). Significant genetic correlations between BMI at baseline and change in alcohol consumption were observed in both men (r =-0.17(95% Confidence Interval: -0.29,-0.04)) and women (r=-0.18(-0.31,-0.06)). The genetic components of baseline and longitudinal change were correlated for both BMI and alcohol consumption with sex differences. Non-shared environmental factors affecting changes in alcohol consumption and BMI were correlated in men (r=0.18(0.06,0.30)). Among women, non-shared environmental factors affecting baseline alcohol consumption and the change in BMI were correlated (r=-0.11(-0.20,-0.01)).
ConclusionsWe provide evidence of genetic correlations between BMI and change in alcohol consumption. Independent of genetic effects, change in BMI and change in alcohol consumption covary. | epidemiology |
10.1101/2022.04.06.22273496 | Characterisation of ethnic differences in DNA methylation between UK resident South Asians and Europeans | Ethnic differences in non-communicable disease risk have been described between individuals of South Asian and European ethnicity that are only partially explained by genetics and other known risk factors. DNA methylation is one underexplored mechanism that may explain differences in disease risk. Currently there is little knowledge of how DNA methylation varies between South Asian and European ethnicities.
This study characterised differences in blood DNA methylation between individuals of self-reported European and South Asian ethnicity from two UK-based cohorts: Southall and Brent Revisited (SABRE) and Born in Bradford (BiB). DNA methylation differences between ethnicities were widespread throughout the genome (n=16,433 CpG sites, 3.4% of sites tested). Specifically, 76% of associations were attributable to ethnic differences in cell composition, with fewer effects attributable to smoking and genetic variation. Ethnicity-associated CpG sites were enriched for EWAS Catalog phenotypes including metabolites. This work highlights the need to consider ethnic diversity in epigenetic research. | epidemiology |
10.1101/2022.04.07.22273569 | Comorbidity clusters associated with newly treated Type 2 diabetes mellitus: a Bayesian nonparametric analysis | BackgroundType 2 Diabetes Mellitus (T2DM) is associated with the development of chronic comorbidities over time, which can lead to high drug utilization and adverse events. Understanding the patterns of disease progression is needed.
ObjectivesTo identify common comorbidity clusters and explore the progression over time in newly treated T2DM patients.
MethodsThe IQVIA Medical Research Data incorporating data from THIN, a Cegedim database of anonymized electronic health records, was used to identify all patients with a first-ever prescription for a non-insulin antidiabetic drug (NIAD) between January 2006 and December 2019. We selected 58 chronic comorbidities of interest and used Bayesian nonparametric latent models (BNLM) to identify disease clusters and model their progression over time.
ResultsAmong the 175,383 eligible T2DM patients, we identified the 20 most frequent comorbidity clusters, which were comprised of 14 latent features (LFs). Each LF was associated with a main disease (e.g., 98% of patients in cluster 2, characterized by LF2, had congestive heart failure [CHF]). The presence of certain LFs increased the probability of having another LF active. For example, LF2 (CHF) frequently appeared with LFs related to chronic kidney disease (CKD). Over time, the clusters associated with cardiovascular diseases, such as CHF, progressed rapidly. Moreover, the onset of certain diseases led to the appearance of further complications (e.g., CHF onset was associated with an increasing prevalence of CKD).
ConclusionsOur models identified established T2DM complications and previously unknown connections, thus highlighting the potential of BNLMs to characterize complex comorbidity patterns. | epidemiology |
10.1101/2022.04.08.22273628 | Implications of red state/blue state differences in COVID-19 death rates | The study objective was to explore state death rates pre- and post-4/19/2021 (date vaccines were assumed available) and the relative contributions of 3 factors to state death rates post-4/19/2021: 1) vaccination rates, 2) prevalence of obesity, hypertension, diabetes, COPD, cardiovascular disease, and asthma and 3) red vs. blue states, to better understand options for reducing deaths. The ratio of red to blue state deaths/million was 1.6 pre-4/19/2021 and 2.3 between 4/19 and 2/28/2022, resulting in >222,000 extra deaths in red states, or 305/day. Adjusted betas from linear regression showed state vaccination rates had the strongest effect on death rates, while red vs. blue states explained more of the difference in state death rates (60% vs. 46% for vaccination rates), with mean vaccination rates ~10% higher in blue states. Results suggest that increasing vaccination rates in red states could potentially save thousands of lives as the pandemic continues. | epidemiology |
10.1101/2022.04.06.22273536 | Age-based variability in the association between restraint use and injury type and severity | PurposePrevious studies have shown that elderly individuals receive relatively less protection from seat belts against fatal injuries; however, it is less clear how seat belt protection against severe and torso injury changes with age. We estimated age-based variability in seat belt protection against fatal injuries, injuries with maximum abbreviated injury scale greater than 2 (MAIS3+), and torso injuries.
MethodsWe leveraged the Crash Outcome Data Evaluation System (CODES) to analyze binary indicators of fatal, MAIS3+, and torso injuries. Using a matched cohort design and conditional Poisson regression, we estimated age-based relative risks (RR) of the outcomes associated with seat belt use.
ResultsSeat belts were highly protective against fatal injuries for all ages. For ages 16-30, seat belt use was associated with 66% lower risk of MAIS3+ injury (RR 0.34, 95% CI 0.30, 0.38), whereas for ages 75 and older, seat belt use was associated with 38% lower risk of MAIS3+ injury (RR 0.62; 95% CI 0.45, 0.86). The association between restraint use and torso injury also appeared to attenuate with age.
ConclusionsSeat belt protection against MAIS3+ and torso injury attenuated with age. We encourage injury prevention efforts to continue to be tailored to vulnerable populations such as the elderly. | epidemiology |
10.1101/2022.04.06.22273535 | Effectiveness of 2 and 3 mRNA COVID-19 Vaccine Doses against Omicron and Delta-Related Outpatient Illness among Adults, October 2021 - February 2022 | BackgroundWe estimated SARS-CoV-2 Delta- and Omicron-specific effectiveness of 2 and 3 mRNA COVID-19 vaccine doses in adults against symptomatic illness in US outpatient settings.
MethodsBetween October 1, 2021, and February 12, 2022, research staff consented and enrolled eligible participants who had fever, cough, or loss of taste or smell and sought outpatient medical care or clinical SARS-CoV-2 testing within 10 days of illness onset. Using the test-negative design, we compared the odds of receiving 2 or 3 mRNA COVID-19 vaccine doses among SARS-CoV-2 cases versus controls using logistic regression. Regression models were adjusted for study site, age, onset week, and prior SARS-CoV-2 infection. Vaccine effectiveness (VE) was calculated as (1 - adjusted odds ratio) x 100%.
ResultsAmong 3847 participants included for analysis, 574 (32%) of 1775 tested positive for SARS-CoV-2 during the Delta predominant period and 1006 (56%) of 1794 participants tested positive during the Omicron predominant period. When Delta predominated, VE against symptomatic illness in outpatient settings was 63% (95% CI: 51% to 72%) among mRNA 2-dose recipients and 96% (95% CI: 93% to 98%) for 3-dose recipients. When Omicron predominated, VE was 21% (95% CI: -6% to 41%) among 2-dose recipients and 62% (95% CI: 48% to 72%) among 3-dose recipients.
ConclusionsIn this adult population, 3 mRNA COVID-19 vaccine doses provided substantial protection against symptomatic illness in outpatient settings when the Omicron variant became the predominant cause of COVID-19 in the U.S. These findings support the recommendation for a 3rd mRNA COVID-19 vaccine dose. | epidemiology |
10.1101/2022.04.07.22273329 | Functional response to a microbial synbiotic in the gastrointestinal system of constipated children | BackgroundOral microbial therapy has been studied as an intervention for a range of gastrointestinal and immunological disorders. Though emerging research suggests microbial exposure may intimately affect the gastrointestinal system, motility, and host immunity in a pediatric population, data has been inconsistent and variable, with the majority of prior studies conducted in neither a randomized nor placebo-controlled setting. The aim of this placebo-controlled study was to evaluate efficacy of a synbiotic (a prebiotic and rationally-defined microbial consortia) on increasing weekly bowel movement frequency in constipated children.
MethodsSixty-four children (3-17 years of age) were randomized to receive a synbiotic composition (n=33) comprised of mixed-chain length, prebiotic oligosaccharides and nine microbial strains or placebo (n=31) for 84 days. Stool microbiota was analyzed using shotgun metagenomic sequencing on samples collected at baseline (T1) and completion (T2). The primary outcome was change from baseline of Weekly Bowel Movements (WBMs) in children compared to placebo.
ResultsTreatment with a multi-strain synbiotic significantly (p < 0.05) increased the number of WBMs in children with low bowel movement frequency (< 4 WBMs and < 5 WBMs), irrespective of broadly distinctive microbiome signatures at baseline. Metagenomic shotgun sequencing revealed that low baseline microbial richness in the treatment group significantly anticipated improvements in constipation (p = 0.00074).
ConclusionsThese findings suggest the potential for (i) multi-species synbiotic interventions to improve digestive health in a pediatric population and (ii) bioinformatics-based methods to predict response to microbial interventions in children.
ImpactSynbiotic microbial treatment exerted functional improvements in the number of spontaneous Weekly Bowel Movements in children compared to placebo
Intervention induced a significant bifidogenic effect in children compared to placebo
All administered probiotic species were enriched in the gut microbiome of the intervention group compared to placebo
Baseline microbial richness demonstrated potential as a predictive biomarker for response to intervention | gastroenterology |
10.1101/2022.04.06.22273448 | Self-Reported Mask Use in SARS-CoV-2 Vaccinated and Unvaccinated Populations | Wearing a facemask can help to decrease the transmission of COVID-19. We investigated self-reported mask use among subjects aged 18 years and older participating in the COVID-19 Community Research Partnership (CRP), a prospective longitudinal COVID-19 surveillance study. We included those participants who completed ≥5 daily surveys each month from December 1, 2020 through August 31, 2021. Mask use was defined as self-reported use of a face mask or face covering on every interaction with others outside the household within a distance of less than 6 feet. Participants were considered vaccinated if they reported receiving ≥1 COVID-19 vaccine dose. Participants (n=17,522) were 91% non-Hispanic White, 68% female, median age 57 years, 26% healthcare workers, with 95% self-reporting receipt of ≥1 COVID-19 vaccine dose through August; mean daily survey response was 85%. Mask use was higher among vaccinated than unvaccinated participants across the study period, regardless of the month of the first dose. Mask use remained relatively stable from December 2020 through April (range 71-80% unvaccinated; 86-93% vaccinated) and declined in both groups beginning in mid-May 2021 to 34% and 42% respectively in June 2021; mask use has increased again since July 2021. Mask use by all was lower during weekends and on Christmas and Easter, regardless of vaccination status. Independent predictors of higher mask use were vaccination, age ≥65 years, female sex, racial or ethnic minority group, and healthcare worker occupation, whereas a history of self-reported prior COVID-19 illness was associated with lower use.
Trial RegistrationThe COVID-19 Community Research Partnership is listed in clinicaltrials.gov (NCT04342884). | infectious diseases |
10.1101/2022.04.08.22272726 | SARS-CoV-2 reinfections with BA.1 (Omicron) variant among fully vaccinated individuals in the northeast of Brazil | BackgroundThe first case of Severe Acute Respiratory Syndrome Coronavirus 2 (SARS-CoV-2) in Rio Grande do Norte, northeast Brazil, was diagnosed on March 12, 2020; thereafter, the pattern of COVID-19 followed multiple waves as seen elsewhere. Those waves were mostly due to SARS-CoV-2 mutations leading to the emergence of variants of concern (VoCs). The introduction of new VoCs in a population context of prior SARS-CoV-2 infections or after vaccination has been a challenge in understanding the kinetics of the protective immune response against SARS-CoV-2. The aim of this study was to investigate the outbreak of SARS-CoV-2 reinfections observed in mid-January 2022 in Rio Grande do Norte state, Brazil, when the Omicron variant was introduced.
Methodology/Principal findingsFrom a total of 172,965 individuals with mild to severe respiratory symptoms, 58,097 tested positive for SARS-CoV-2 between March 2020 and mid-February 2022. Of those previously infected, 444 had a documented second SARS-CoV-2 infection, and 9 of these reinfection cases were selected for sequencing. Genomic analysis revealed that virus lineages diverged between the primary infection and the reinfection, with the latter caused by the Omicron (BA.1) variant among individuals fully vaccinated against SARS-CoV-2.
Conclusions/SignificanceSince all subjects whose samples were sequenced had prior SARS-CoV-2 infection and were also fully vaccinated, our data suggest that the Omicron variant evades natural and vaccine-induced immunity, confirming the continuous need to decrease transmission and to develop effective blocking vaccines.
Author summaryThe pattern of the COVID-19 pandemic has been characterized by multiple waves of cases with a variety of outcomes, from asymptomatic to moderate or severe fatal cases. By December 2021, about 75.3% of the population of Rio Grande do Norte, northeast Brazil, had already been fully vaccinated against SARS-CoV-2, and detection of new cases had decreased, with about 8% of suspected cases testing positive. Nevertheless, with the introduction of the Omicron variant at the end of 2021, the number of new SARS-CoV-2 infections reached its highest peak since the start of the pandemic, with 75% of suspected cases testing positive. From March 2020 to February 2022, we confirmed 444 reinfection cases among those tested, of which 62.3% (n=277) occurred during the Omicron outbreak, from December 2021 to early February 2022. Of the reinfection cases, 9 were sequenced, and genetic analysis showed that they belong to the BA.1 lineage, which seems to have been introduced multiple times into the region. The primary isolates varied. Thus, our data suggest that the Omicron variant evades immunity provided by either natural infection with other SARS-CoV-2 variants or by different types of vaccines. | infectious diseases |
10.1101/2022.04.08.22273611 | A systematic review of Hepatitis B virus (HBV) prevalence and genotypes in Kenya: Data to inform clinical care and health policy | More than 20% of the global disease burden from chronic hepatitis B infection (CHB) is in Africa; however, there is minimal high-quality seroprevalence data from individual countries and little viral sequencing data available to represent the continent. We undertook a systematic review of the prevalence and genetic data available for hepatitis B virus (HBV) in Kenya using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) 2020 checklist. We identified 23 studies reporting HBV prevalence and 25 studies that included HBV genetic data published in English between January 2000 and December 2021. We assessed study quality using the Joanna Briggs Institute critical appraisal checklist. Due to study heterogeneity, we divided the studies to represent low, moderate, high and very high risk for HBV infection. We calculated pooled HBV prevalence within each group and evaluated available sequencing data. We also assessed whether reported HBV biomarkers could be applied to determine treatment eligibility. Eight studies were identified in the low-risk group, seven in the moderate-risk group, five in the high-risk group and three in the very high-risk group for HBV infection. Pooled HBV prevalence was 3.31% (95% CI 2.62-4.01%), 5.58% (95% CI 3.46-7.7%), 6.17% (95% CI 4.4-9.94) and 31.39% (95% CI 9.5-53.09), respectively. Study quality was overall low, with studies typically representing a small geographical location or a limited population subset. Only three studies detailed sample size calculation and 17/23 studies were cross-sectional. Eight studies included genetic information on HBV, representing 247 individuals. Six studies sequenced one or two genes; two undertook whole genome sequencing, representing 22 participants. 92% of people were infected with genotype A. Other genotypes included genotype D (6%), D/E recombinants (1%) or mixed populations (1%). Drug resistance mutations were reported by two studies. Seven studies presented additional biomarkers alongside HBsAg; however, none provided sufficient information to deduce treatment eligibility. | infectious diseases |
10.1101/2022.04.08.22273621 | Second round of the interlaboratory comparison (ILC) exercise of SARS-CoV-2 molecular detection assays being used by 45 veterinary diagnostic laboratories in the US | The coronavirus disease 2019 (COVID-19) pandemic presents a continued public health challenge across the world. Veterinary diagnostic laboratories in the U.S. use real-time reverse transcriptase PCR (RT-PCR) for animal testing, and many are certified for testing human samples, so ensuring laboratories have sensitive and specific SARS-CoV-2 testing methods is a critical component of the pandemic response. In 2020, the FDA Veterinary Laboratory Investigation and Response Network (Vet-LIRN) led the first round of an Inter-Laboratory Comparison (ILC) Exercise to help laboratories evaluate their existing real-time RT-PCR methods for detecting SARS-CoV-2. The ILC1 results indicated that all participating laboratories were able to detect the viral RNA spiked in buffer and PrimeStore molecular transport medium (MTM). The current ILC (ILC2) aimed to extend ILC1 by evaluating analytical sensitivity and specificity of the methods used by participating laboratories to detect three SARS-CoV-2 variants (B.1, B.1.1.7 (Alpha) and B.1.351 (Beta)). ILC2 samples were prepared with RNA at levels between 10 and 10,000 copies per 50 µL of MTM. Fifty-seven sets of results from 45 laboratories were qualitatively and quantitatively analyzed according to the principles of ISO 16140-2:2016. The results showed that over 95% of analysts detected the SARS-CoV-2 RNA in MTM at 500 copies or higher for all three variants. In addition, 81% and 92% of the analysts achieved a Level of Detection (LOD95eff. vol.) below 20 copies in the assays with nucleocapsid markers N1 and N2, respectively. The analytical specificity of the evaluated methods was over 99%. The study allowed participating laboratories to assess their current method performance, identify possible limitations, and recognize method strengths as part of a continuous learning environment to support the critical need for reliable diagnosis of COVID-19 in potentially infected animals and humans. | infectious diseases |
10.1101/2022.04.08.22273634 | A prospective observational study on BBV152 coronavirus vaccine use in adolescents and comparison with adults - first real-world safety analysis | BackgroundThe BBV152 COVID-19 vaccine (COVAXIN) has recently been approved for adolescents. We provide the first real-world safety data on COVAXIN use in adolescents and compare it with adults.
MethodsA prospective observational study is being conducted since January 2022. Enrolled adolescents and adults were contacted telephonically after 14 days of receiving the BBV152 vaccine. Primary outcome was vaccine safety assessed as rates of adverse events following immunization (AEFI). Severity grading of AEFIs was done using the FDA scale.
FindingsA total of 698 adolescents and 326 adults were enrolled. AEFIs occurred in 36.3% of adolescents after the first dose and in 37.9% after the second dose. Systemic involvement was seen in 15-17% of adolescents. Injection site pain and fever were the most common AEFIs. The majority of AEFIs were mild to moderate. Severe and atypical AEFIs were observed in 0.9% and 0.6% of adolescents, respectively. The majority of AEFIs resolved in 1-2 days. In >2% of adolescents, AEFIs were still present at the 14-day follow-up after the second dose. No difference in AEFI incidence and patterns was observed between adolescents and adults. Regression analysis showed that, among adolescents, females and those with a history of allergy were at 1.5-times and 3-times increased risk of AEFIs, respectively.
InterpretationCOVAXIN carries an overall favorable short-term safety profile in adolescents. The observed AEFI rates in adolescents are much lower than those reported with mRNA vaccines. Female adolescents and those with a history of allergy need watchfulness. With some AEFIs persisting at 14 days, a longer follow-up is recommended to strengthen the safety data for these vaccines.
FundingNo funding support | infectious diseases |
10.1101/2022.04.07.22273565 | Omicron BA.1 and BA.2 immune response in naive and prior infected persons | The highly transmissible SARS-CoV-2 Omicron (B.1.1.529) variant has replaced previous variants and is less susceptible to neutralizing antibodies elicited by vaccination or infection [1-3]. Currently, BA.2 is the dominant Omicron sublineage [4]. Vaccinated individuals with BA.1 infection develop measurable neutralizing antibody titers against BA.1 and BA.2 [5]. The ability of BA.2 infection to induce neutralizing antibodies in unvaccinated individuals, either without or with previous SARS-CoV-2 infection, is pending definition. | infectious diseases |
10.1101/2022.04.04.22273426 | Specific Associations Between Type of Childhood Abuse and Elevated C-Reactive Protein in Young Adult Psychiatric Rehabilitation Participants | BackgroundEarly life adversity such as childhood emotional, physical, and sexual trauma is associated with a plethora of later-life psychiatric and chronic medical conditions, including elevated inflammatory markers. Although previous research suggests a role for chronic inflammatory dysfunctions in several disease etiologies, specific associations between childhood trauma types and later life inflammation and health status are not well understood.
MethodsWe studied patients (n=280) who were admitted to a psychiatric rehabilitation center. Self-reported histories of childhood emotional, physical, and sexual trauma were collected. At the time of admission, we also assessed individuals' body mass index (BMI) and collected blood samples that were used to examine levels of the inflammatory marker C-reactive protein (CRP).
ResultsThe prevalence of all three types of abuse was quite high, at 21% or more. 50% of the sample had elevations in CRP, with clinically significant elevations in 26%. We found that compared to a history of emotional or physical abuse, a history of childhood sexual trauma was more specifically associated with elevated CRP. This result held up when controlling for BMI.
LimitationOur sample is relatively young, with an average age of 27.2 years, with minimal representation of ethnic and racial minority participants.
ConclusionRelative to childhood emotional and physical trauma, childhood sexual trauma may lead to elevated inflammatory responses, which were common overall in the sample. Future studies need to assess the causal link between childhood sexual trauma and poorer health outcomes later in life.
HIGHLIGHTS
- The prevalence of both childhood abuse experiences and elevations in inflammatory markers was quite high.
- We found that the history and severity of childhood sexual abuse were differentially correlated with later-life inflammatory status and body mass index, with childhood emotional and physical abuse not showing the same degree of correlation with inflammatory status later in early adulthood.
- These results demonstrate how specific elements of environmental adversity, when suffered at a critical developmental period, can have lingering negative physiological consequences later in life. | psychiatry and clinical psychology |
10.1101/2022.04.03.22273162 | Social Support during pregnancy: A phenomenological exploration of young women's experiences of support networks on pregnancy care and wellbeing in Soweto, South Africa. | Social support is deemed to have a crucial influence on maternal health and wellbeing during pregnancy. The objective of the study was to explore the experiences of pregnant young females and their receipt of social support in Soweto, South Africa. An interpretive phenomenological approach was employed to understand and interpret pregnant young women's lived experiences of support networks on their pregnancy care and wellbeing. Data were collected by conducting 18 in-depth interviews with young pregnant women. Analysis of the data resulted in the development of two superordinate themes: (1) relationships during pregnancy and (2) network involvement. Each superordinate theme was linked to subthemes that helped explain whether young women had positive or negative experiences of social support during their pregnancy care, and their wellbeing. The sub-themes emanating from the superordinate theme relationships during pregnancy were (a) behavioural response of partner following disclosure of pregnancy, (b) behavioural response of family following disclosure of pregnancy, and (c) sense of emotional security. Accompanying subthemes of the superordinate theme network involvement were (a) emotional and instrumental support, and (b) information support. An interpretation of the young women's experiences revealed that their satisfaction with existing support networks and the involvement of the various social networks contributed greatly to the participants having a greater sense of potential parental efficacy and increased acceptance of their pregnancies. Pregnant women who receive sufficient social support from immediate networks have increased potential to embrace and give attention to pregnancy-related changes. This could, in turn, foster positive behavioural outcomes that encourage engaging in good pregnancy care practices and acceptance of motherhood.
Focusing on previously unexamined factors that could improve maternal health, such as social support, could improve maternal mortality rates and help achieve reproductive health accessibility universally. | obstetrics and gynecology |
10.1101/2022.04.07.22273561 | The effect of the COVID-19 lockdown on mental health care use in South Africa: an interrupted time series analysis | AimsIn March 2020, South Africa introduced a lockdown in response to the COVID-19 pandemic, entailing the suspension of all non-essential activities and a complete ban of tobacco and alcohol sales. We studied the effect of the lockdown on mental health care utilisation rates in private-sector care in South Africa.
MethodsWe did an interrupted time series analysis using insurance claims from January 1, 2017, to June 1, 2020 of beneficiaries 18 years or older from a large private sector medical aid scheme. We calculated weekly outpatient consultation and hospital admission rates for organic mental disorders, substance use disorders, serious mental disorders, depression, anxiety, other mental disorders, any mental disorder, and alcohol withdrawal syndrome. We calculated adjusted odds ratios (OR) for the effect of the lockdown on weekly outpatient consultation and hospital admission rates and the weekly change in rates during the lockdown until June 1, 2020.
Results710,367 persons were followed up for a median of 153 weeks. Hospital admission rates (OR 0.38; 95% CI 0.33-0.44) and outpatient consultation rates (OR 0.74; 95% CI 0.63-0.87) for any mental disorder decreased substantially after the lockdown and did not recover to pre-lockdown levels until June 1, 2020. Health care utilisation rates for alcohol withdrawal syndrome doubled after the introduction of the lockdown, but the statistical uncertainty around the estimates was large (OR 2.24; 95% CI 0.69-7.24).
ConclusionsReduced mental health care contact rates during the COVID-19 lockdown likely reflect a substantial unmet need for mental health services with potential long-term consequences for mental health patients and their families. Steps to ensure access and continuity of mental health services during future lockdowns should be considered. | epidemiology |
10.1101/2022.04.08.22273642 | Antibiotic courses for acute otitis media alter human gut microbiome resistome and colonization resistance against antimicrobial resistant species in children | Antimicrobial resistance (AMR) is a major global public health problem. Human gut microbiome plays an important role in modulating AMR. On one hand, the microbiome itself can serve as a reservoir of AMR genes, i.e. resistome. On the other hand, the microbiome performs colonization resistance, preventing invasive microbes from colonizing the gastrointestinal tract. In this study, we investigated how antibiotic treatment affects the resistome and colonization resistance of the gut microbiome in children receiving amoxicillin, amoxicillin-clavulanate, or no treatment for acute otitis media in a randomized clinical trial. Fecal samples from children receiving an antibiotic or no treatment before and after the treatment were analyzed using deep metagenomic sequencing. We used a flow cytometry-based approach to quantify the bacterial load in the fecal samples. Both metagenomic sequencing-based relative abundance and flow cytometry-based absolute abundance of the microbial species were analyzed. We found that the resistome fluctuated over time and in a small fraction ([~]10%) of subjects, AMR genes increased rapidly due to colonization by AMR species, even in the control group without any antibiotic treatment. Amoxicillin significantly increased the risk for invasive species, especially pathogenic species carrying AMR genes, to colonize the gut. We also found that children lacking Blautia, Ruminococcus, Faecalibacterium, Roseburia, or Faecalitalea were more vulnerable to colonization by invasive AMR species in their gut microbiome. | infectious diseases |
10.1101/2022.04.07.22273564 | Evaluation and risk communication of effects of alcohol exposure on disposable procedure masks and portable air purifiers | As electret technology can drastically improve the filtration efficiency of disposable procedure masks and portable air purifiers, it is widely used to prevent the airborne transmission of SARS-CoV-2. Furthermore, alcohol disinfectants are now widely used in offices, hospitals, and homes to prevent contact infection; hence, there is a concern that alcohol exposure may inactivate electret. In this study, 5 types of high-efficiency particulate air filter (HEPA) air purifiers--of which, one was made of fiberglass HEPA filter--14 types of cubical masks, and 11 types of pleated masks available to Japanese citizens were subjected to discharge according to the alcohol exposure protocol based on ISO (International Organization for Standardization) 16890, and changes in filtration efficiency and pressure drop were measured before and after the discharge. The results revealed that 17 (68%) of the 25 masks exhibited a significant decrease in filtration efficiency; this decrease due to discharge depended on the filter material. However, masks of polypropylene, polyethylene, and poly-vinylidene-difluoride composite fiber materials exhibited no significant decrease in filtration efficiency. In addition, 4 (80%) of the 5 HEPA filters showed a 40-64% decrease in filtration efficiency, while no decrease in filtration efficiency was observed for the fiberglass HEPA filter. Our survey (n = 500 Japanese adults, including 30 healthcare professionals) revealed that approximately 90% of the general public was unaware that the performance of masks and air purifiers could be degraded by direct spraying of alcohol--for disinfection purposes--or vapor exposure. Furthermore, 36% of the surveyed healthcare professionals indicated that they had sprayed alcohol directly on their masks. Therefore, based on the results of this experiment, we examined effective consumer warnings that could be utilized on the product labels and in the instructions. The results showed that it would be more effective to detail the extent and duration of the adverse effects of disregarding the precautions. | occupational and environmental health |
10.1101/2022.04.07.22273593 | Inequities in COVID-19 vaccine and booster coverage across Massachusetts ZIP codes: large gaps persist after the 2021/22 Omicron wave | BackgroundInequities in COVID-19 vaccine coverage may contribute to future disparities in morbidity and mortality between Massachusetts (MA) communities.
MethodsWe obtained public-use data on residents vaccinated and boosted by ZIP code (and by age group: 5-19, 20-39, 40-64, 65+) from MA Department of Public Health. We constructed population denominators for postal ZIP codes by aggregating Census-tract population estimates from the 2015-2019 American Community Survey. We excluded non-residential ZIP codes and the smallest ZIP codes containing 1% of the states population. We mapped variation in ZIP-code level primary series vaccine and booster coverage and used regression models to evaluate the association of these measures with ZIP-code-level socioeconomic and demographic characteristics. Because age is strongly associated with COVID-19 severity and vaccine access/uptake, we assessed whether observed socioeconomic and racial inequities persisted after adjusting for age composition and plotted age-specific vaccine and booster coverage by deciles of ZIP-code characteristics.
ResultsWe analyzed data on 418 ZIP codes. We observed wide geographic variation in primary series vaccination and booster rates, with marked inequities by ZIP-code-level education, median household income, essential worker share, and racial-ethnic composition. In age-stratified analyses, primary series vaccine coverage was very high among the elderly. However, we found large inequities in vaccination rates among younger adults and children, and very large inequities in booster rates for all age groups. In multivariable regression models, each 10 percentage point increase in "percent college educated" was associated with a 5.0 percentage point increase in primary series vaccine coverage and a 4.9 percentage point increase in booster coverage. Although ZIP codes with higher "percent Black/Latino/Indigenous" and higher "percent essential workers" had lower vaccine coverage, these associations became strongly positive after adjusting for age and education, consistent with high demand for vaccines among Black/Latino/Indigenous and essential worker populations.
ConclusionOne year into MAs vaccine rollout, large disparities in COVID-19 primary series vaccine and booster coverage persist across MA ZIP codes.
O_TEXTBOXKey Messages
O_LIAs of March 2022, in the wake of MAs Omicron wave, there were large inequities in ZIP-code-level vaccine and booster coverage by income, education, percent Black/Latino/Indigenous, and percent essential workers.
C_LIO_LIEducation was the strongest predictor of ZIP-code vaccine coverage in MA.
C_LIO_LICoverage gaps in ZIP codes with many essential workers and large Black/Latino/Indigenous populations are troubling, as these groups face disproportionate risk for COVID-19 infection and severe illness. However, we found no evidence that "hesitancy" drives vaccination gaps. After adjusting for age and education levels, vaccine uptake was higher in ZIP codes with many Black/Latino/Indigenous residents or essential workers.
C_LIO_LIGaps in vaccine and booster coverage among vulnerable groups may lead to excess morbidity, mortality, and economic losses during the next COVID-19 wave. These burdens will not be equitably shared and are preventable.
C_LI
C_TEXTBOX | epidemiology |
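A minimal illustration of the kind of ZIP-code-level multivariable regression described in the abstract above, written in Python with statsmodels rather than the authors own code; the input file and column names (coverage, pct_college, pct_bipoc, pct_essential, median_income, pct_over_65) are hypothetical placeholders.

```python
# Sketch only: ZIP-code-level linear regression of vaccine coverage on
# socioeconomic composition. All file and column names are invented.
import pandas as pd
import statsmodels.formula.api as smf

zips = pd.read_csv("zip_level_coverage.csv")  # hypothetical ZIP-level extract

# With coverage and predictors expressed in percentage points, a coefficient of 0.5
# corresponds to the abstract's "10-point increase -> 5-point increase" framing.
model = smf.ols(
    "coverage ~ pct_college + pct_bipoc + pct_essential + median_income + pct_over_65",
    data=zips,
).fit(cov_type="HC1")  # heteroskedasticity-robust standard errors

print(model.summary())
```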
10.1101/2022.04.08.21259596 | Evaluate the reliability of the apprenticeship in the first year of medical school: towards a reliable first level ultrasound examination. | ObjectivesOur aim was to motivate apprentice sonographers to appraise their own measurements and to reduce inconsistencies within and between operators. Deep knowledge of ultrasound sectional anatomy is mandatory for appropriate performance.
MethodsOn three different weekdays, 3 sonographer apprentices (raters), randomly selected from a cohort of San Paolo Medical School first-year students participating in a vertically integrated study of living anatomy through ultrasound examination, repeated lumbar multifidus cross-section scans on 6 subjects at the lumbar level. The Agreement R package (version 0.8-1) was used to monitor the performance of each apprentice.
ResultsThe agreement (CCC-intra 0.6749; CCC-inter 0.556; CCC-total 0.5438) was far below the least acceptable CCC of 0.92-0.95. The precision indices (intra 0.6749; inter 0.801; total 0.6274) were unsatisfactory, while accuracy was high (0.9889 to 0.9913). The same pattern held for agreement in the rater-performance comparisons, where readings were highly accurate (0.9537 to 0.9733) but only moderately precise (0.7927 to 0.8895) and not interchangeable (TIR 1.173), though without supremacy of any rater (IIR 95% confidence limits: r1 vs r2 1.104, r1 vs r3 1.015, r2 vs r3 0.92).
ConclusionsApprentices were not reliable, repeatable, or interchangeable. The weak link in the method appeared to be limited familiarity with in vivo imaging morphologies, with qualitative and quantitative measurement procedures, and with elementary statistical processing. | medical education |
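The agreement statistics above are concordance correlation coefficients (CCC) computed with the Agreement R package; the snippet below is only a plain-Python sketch of Lin's CCC for a pair of raters, with invented example measurements, and is not the package's implementation.

```python
# Sketch of Lin's concordance correlation coefficient (CCC) for two raters.
import numpy as np

def lins_ccc(x, y):
    """CCC = 2*cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))**2)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    cov_xy = np.cov(x, y, bias=True)[0, 1]   # population covariance
    return 2 * cov_xy / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

# Invented example: multifidus cross-sectional areas (cm^2) from two apprentices
rater1 = [5.1, 4.8, 6.2, 5.5, 4.9, 6.0]
rater2 = [5.4, 4.5, 6.6, 5.2, 5.1, 6.3]
print(round(lins_ccc(rater1, rater2), 4))
```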
10.1101/2022.04.07.22273534 | Quantifying the relationship between SARS-CoV-2 wastewater concentrations and building-level COVID-19 prevalence at an isolation residence using a passive sampling approach | SARS-CoV-2 RNA can be detected in the excreta of individuals with COVID-19 and has demonstrated a positive correlation with various clinical parameters. Consequently, wastewater-based epidemiology (WBE) approaches have been implemented globally as a public health surveillance tool to monitor the community-level prevalence of infections. Over 270 higher education campuses monitor wastewater for SARS-CoV-2, with most gathering either composite samples via automatic samplers (autosamplers) or grab samples. However, autosamplers are expensive and challenging to manage with seasonal variability, while grab samples are particularly susceptible to temporal variation when sampling sewage directly from complex matrices outside residential buildings. Prior studies have demonstrated encouraging results utilizing passive sampling swabs. Such methods can offer affordable, practical, and scalable alternatives to traditional methods while maintaining a reproducible SARS-CoV-2 signal. In this regard, we deployed tampons as passive samplers outside of a COVID-19 isolation unit (a segregated residence hall) at a university campus from February 1, 2021 - May 21, 2021. Samples were collected several times weekly and remained within the sewer for a minimum of 24 hours (n = 64). SARS-CoV-2 RNA was quantified using reverse transcription-quantitative polymerase chain reaction (RT-qPCR) targeting the viral N1 and N2 gene fragments. We quantified the mean viral load captured per individual and the association between the daily viral load and total persons, adjusting for covariates using multivariable models to provide a baseline estimate of viral shedding. Samples were processed through two distinct laboratory pipelines on campus, yielding highly correlated N2 concentrations. Data obtained here highlight the success of passive sampling utilizing tampons to capture SARS-CoV-2 in wastewater coming from a COVID-19 isolation residence, indicating that this method can help inform public health responses in a range of settings.
HighlightsO_LIDaily SARS-CoV-2 RNA loads in building-level wastewater were positively associated with the total number of COVID-19 positive individuals in the residence
C_LIO_LIThe variation in individual fecal shedding rates of SARS-CoV-2 extended four orders of magnitude
C_LIO_LIWastewater sample replicates were highly correlated using distinct processing pipelines in two independent laboratories
C_LIO_LIWhile the isolation residence was occupied, SARS-CoV-2 RNA was detected in all passive samples
C_LI | occupational and environmental health |
10.1101/2022.04.09.22273646 | Trends of Cutaneous Leishmaniasis, Western Ethiopia: retrospective study. | BackgroundCutaneous leishmaniasis (CL) is the most common form of leishmaniasis and causes skin lesions, mainly ulcers, on exposed parts of the body, leaving life-long scars and serious disability or stigma. In Ethiopia, cutaneous leishmaniasis is primarily caused by Leishmania aethiopica and less often by Leishmania tropica and Leishmania major. There is a major gap in prevalence data across study areas. Hence, this study assessed the trends of cutaneous leishmaniasis in the western part of Ethiopia.
MethodologyA three-year retrospective study (09 October 2018 to 31 January 2022) was conducted by extracting information from the national leishmaniasis register for patients visiting the Nekemte Specialized Hospital (NSH) treatment center, Nekemte, Western Ethiopia. A standard data abstraction checklist was used to review leishmaniasis records. Data were extracted from the national leishmaniasis case registration book by the principal investigators and summarized using Microsoft Excel. All data were entered and analyzed using the Microsoft Office Excel package.
ResultsA total of 64 patients were treated for cutaneous leishmaniasis in the area during the study period. About 35 (54.69%) of the cutaneous leishmaniasis cases were male, and the median age was 18.5 years. Most of the cases were among those aged 15-24 years (39.1%), while the extreme age groups accounted for the fewest cases. About 35 (54.69%) of cutaneous leishmaniasis cases were from rural areas, and two-thirds (31, 65.96%) of patients sought medical treatment 3-6 months after developing signs and symptoms. One-fourth (17, 26.56%) of CL cases were reported in January, followed by August (10, 15.63%), and there were no cases reported in June and October.
ConclusionThe most affected were those aged 15-24 years and those from rural communities. January was the month with the most reported cases, and late presentation for treatment needs to be addressed through awareness creation.
Author summaryGlobally, cutaneous leishmaniasis (CL) is the most common form of leishmaniasis, accounting for about 95% of cases. It is an emerging, uncontrolled and neglected infection affecting millions yearly. Most CL patients reside in low- to middle-income countries, where limited healthcare budgets and a large burden caused by other ailments such as malaria, tuberculosis, and HIV (human immunodeficiency virus) are prominent. Accurately estimating disease burden is challenging since misdiagnosis is common, and there are no standard reporting guidelines. There is limited information regarding the magnitude of the cases in low and middle-income countries, including Ethiopia. The lack of data on epidemiological burden and distribution makes it difficult to advocate for control activities and further research to inform public health policy. This study aimed to assess the trends of CL in the western part of Ethiopia, to help fill the dearth of information in the area. The study highlighted the distribution of CL cases by gender, age, seasons of the year, and geographical areas (rural or urban). Moreover, we recommend community-based research programs to determine the exact incidence and prevalence of CL cases and associated risk factors in the western part of Ethiopia, particularly in the East Wollega Zone. | epidemiology |
10.1101/2022.04.08.22273582 | Real-World Evidence of the Effectiveness and Safety of Generic Tofacitinib in Rheumatoid Arthritis Patients: A Retrospective, Single-Centre Analysis from Western India | BackgroundGeneric tofacitinib has been available in India for more than a year and is widely used in rheumatoid arthritis (RA) therapy. There is scarce real-world data on its effectiveness and safety from India, especially given infection endemicity.
MethodsWe retrospectively analysed records (demographic and clinical information, haematology and biochemistry, adverse events) of patients prescribed generic tofacitinib from a single centre in Mumbai, India. Disease activity was calculated using the disease activity score-28 and erythrocyte sedimentation rate (DAS28-ESR) and other tools, and we used paired t-tests to assess the significance of response. We defined clinical tofacitinib failure as a composite outcome, including the clinicians decision to change to an alternative disease-modifying anti-rheumatic drug (DMARD) or flare after self-withdrawal. We performed logistic regression and survival analysis for determinants of clinical failure.
ResultsWe reviewed records of 102 patients (92 female; median age: 53 years) with mean RA duration of 146 months. Thirteen had prior treatment with innovator tofacitinib. There was significant improvement in disease activity parameters at a mean duration of 186 days. No serious adverse events were reported; 4 patients had tuberculosis and 19 patients had mild COVID-19 while on treatment. Clinical failure was seen in 25 patients, and mean time to failure on survival analysis was 357 days. No baseline characteristic predicted clinical failure.
InterpretationGeneric tofacitinib showed good effectiveness and a tolerable adverse effect profile, despite tuberculosis endemicity and COVID-19. Setting up registries would be valuable in gaining more data on generic tofacitinib.
Key points- There is scarce data from India regarding the use of tofacitinib in rheumatoid arthritis, despite widespread use
- In this retrospective analysis of 102 patients at a single centre, we found tofacitinib monotherapy was efficacious and tolerable.
- Tuberculosis was detected in four patients, and nineteen patients had mild COVID-19. | rheumatology |
10.1101/2022.04.08.22273571 | Risk and severity of SARS-CoV-2 reinfections during 2020-2022 in Vojvodina, Serbia: a population-level study | BackgroundData on the rate and severity of SARS-CoV-2 reinfections in real-world settings are scarce and the effects of vaccine boosters on reinfection risk are unknown.
MethodsIn a retrospective cohort study, Vojvodina residents with laboratory-confirmed SARS-CoV-2 registered between March 6, 2020 and October 31, 2021 were followed for reinfection [≥]90 days after primary infection. Data were censored at the end of follow-up (January 31, 2022) or death. The reinfection risk was visualized with Kaplan-Meier plots. To examine the protective effect of vaccination, the subset of individuals with primary infection in 2020 (March 6-December 31) was matched (1:2) with controls without reinfection.
FindingsUntil January 31, 2022, 13,792 reinfections were recorded among 251,104 COVID-19 primary infections (5.49%). Most reinfections (86.8%) were recorded in January 2022. Reinfections were mostly mild (99.2%). Hospitalizations were uncommon (1.8% vs. 3.70% in primary infection) and COVID-19 deaths were very rare (n=20, case fatality rate 0.15%). The overall incidence rate of reinfections was 5.99 (95% CI 5.89-6.09) per 1,000 person-months. The reinfection risk was estimated as 0.76% at six months, 1.36% at nine months, 4.96% at 12 months, 16.7% at 15 months, and 18.9% at 18 months. Unvaccinated (OR=1.23; 95%CI=1.14-1.33), incompletely (OR=1.33; 95%CI=1.08-1.64) or completely vaccinated (OR=1.50; 95%CI=1.37-1.63), were modestly more likely to be reinfected compared with recipients of a third (booster) vaccine dose.
InterpretationSARS-CoV-2 reinfections were uncommon until the end of 2021 but became common with the advent of Omicron. Very few reinfections were severe. Boosters may modestly reduce reinfection risk. | infectious diseases |
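As a rough illustration of the Kaplan-Meier estimation of reinfection risk described above, the sketch below uses the lifelines package on a hypothetical per-person table; the file and column names (days_since_primary, reinfected) are assumptions, not the study's data.

```python
# Sketch only: cumulative reinfection risk via Kaplan-Meier (lifelines).
import pandas as pd
from lifelines import KaplanMeierFitter

df = pd.read_csv("reinfection_followup.csv")  # one row per primary infection (hypothetical)

kmf = KaplanMeierFitter()
kmf.fit(
    durations=df["days_since_primary"],  # follow-up time, censored at study end or death
    event_observed=df["reinfected"],     # 1 = reinfection documented >= 90 days after primary
)

# Cumulative risk (1 - survival) at roughly 6, 9, 12, 15 and 18 months
for months in (6, 9, 12, 15, 18):
    print(f"{months} months: {100 * (1 - kmf.predict(months * 30)):.2f}% reinfected")
```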
10.1101/2022.04.06.22269907 | Proteomic Fingerprinting: A novel privacy concern | IntroductionPrivacy protection is a core principle of genomic research but needs further refinement for high-throughput proteomic platforms.
MethodsWe identified independent single nucleotide polymorphism (SNP) quantitative trait loci (pQTL) from COPDGene and Jackson Heart Study (JHS) and then calculated genotype probabilities by protein level for each protein-genotype combination (training). Using the most significant 100 proteins, we applied a naive Bayesian approach to match proteomes to genomes for 2,812 independent subjects from COPDGene, JHS, SubPopulations and InteRmediate Outcome Measures In COPD Study (SPIROMICS) and Multi-Ethnic Study of Atherosclerosis (MESA) with SomaScan 1.3K proteomes and also 2,646 COPDGene subjects with SomaScan 5K proteomes (testing). We tested whether subtracting mean genotype effect for each pQTL SNP would obscure genetic identity.
ResultsIn the four testing cohorts, we were able to correctly match 90%-95% of proteomes to the correct genome, and for 95%-99% the correct genome was within the 1% most likely genomes. With larger profiling (SomaScan 5K), correct identification was > 99%. The accuracy of matching in subjects with African ancestry was lower ([~]60%) unless training included diverse subjects. Adjusting for the mean genotype effect reduced identification accuracy to nearly that of a random guess.
ConclusionLarge proteomic datasets (> 1,000 proteins) can be accurately linked to a specific genome through pQTL knowledge and should not be considered deidentified. These findings suggest that large scale proteomic data be given privacy protections of genomic data, or that bioinformatic transformations (such as adjustment for genotype effect) should be applied to obfuscate identity. | genetic and genomic medicine |
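The matching approach above is described only at a high level; the snippet below is a simplified sketch of the general naive Bayes idea — scoring each candidate genome by summing per-protein Gaussian log-likelihoods conditional on its pQTL genotypes — with invented shapes and parameters, not the study's pipeline.

```python
# Simplified sketch of naive Bayes proteome-to-genome matching.
import numpy as np
from scipy.stats import norm

def match_log_likelihood(protein_levels, genotypes, mu, sd):
    """
    protein_levels : (P,) observed levels of the top pQTL proteins for one proteome
    genotypes      : (P,) integer genotype codes 0/1/2 at the matched pQTL SNPs
    mu, sd         : (P, 3) training-set mean/SD of each protein within each genotype class
    """
    idx = np.arange(len(protein_levels))
    return norm.logpdf(protein_levels, mu[idx, genotypes], sd[idx, genotypes]).sum()

def best_match(proteome, genome_matrix, mu, sd):
    """Index of the candidate genome with the highest naive Bayes score for this proteome."""
    scores = [match_log_likelihood(proteome, g, mu, sd) for g in genome_matrix]
    return int(np.argmax(scores))
```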
10.1101/2022.04.04.22273376 | From elimination to suppression: genomic epidemiology of a large Delta SARS-CoV-2 outbreak in Aotearoa New Zealand | New Zealands COVID-19 elimination strategy heavily relied on the use of genomics to inform contact tracing, linking cases to the border and to clusters during community outbreaks. In August 2021, New Zealand entered its second nationwide lockdown after the detection of a single community case with no immediately apparent epidemiological link to the border. This incursion resulted in the largest outbreak seen in New Zealand caused by the Delta Variant of Concern. Here we generated 3806 high quality SARS-CoV-2 genomes from cases reported in New Zealand between 17 August and 1 December 2021, representing 43% of reported cases. We detected wide geographical spread coupled with undetected community transmission, characterised by the apparent extinction and reappearance of genomically linked clusters. We also identified the emergence, and near replacement, of genomes possessing a 10-nucleotide frameshift deletion that caused the likely truncation of accessory protein ORF7a. By early October, New Zealand moved from elimination to suppression and the role of genomics changed markedly from being used to track and trace, towards population-level surveillance. | infectious diseases |
10.1101/2022.04.08.22273533 | The impact of Illinois' comprehensive handheld phone ban on talking on handheld and handsfree cellphones while driving | IntroductionDistracted driving has been linked to multiple driving decrements and is responsible for thousands of motor vehicle fatalities annually. Most US states have enacted restrictions on cellphone use while driving, the strictest of which prohibit any manual operation of a cellphone while driving. Illinois enacted such a law in 2014. To better understand how this law affected cellphone behaviors while driving, we estimated associations between Illinois handheld phone ban and self-reported talking on handheld, handsfree, and any cellphone (handheld or handsfree) while driving.
MethodsWe leveraged data from annual administrations of the Traffic Safety Culture Index from 2012-2017 in Illinois and a set of control states. We cast the data into a difference-in-differences (DID) modeling framework, which compared Illinois to control states in terms of pre- to post-intervention changes in the proportion of drivers who self-reported the three outcomes. We fit separate models for each outcome, and fit additional models to the subset of drivers who talk on cellphones while driving.
ResultsIn Illinois, the pre- to post-intervention decrease in drivers probability of self-reporting talking on a handheld phone was significantly more extreme than that of drivers in control states (DID estimate -0.22; 95% CI -0.31, -0.13). Among drivers who talk on cellphones while driving, those in Illinois exhibited a more extreme increase in the probability of talking on a handsfree phone while driving than those in control states (DID estimate 0.13; 95% CI 0.03, 0.23).
ConclusionsOur results suggest that Illinois handheld phone ban reduced talking on handheld phones while driving and corroborated the hypothesis that the ban promoted harm-reduction via substitution from handheld to handsfree phones among drivers who talk on the phone while driving.
Practical ApplicationsOur findings should encourage other states to enact comprehensive handheld phone bans to improve traffic safety. | epidemiology |
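For readers unfamiliar with the difference-in-differences setup named above, the sketch below shows the usual group x period interaction in a logistic model using statsmodels; the paper reports DID estimates on the probability scale, and the file and column names (handheld_talk, illinois, post_ban, covariates) here are hypothetical.

```python
# Sketch only: DID-style logistic model with a group x period interaction.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("tsci_2012_2017.csv")  # hypothetical pooled survey extract

did_model = smf.logit(
    "handheld_talk ~ illinois * post_ban + age_group + sex",
    data=df,
).fit()

# The 'illinois:post_ban' coefficient is the DID contrast on the log-odds scale.
print(did_model.params["illinois:post_ban"])
```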
10.1101/2022.04.08.22273525 | Talking on handsfree and handheld cellphones while driving in association with handheld phone bans. | IntroductionConcurrent use of a cellphone while driving impairs driving abilities, and studies of policy effectiveness in reducing distracted driving have yielded mixed results. Furthermore, few studies have considered how hands-free phone use associates with handheld phone bans. It is not clear whether hand-held phone bans dissuade some drivers from using the phone while driving completely, or whether it simply promotes a shift to hands-free use. The present study estimates the association between handheld phone policies and self-reported talking on handsfree and handheld cellphones while driving.
MethodsOur data consisted of 16,067 respondents to annual administrations of the Traffic Safety Culture Index from 2012-2017. Our primary exposure variable was handheld phone policy, and our primary outcome variables were self-reported talking on any phone, self-reported talking on a handheld phone, and self-reported talking on a hands-free phone while driving. We estimated adjusted prevalence ratios of the outcomes associated with handheld phone bans via modified Poisson regression.
ResultsDrivers in states with handheld bans were 13% less likely to self-report talking on any type of cellphone (handheld or handsfree) while driving. When broken down by cellphone type, drivers in states with handheld bans were 38% less likely to self-report talking on a handheld phone and 10% more likely to self-report talking on a hands-free phone while driving.
ConclusionsHandheld phone bans were associated with more self-reported talking on hands-free phones and less talking on handheld phones, consistent with a substitution hypothesis. Handheld bans were also associated with less talking on any phone while driving, supporting a net safety benefit.
Practical ApplicationsIn the absence of a national ban on handheld phone use while driving, our study supports state handheld phone bans to deter distracted driving and improve traffic safety. | epidemiology |
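The "modified Poisson regression" named above is commonly implemented as a Poisson GLM with robust (sandwich) standard errors so that exponentiated coefficients can be read as adjusted prevalence ratios; the sketch below illustrates that pattern in statsmodels with hypothetical file and column names (talks_handheld, handheld_ban, covariates).

```python
# Sketch only: modified Poisson regression for adjusted prevalence ratios.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("tsci_pooled.csv")  # hypothetical pooled survey extract

fit = smf.glm(
    "talks_handheld ~ handheld_ban + age_group + sex",
    data=df,
    family=sm.families.Poisson(),
).fit(cov_type="HC1")  # robust variance makes the Poisson model valid for a binary outcome

print(np.exp(fit.params["handheld_ban"]))  # adjusted prevalence ratio for handheld bans
```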
10.1101/2022.04.07.22273446 | Evaluation of patient-specific cell free DNA assays for monitoring of minimal residual disease in solid tumors | Real time monitoring of disease status is an essential part of cancer management. The low sensitivity and specificity of serum markers and the constraints and risks associated with radiological scans prompt the need for accurate non-invasive means to monitor minimal residual disease (MRD) in solid tumors. In this study we describe MRD evaluation via profiling of patient-specific gene variants in cell free tumor DNA (i.e., ctMRD). We evaluate the feasibility of this approach for real time monitoring of tumor load dynamics in response to anticancer treatments. We prioritized 162 hot spot mutations for designing ctMRD assays based on literature review. These ctMRD assays were evaluated in 436 plasma specimens with a median of 6 (range 3-18) longitudinal evaluations in a cohort of 48 patients with various solid tumors. In patients with partial radiological response (PR), Mutant Allele Fraction (MAF) showed high correlation (84%) with radiological response and tumor volume (cm3) compared to conventional CA markers (53%). Total plasma ctDNA level was significantly higher in patients with 2-5 metastatic sites compared with single metastatic site (P = 0.04) and discriminated patients with stable disease (SD) and progressive disease (PD) from patients with partial response (PR) (P =0.01 and P = 0.04, respectively). Collectively, the present study shows that changes in mutation burden evaluated using patient specific ctMRD assays is a highly sensitive approach for monitoring of therapy response. | oncology |
10.1101/2022.04.07.22273499 | Economic vulnerability and poor service delivery made it more difficult for shack-dwellers to comply with COVID-19 restrictions: The impracticability and inequitable burden of universal/unstratified public health policies. | In South Africa, demand for housing close to viable/sustained sources of employment has far outstripped supply; and the size of the population living in temporary structures/shacks (and in poorly serviced informal settlements) has continued to increase. While such dwellings and settlements pose a number of established risks to the health of their residents, the present study aimed to explore whether they might also undermine the potential impact of regulations intended to safeguard public health, such as the stringent lockdown restrictions imposed to curb the spread of COVID-19 in 2020 and 2021. Using a representative sample of 1,381 South African households surveyed in May-June 2021, the present study found that respondents in temporary structures/shacks were more likely to report non-compliance (or difficulty in complying) with lockdown restrictions when compared to those living in traditional/formal houses/flats/rooms/hostels (OR:1.61; 95%CI:1.06-2.45). However, this finding was substantially attenuated and lost precision following adjustment for preceding sociodemographic and economic determinants of housing quality (adjusted OR:1.20; 95%CI:0.78-1.87). Instead, respondents were far more likely to report non-compliance (or difficulty in complying) with COVID-19 lockdown restrictions if their dwellings lacked private/indoor toilet facilities (adjusted OR:1.56; 95%CI:1.08,2.22) or they were Black/African, young, poorly educated and under-employed (regardless of: their socioeconomic position, or whether they resided in temporary structures/shacks, respectively). Restrictions imposed to safeguard public health need to be more sensitively designed to accommodate the critical role that poverty and inadequate service delivery play in limiting the ability of residents living in temporary structures/shacks and inadequately serviced dwellings/settlements to comply. [250/250 words]
Significance of the main findingsO_LISouth Africans living in temporary structures/shacks are more likely to: be poorly educated and under-employed; with fewer assets and limited access to basic household services.
C_LIO_LIPoverty and inadequate service delivery were more important determinants of compliance with COVID-19 restrictions than housing quality.
C_LIO_LIIn the absence of improvements in economic circumstances and the delivery of basic household services, restrictions imposed to safeguard public health need to be more sensitively designed to take account of the structural barriers to compliance experienced by households where poverty and/or inadequate service delivery limit their ability to: stay at home; maintain hygiene; and/or practice social distancing. [100/100 words]
C_LI
O_QD"This idea of the "humbling pandemic"1 does not hold for people whose lives depend on informal economy and movement in the face of heavy restrictions on their respective activities such as... street hustle and domestic work. Therefore, the pandemic response - which employs tactics that come to determine how lives are to be lived - can be seen as an exacerbator of inequalities, by the hands of which precarious circumstances of living are a larger threat than the risk of infection"2
Stefan Ogedengbe (2021: 94)3
C_QD | health policy |
10.1101/2022.04.06.22273528 | Individual cardiorespiratory fitness exercise prescription in elderly based on BP neural network. | Cardiorespiratory fitness (CRF) declines with age in the elderly. An individualized CRF exercise prescription can maintain the CRF level and delay the aging process. Traditional exercise prescriptions are general and lack individualization. In this paper, a new approach based on a back-propagation (BP) neural network is investigated to predict individualized CRF exercise prescriptions for the elderly from correlated variables (age, sex, BMI, initial VO2max value, expected improvement, etc.). The raw data were split into two parts, 90% for training the model and the remaining 10% for testing its performance. Based on a database of 2078 people, the MAE, RMSE and R2 of the exercise prescription prediction model were 1.5206, 1.4383 and 0.9944. Twenty-six female subjects aged 60-79 years were recruited to test the models validity. The expected improvement in VO2max was set at 10%. Based on the basic information of these older women, we generated a personalized exercise prescription (frequency, intensity, time and volume) for each subject. All of them finished their own exercise intervention. The results show that post-intervention VO2max was significantly different from pre-intervention VO2max, improving by 10.1%; a total of 20 subjects (74.1%) improved within one standard deviation and 25 subjects (92.6%) improved within 1.96 standard deviations. Our study shows that a high degree of accuracy in exercise suggestions for the elderly was achieved by applying the BP neural network model. | health systems and quality improvement |
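As a rough sketch of the kind of back-propagation network described above — mapping participant characteristics to FITT prescription components with a 90/10 train/test split — the snippet below uses scikit-learn's MLPRegressor as a stand-in; the data file, feature names and targets are assumptions, not the authors' implementation.

```python
# Sketch only: BP-style neural network for individualized exercise prescriptions.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_absolute_error, r2_score

data = pd.read_csv("crf_cohort.csv")  # hypothetical cohort file
X = data[["age", "sex", "bmi", "vo2max_baseline", "target_improvement"]]
y = data[["frequency", "intensity", "time", "volume"]]  # FITT prescription components

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.1, random_state=0)

net = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
net.fit(X_tr, y_tr)

pred = net.predict(X_te)
print("MAE:", mean_absolute_error(y_te, pred), "R2:", r2_score(y_te, pred))
```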
10.1101/2022.04.05.22273373 | Should I stay or should I go? Observation post-vaccination during the COVID-19 pandemic and the law of unintended consequences | BackgroundStandard practice after all vaccinations in Australia is to observe patients for 15 minutes. During the COVID-19 pandemic, could the risk of contracting and dying from COVID-19 acquired in the waiting room be greater than the risk of dying from post-vaccine anaphylaxis when leaving immediately?
MethodsThe risks are modelled for a patient aged 70+ years attending for annual influenza vaccination in a typical Australian general practice clinic. The risk of death from anaphylaxis is estimated based on known rates of anaphylaxis shortly after influenza vaccination. The risk of acquiring COVID-19 during a 15-minute wait and then dying from that infection is estimated using the COVID-19 Aerosol Transmission Estimator and COVID-19 Risk Calculator.
ResultsOther than at times of extremely low COVID-19 prevalence, the risk of death from anaphylaxis for a patient aged 70+ years leaving immediately after influenza vaccine is less than the risk of death from COVID-19 acquired via aerosol transmission during a 15-minute wait. The risk of death from COVID-19 is greatest for the unimmunised and when masks are not worn.
ConclusionsA more nuanced approach to advice post-vaccination is recommended that considers current COVID-19 prevalence and virulence, the characteristics of the waiting room, the risk of anaphylaxis, and the patients susceptibility to death from COVID-19. There are many circumstances where it would be safer for a patient to leave immediately after vaccination.
"If I go there will be trouble. And if I stay it will be double"
The Clash | infectious diseases |
10.1101/2022.04.06.22272763 | Immune Correlates Analysis of a Single Ad26.COV2.S Dose in the ENSEMBLE COVID-19 Vaccine Efficacy Clinical Trial | Anti-spike IgG binding antibody, anti-receptor binding domain IgG antibody, and pseudovirus neutralizing antibody measurements four weeks post-vaccination were assessed as correlates of risk of moderate to severe-critical COVID-19 outcomes through 83 days post-vaccination and as correlates of protection following a single dose of Ad26.COV2.S COVID-19 vaccine in the placebo-controlled phase of ENSEMBLE, an international, randomized efficacy trial. Each marker had evidence as a correlate of risk and of protection, with strongest evidence for 50% inhibitory dilution (ID50) neutralizing antibody titer. The outcome hazard ratio was 0.49 (95% confidence interval 0.29, 0.81; p=0.006) per 10-fold increase in ID50; vaccine efficacy was 60% (43, 72%) at nonquantifiable ID50 (< 2.7 IU50/ml) and rose to 89% (78, 96%) at ID50 = 96.3 IU50/ml. Comparison of the vaccine efficacy by ID50 titer curves for ENSEMBLE-US, the COVE trial of the mRNA-1273 vaccine, and the COV002-UK trial of the AZD1222 vaccine supported consistency of the ID50 titer correlate of protection across trials and vaccine types. | infectious diseases |
10.1101/2022.04.08.22273600 | XGBoost, a novel explainable AI technique, in the prediction of myocardial infarction, a UK Biobank cohort study | Background and objectiveMyocardial infarction is common and associated with high morbidity and mortality. This study assessed if "Explainable AI" in the form of extreme gradient boosting (XGBoost) could outperform traditional logistic regression in predicting myocardial infarction (MI) in a large cohort.
MethodsWe compared two machine learning methods, XGBoost and logistic regression in predicting risk of MI. The UK Biobank is a population-based prospective cohort including 502 506 volunteers with active consent, aged 40-69 years at recruitment from 2006 to 2010. These subjects were followed until end of 2019 and the primary outcome was myocardial infarction.
ResultsBoth models were trained using 90% of the cohort. The remaining 10% was used as a test set. Both models were equally precise, but the regression model classified more of the healthy class correctly. XGBoost was more accurate in identifying individuals who later suffered a myocardial infarction. Receiver operating characteristic (ROC) scores are invariant to class size. On this metric, XGBoost outperformed the logistic regression model, with ROC scores of 0.86 (accuracy 0.75; precision 0.99 and recall 0.75 for the healthy class; precision 0.07 and recall 0.81 for the MI-class) compared to 0.77 (accuracy 0.77; precision 0.99 and recall 0.77 for the healthy class; precision 0.07 and recall 0.78). Secondly, we demonstrate how SHAPley values can be used to visualize and interpret the predictions made by XGBoost models, both for the cohort test set and for individuals.
ConclusionsThe XGBoost machine learning model shows very promising results in evaluating risk of MI in a large and diverse population. This model can be used, and visualized, both for individual assessments and in larger cohorts. The predictions made by the XGBoost models point towards a future where "Explainable AI" may help to bridge the gap between medicine and data science. | cardiovascular medicine |
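The workflow above — an XGBoost classifier on a 90/10 split, evaluated by ROC AUC and explained with SHAP values — follows a standard pattern; the sketch below shows that pattern with the xgboost and shap packages, using a hypothetical feature file and outcome column rather than UK Biobank data.

```python
# Sketch only: gradient boosting with SHAP-based explanations for MI risk.
import pandas as pd
import shap
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score
from xgboost import XGBClassifier

df = pd.read_csv("cohort_features.csv")  # hypothetical extract
X, y = df.drop(columns="mi_event"), df["mi_event"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.1, stratify=y, random_state=0)

model = XGBClassifier(n_estimators=300, max_depth=4, learning_rate=0.05)
model.fit(X_tr, y_tr)
print("ROC AUC:", roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]))

# SHAP values decompose each individual prediction into per-feature contributions,
# supporting both cohort-level summaries and single-patient explanations.
explainer = shap.TreeExplainer(model)
shap.summary_plot(explainer.shap_values(X_te), X_te)
```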
10.1101/2022.04.08.22273590 | Plasma multi-omic and cardiac imaging network signatures predict poor long-term outcomes after acute myocardial infarction | BackgroundPrognostic biomarkers for patients admitted for a myocardial infarction (MI) episode are of great interest for risk stratification and follow-up care after discharge. Multi-omics analysis is a standard approach for the discovery of diagnostic and prognostic biomarkers, but few studies have evaluated the prognostic potential of molecular markers in combination with echocardiographic imaging variables.
MethodsWe measured the plasma proteome and lipidome in patients discharged from an acute MI and followed for secondary outcomes in New Zealand for a median time of 4.85 years (CDCS, N=741 for network inference, N=464 for predictive analysis) and in Singapore for a median time of 2.0 years (IMMACULATE, N=190 for validation). Using a network-based integrative analysis framework iOmicsPASS+, we mapped proteins, lipids, echocardiographic imaging variables and clinical biomarkers to a unified network and identified predictive subnetwork signatures of major adverse cardiac events (MACE) and heart failure hospitalization (HFH) in CDCS, with validation in IMMACULATE.
ResultsSpecific plasma proteins and lipids showed direct connections to cardiac imaging variables in the network. The gold standard biomarker, NT-proBNP, remained one of the best prognostic markers of MACE and HFH, but a number of plasma proteins involved in extracellular matrix organization, chemotaxis, inflammation, and apoptosis were also strong predictors of both outcomes. Hub proteins of subnetwork signatures were enriched in the heart, arteries, kidneys, liver and lungs. BMP10, CAPG, EFEMP1, FSTL3, RSPO4, and RELT were those directly connected to the echocardiographic variables and natriuretic peptides. In particular, EFEMP1 and FSTL3 in combination with diastolic function (E/e) were strongly predictive of HFH in both CDCS (AUC 0.78, 95%CI 0.72-0.83) and IMMACULATE (AUC 0.72, 0.61-0.84).
ConclusionsOur integrative analysis revealed competing signatures beyond established biomarkers of post-MI HFH, comprised of plasma proteins correlated with impaired diastolic function after the primary MI episode. | cardiovascular medicine |
10.1101/2022.04.07.22273518 | Predicting Transdiagnostic Social Impairments in Childhood using Connectome-based Predictive Modeling | BackgroundSocial impairments are core features of multiple neurodevelopmental disorders. Previous neuroimaging studies have focused on elucidating associations between brain function and social impairments within disorders but have not predicted these impairments from brain connectivity in a transdiagnostic manner, across several diagnostic categories. This study used a machine learning approach to examine functional connectivity that predicts elevated social impairments in a transdiagnostic sample of youths. We hypothesized that predictive edges would be from brain regions involved in social cognition.
MethodsConnectome-based predictive modeling (CPM) was used to build a transdiagnostic model of social impairments as measured by the Social Responsiveness Scale (SRS-2, raw score >75). We used functional connectivity data during a social movie-watching task from the Healthy Brain Network data (N=144, mean age=11.68 (3.52), 32% male). The average number of diagnoses was 3.4 (SD = 1.82, range = 0-11), including ASD (40.9%), ADHD (79%), mood disorders (15.9%), and anxiety disorders (43%). A similar transdiagnostic sample with high SRS-2 scores (n=41) was used for replication.
ResultSRS-2 scores were predicted from functional connectivity data using both 10-fold cross-validation (median q2=0.32, r=0.57, p<.001) and leave-one-group-out cross-validation (median q2s>0.04, rs>0.36, ps<.001). Predictive connections were widely distributed across the brain but were rooted in regions involved in social cognition, the subcortex, and the salience network. The model successfully predicted SRS-2 scores in the replication sample (r=0.33, p<.035, df=39).
ConclusionWe identified connectivity patterns predictive of social impairments in a transdiagnostic sample. These networks have the potential to provide insight into the development of novel targeted interventions for social impairments across traditional diagnostic categories. | psychiatry and clinical psychology |
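Connectome-based predictive modeling, as used above, follows a well-known recipe (edge selection by correlation with behavior, network-strength summary scores, then a simple regression); the snippet below is a simplified sketch of that recipe with illustrative shapes and thresholds, not the authors' code.

```python
# Simplified sketch of connectome-based predictive modeling (CPM).
import numpy as np
from scipy import stats

def cpm_train(conn, behavior, p_thresh=0.01):
    """conn: (n_subjects, n_edges) connectivity; behavior: (n_subjects,) scores."""
    r_p = [stats.pearsonr(conn[:, e], behavior) for e in range(conn.shape[1])]
    r = np.array([rp[0] for rp in r_p])
    p = np.array([rp[1] for rp in r_p])
    pos_mask = (p < p_thresh) & (r > 0)     # edges positively related to behavior
    scores = conn[:, pos_mask].sum(axis=1)  # one network-strength score per subject
    slope, intercept, *_ = stats.linregress(scores, behavior)
    return pos_mask, slope, intercept

def cpm_predict(conn, pos_mask, slope, intercept):
    """Apply the trained edge mask and linear fit to held-out subjects."""
    return slope * conn[:, pos_mask].sum(axis=1) + intercept
```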
10.1101/2022.04.08.22273612 | Monitoring sustainable development goal 5.2: Cross-country cross-time invariance of measures for intimate partner violence | BackgroundThe persistence and impacts of violence against women motivated Sustainable Development Goal (SDG) 5.2 to end such violence. Global psychometric assessment of cross-country, cross-time invariance of items measuring intimate partner violence (IPV) is needed to confirm their utility for comparing and monitoring national trends.
MethodsAnalyses of seven physical-IPV items included 377,500 ever-partnered women across 20 countries (44 Demographic and Health Surveys (DHS)). Analyses of five controlling-behaviors items included 371,846 women across 19 countries (42 DHS). We performed multiple-group confirmatory factor analysis (MGCFA) to assess within-country, cross-time invariance of each item set. Pooled analyses tested cross-country, cross-time invariance using DHSs that showed configural invariance in country-level MGCFAs. Alignment optimization tested approximate invariance of each item set in the pooled sample of all datasets, and in the subset of countries showing metric invariance over at least two repeated cross-sectional surveys in country-level MGCFAs.
ResultsIn country-level MGCFAs, physical-IPV items and controlling-behaviors items functioned equivalently in repeated survey administrations in 12 and 11 countries, respectively. In MGCFA testing cross-country, cross-time invariance in pooled samples, neither item set was strictly equivalent; however, the physical-IPV items were approximately invariant. Controlling-behaviors items did not show approximate cross-country and cross-time invariance in the full sample or the sub-sample showing country-level metric invariance.
ConclusionPhysical-IPV items approached approximate invariance across 20 countries and were approximately invariant in 11 countries with repeated cross-sectional surveys. Controlling-behaviors items were cross-time invariant within 11 countries but did not show cross-country, cross-time approximate invariance. Currently, the physical-IPV item set is more robust for monitoring progress toward SDG5.2.1, to end IPV against women. | public and global health |
10.1101/2022.04.07.22273414 | COVID-19 vaccine for people who live and work in prisons worldwide: A scoping review. | Overcrowding, poor conditions, and high population turnover make prisons highly susceptible to COVID-19. Vaccination is key to controlling COVID-19, yet there is disagreement regarding whether people who live and work in prisons should be prioritised in national vaccination programmes. To help resolve this, we critically examine the extent, nature, and quality of extant literature regarding prioritisation of COVID-19 vaccinations for people who live and work in prisons.
Using a scoping review as our methodological framework, we conducted a systematic literature search of 17 databases. From 2,307 potentially eligible articles, we removed duplicates and screened titles and abstracts to retain 45 articles for review and quality appraisal.
Findings indicated that while most countries recognise that prisons are at risk of high levels of COVID-19 transmission, only a minority have explicitly prioritised people who live and work in prisons for COVID-19 vaccination. Even among those that have, prioritisation criteria varies considerably. This is set against a backdrop of political barriers, such as politicians questioning the moral deservingness of people in prison; policy barriers, such as the absence of a unified international framework of how vaccine prioritisation should proceed in prisons; logistical barriers regarding vaccine administration in prisons; and behavioural barriers including vaccine hesitancy.
We outline five strategies to prioritise people who live and work in prisons in COVID-19 vaccination plans: (1) improving data collection on COVID-19 vaccination, (2) reducing the number of people imprisoned, (3) tackling vaccine populism through advocacy, (4) challenging arbitrary prioritisation processes via legal processes, and (5) conducting more empirical research on COVID-19 vaccination planning, delivery, and acceptability. Implementing these strategies would help to reduce the impact of COVID-19 on the prison population, prevent community transmission, improve vaccine uptake in prisons beyond the current pandemic, foster political accountability, and inform future decision-making. | public and global health |
10.1101/2022.04.06.22273508 | Determinants of facility-based childbirth among adolescents and young women in Guinea: a secondary analysis of the 2018 Demographic and Health Survey | IntroductionMaternal mortality remains very high in Sub-Saharan African countries and the risk is higher among adolescent girls. Maternal mortality occurs in these settings mainly around the time of childbirth and the first 24 hours after birth. Therefore, skilled attendance in an enabling environment is essential to reduce the occurrence of adverse outcomes for both women and their children. This study aims to analyze the determinants of facility childbirth among adolescents and young women in Guinea.
MethodsWe used the Guinea Demographic and Health Survey (DHS) conducted in 2018. All females who were adolescents (15 -19) or young women (20-24 years) at the time of their most recent live birth in the five years before the survey were included. We examined the use of health facilities for childbirth and its determinants using multivariable logistic regression, built through the Andersen health-seeking model.
ResultsOverall, 58% of adolescents and 57% of young women gave birth in a health facility. Young women were more likely to have used private sector facilities compared to adolescents (p<0.001). Factors significantly associated with a facility birth in multivariable regression included: secondary or higher educational level (aOR=1.81; 95%CI:1.20-2.64) compared to no formal education; receipt of 1-3 antenatal visits (aOR=8.93; 95%CI: 5.10-15.55) and 4+ visits (aOR=15.1; 95%CI: 8.50-26.84) compared to none; living in urban (aOR=2.13; 95%CI: 1.40-3.37) compared to rural areas. Women from the poorest households were least likely to give birth in health facilities. There was substantial variation in the likelihood of birth in a health facility by region, with the highest odds in N'Zerekore and the lowest in Labe.
ConclusionThe percentage of births in health facilities among adolescents and young women in Guinea increased since 2012 but remains suboptimal. Socio-economic characteristics, region of residence and antenatal care use were the main determinants of its use. Efforts to improve maternal health among this group should target care discontinuation between antenatal care and childbirth (primarily by removing financial barriers) and increasing the demand for facility-based childbirth services in communities, while paying attention to the quality and respectful nature of healthcare services provided there. | sexual and reproductive health |
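The adjusted odds ratios quoted above come from a multivariable logistic regression; the sketch below shows, on a placeholder data frame, how such aORs and their 95% confidence intervals are typically extracted with statsmodels. Column names, categories and reference levels are illustrative, and the survey weighting and clustering used in DHS analyses are omitted for brevity.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Placeholder data frame standing in for the DHS extract; all names and categories are illustrative.
rng = np.random.default_rng(1)
n = 2000
df = pd.DataFrame({
    "facility_birth": rng.integers(0, 2, n),
    "education": rng.choice(["none", "primary", "secondary_plus"], n),
    "anc_visits": rng.choice(["0", "1-3", "4+"], n),
    "residence": rng.choice(["rural", "urban"], n),
    "wealth": rng.choice(["poorest", "poorer", "middle", "richer", "richest"], n),
})

model = smf.logit(
    "facility_birth ~ C(education, Treatment('none')) + C(anc_visits, Treatment('0'))"
    " + C(residence, Treatment('rural')) + C(wealth, Treatment('poorest'))",
    data=df,
).fit(disp=False)

# Exponentiate coefficients to obtain adjusted odds ratios with 95% confidence intervals.
aor = pd.concat([np.exp(model.params), np.exp(model.conf_int())], axis=1)
aor.columns = ["aOR", "2.5%", "97.5%"]
print(aor.round(2))
```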
10.1101/2022.04.10.22273665 | Distinguishing Lewy Body Dementia from Alzheimer's Disease using Machine Learning on Heterogeneous Data: A Feasibility Study | Dementia with Lewy Bodies (DLB) is the second most common form of dementia, but diagnostic markers for DLB can be expensive and inaccessible, and many cases of DLB are undiagnosed. This work applies machine learning techniques to determine the feasibility of distinguishing DLB from Alzheimer's Disease (AD) using heterogeneous data features. The Repeated Incremental Pruning to Produce Error Reduction (RIPPER) algorithm was first applied using a Leave-One-Out Cross-Validation protocol to a dataset comprising DLB and AD cases. Then, interpretable association rule-based diagnostic classifiers were obtained for distinguishing DLB from AD. The various diagnostic classifiers generated by this process had high accuracy over the whole dataset (mean accuracy of 94%). The mean accuracy in classifying their out-of-sample case was 80.5%. Every classifier generated had a very simple structure, each using 1-2 classification rules and 1-3 data features. As a group, the classifiers were heterogeneous and used several different data features. In particular, some of the classifiers used very simple and inexpensive diagnostic features, yet with high diagnostic accuracy. This work suggests that opportunities may exist for incorporating accessible diagnostic assessments while improving the diagnostic rate for DLB. | neurology
Clinical RelevanceSimple and interpretable high-performing machine learning algorithms identified a variety of readily available clinical assessments for differential diagnosis of dementia, offering opportunities to incorporate various simple and inexpensive screening tests for DLB and to address the problem of DLB underdiagnosis. | neurology
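RIPPER is not part of scikit-learn, so the sketch below uses a depth-limited decision tree as a stand-in interpretable rule learner to illustrate the leave-one-out cross-validation protocol described above; the data, features and labels are placeholders, not the study's dataset.

```python
import numpy as np
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(2)
X = rng.normal(size=(60, 12))   # heterogeneous clinical features (placeholder data)
y = rng.integers(0, 2, 60)      # 0 = AD, 1 = DLB (placeholder labels)

# A depth-limited tree stands in for the RIPPER rule learner; max_depth=2 mirrors the
# very simple 1-2 rule / 1-3 feature classifiers described in the record above.
clf = DecisionTreeClassifier(max_depth=2, random_state=0)
scores = cross_val_score(clf, X, y, cv=LeaveOneOut())
print(f"LOOCV accuracy: {scores.mean():.2f} over {len(scores)} held-out cases")
```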
10.1101/2022.04.08.22273547 | Right hemispheric white matter hyperintensities improve the prediction of spatial neglect severity in acute stroke | White matter hyperintensities (WMH) are frequently observed in brain scans of elderly people. They are associated with an increased risk of stroke, cognitive decline, and dementia. However, it is unknown yet if measures of WMH provide information that improve the understanding of poststroke outcome compared to only state-of-the-art stereotaxic structural lesion data. We implemented high-dimensional machine learning models, based on support vector regression (SVR), to predict the severity of spatial neglect in 103 acute right hemispheric stroke patients. We found that (1) the additional information of right hemispheric voxel-based topographic WMH extent indeed yielded an improvement in predicting acute neglect severity (compared to the voxel-based stroke lesion map alone). (2) Periventricular WMH appeared more relevant for prediction than deep subcortical WMH. (3) Among different WMH measures, voxel-based maps as measures of topographic extent allowed more accurate predictions compared to the use of traditional ordinally assessed visual rating scales (Fazekas scale, Cardiovascular Health Study scale). In summary, topographic WMH appears to be a valuable clinical imaging biomarker for predicting the severity of cognitive deficits and bears great potential for rehabilitation guidance of acute stroke patients. | neurology |
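A minimal sketch of the prediction setup described above: support vector regression on flattened voxel-based maps, comparing a lesion-only feature set against lesion plus WMH features. The data are random placeholders, and the kernel, regularization and cross-validation scheme are assumptions rather than the authors' exact configuration.

```python
import numpy as np
from scipy.stats import pearsonr
from sklearn.model_selection import KFold, cross_val_predict
from sklearn.svm import SVR

rng = np.random.default_rng(3)
n_patients, n_voxels = 103, 4000
lesion = rng.integers(0, 2, size=(n_patients, n_voxels)).astype(float)  # binary stroke lesion maps (placeholder)
wmh = rng.integers(0, 2, size=(n_patients, n_voxels)).astype(float)     # binary WMH maps (placeholder)
severity = rng.normal(size=n_patients)                                   # spatial neglect severity (placeholder)

cv = KFold(n_splits=5, shuffle=True, random_state=0)
feature_sets = {"lesion only": lesion, "lesion + WMH": np.hstack([lesion, wmh])}
for name, X in feature_sets.items():
    pred = cross_val_predict(SVR(kernel="linear", C=1.0), X, severity, cv=cv)
    print(f"{name}: cross-validated r = {pearsonr(severity, pred)[0]:.2f}")
```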
10.1101/2022.04.06.22273539 | Racial/Ethnic Heterogeneity in Diet of Low-income Adult Women in the United States: Results from National Health and Nutrition Examination Surveys 2011-2018 | BackgroundPoor diet is a major risk factor of cardiovascular and chronic diseases, particularly for low-income women. However, the pathways by which race/ethnicity plays a role in this risk factor have not been fully explored.
ObjectiveThis observational study aims to identify dietary consumption differences by race/ethnicity of US women living at or below the 130% poverty income level from 2011-2018.
DesignA total of 3005 adult women aged 20-80 years from the National Health and Nutrition Examination Survey (2011-2018) living at or below the 130% poverty-income level with at least one complete 24-hr dietary recall were classified into 5 self-identified racial/ethnic subgroups (Mexican, Other Hispanic, Non-Hispanic White, Non-Hispanic Black, Non-Hispanic Asian). Dietary consumption patterns were defined by 29 major food groups summarized from the Food Pattern Equivalents Database and derived via a robust profile clustering model which identifies foods that share consumption patterns across all low-income adult women, and foods that differ in consumption patterns based on race/ethnic subgroups.
ResultsLegumes (protein and vegetable) were the most differentiating foods identified across all racial/ethnic subgroups and were primarily consumed by Mexican and Other Hispanic women. Non-Hispanic Asian women were most likely to favor a high consumption of prudent foods (fruits, vegetables, whole grains). Non-Hispanic White and Black women shared the most similarities in consumption patterns but differed in foods such as milk, poultry, and eggs.
ConclusionsDifferences among consumption behaviors of low-income women were found along racial/ethnic lines. Efforts to improve nutritional health of low-income adult women should consider racial/ethnic differences in diet to appropriately focus interventions.
DisclaimersN/A.
Sources of SupportStudy supported in part by National Heart Lung and Blood Institute (NHLBI) grant R25 (HL105400) to Victor G. Davila-Roman and DC Rao. | nutrition |
10.1101/2022.04.10.22273663 | The relation between COVID-19 vaccinations and public governance to improve preparedness of next pandemic impacts and crisis management: a global study | The goal of this study is to analyze the relationship between COVID-19 vaccinations and public governance by performing a global analysis of more than 110 countries worldwide. The methodology applies the Independent Samples T-Test, which compares the means of two independent groups (countries with high/low levels of vaccination) to determine whether there is statistical evidence that the associated population means of indicators of public governance are significantly different. Findings suggest that high levels of governance can support better functioning of health systems in the rollout of vaccinations to cope with the COVID-19 pandemic crisis. This study may assist long-run government policy to improve governance and health systems, reinforcing countries' preparedness for future pandemic threats and, more generally, for crisis management in society. | health policy
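A sketch of the core comparison described above: an independent samples t-test on a governance indicator between countries with high versus low vaccination levels. The values are placeholders, and the use of Welch's unequal-variance variant is an assumption rather than the study's stated choice.

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(4)
# Placeholder governance-indicator scores for the two country groups.
high_vaccination = rng.normal(loc=0.4, scale=1.0, size=60)
low_vaccination = rng.normal(loc=-0.2, scale=1.0, size=55)

# Welch's variant does not assume equal variances between the two groups of countries.
t_stat, p_value = ttest_ind(high_vaccination, low_vaccination, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```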
10.1101/2022.04.11.22272784 | Evolution of a globally unique SARS-CoV-2 Spike E484T monoclonal antibody escape mutation in a persistently infected, immunocompromised individual. | Prolonged infections in immunocompromised individuals may be a source for novel SARS-CoV-2 variants, particularly when both the immune system and antiviral therapy fail to clear the infection, thereby promoting adaptation. Here we describe an approximately 16-month case of SARS-CoV-2 infection in an immunocompromised individual. Following monotherapy with the monoclonal antibody Bamlanivimab, the individual's virus was resistant to this antibody via a globally unique Spike amino acid variant (E484T) that evolved from E484A earlier in infection. With the emergence and spread of the Omicron Variant of Concern, which also contains Spike E484A, E484T may arise again as an antibody-resistant derivative of E484A. | infectious diseases
10.1101/2022.04.10.22273627 | Germline variants associated with immunotherapy-related adverse events | Immune checkpoint inhibitors (ICIs) have yielded remarkable responses in patients across multiple cancer types, but often lead to immune related adverse events (irAEs). Although a germline cause for irAEs has been hypothesized, no systematic genome wide association study (GWAS) has been performed and no individual variants associated with the overall likelihood of developing irAEs have yet been identified. We carried out a Genome-Wide Association Study (GWAS) of 1,751 patients on ICIs across 12 cancer types, with replication in an independent cohort of 196 patients and independent clinical trial data from 2275 patients. We investigated two irAE phenotypes: (i) high-grade (3-5) events defined through manual curation and (ii) all detectable events (including high-grade) defined through electronic health record (EHR) diagnosis followed by manual confirmation. We identified three genome-wide significant associations (p<5x10-8) in the discovery cohort associated with all-grade irAEs: rs16906115 near IL7 (combined p=1.6x10-11; hazard ratio (HR)=2.1), rs75824728 near IL22RA1 (combined p=6.6x10-9; HR=1.9), and rs113861051 on 4p15 (combined p=1.3x10-8, HR=2.0); with rs16906115 replicating in two independent studies. The association near IL7 colocalized with the gain of a novel cryptic exon for IL7, a critical regulator of lymphocyte homeostasis. Patients carrying the IL7 germline variant exhibited significantly increased lymphocyte stability after ICI initiation than non-carriers, and this stability was predictive of downstream irAEs and improved survival.
DisclosuresD.A.B. reports nonfinancial support from Bristol Myers Squibb, honoraria from LM Education/Exchange Services, and personal fees from MDedge, Exelixis, Octane Global, Defined Health, Dedham Group, Adept Field Solutions, Slingshot Insights, Blueprint Partnerships, Charles River Associates, Trinity Group, and Insight Strategy, outside of the submitted work.
K.K. reports receiving honoraria from IBM and Roche.
M.M.A reports grants and personal fees from Genentech, grants and personal fees from Bristol-Myers Squibb, personal fees from Merck, grants and personal fees from AstraZeneca, grants from Lilly, personal fees from Maverick, personal fees from Blueprint Medicine, personal fees from Syndax, personal fees from Ariad, personal fees from Nektar, personal fees from Gritstone, personal fees from ArcherDX, personal fees from Mirati, personal fees from NextCure, personal fees from Novartis, personal fees from EMD Serono, personal fees from Panvaxal/NovaRx, outside the submitted work.
O.R. reports research support from Merck. Speaker for activities supported by educational grants from BMS and Merck. Consultant for Merck, Celgene, Five Prime, GSK, Bayer, Roche/Genentech, Puretech, Imvax, Sobi, Boehringer Ingelheim. Patent "Methods of using pembrolizumab and trebananib" pending.
S.A.S. reports nonfinancial support from Bristol-Myers Squibb, and equity in Agenus Inc., Agios Pharmaceuticals, Breakbio Corp., Bristol-Myers Squibb and Lumos Pharma.
T.K.C. reports research/advisory boards/consultancy/Honorarium (Institutional and personal, paid and unpaid): AstraZeneca, Aveo, Bayer, Bristol Myers-Squibb, Eisai, EMD Serono, Exelixis, GlaxoSmithKline, IQVA, Ipsen, Kanaph, Lilly, Merck, Nikang, Novartis, Pfizer, Roche, Sanofi/Aventis, Takeda, Tempest. Travel, accommodations, expenses, medical writing in relation to consulting, advisory roles, or honoraria. Stock options: Pionyr, Tempest. Other: Up-to-Date royalties, CME-related events (e.g.: OncLIve, PVI, MJH Life Sciences) honorarium. NCI GU Steering Committee. Patents filed, royalties or other intellectual properties (No income as of current date): related to biomarkers of immune checkpoint blockers and ctDNA. No speakers bureau.
Z.B. reports research support from the imCORE Network on behalf of Genentech, Inc. and Bristol-Myers Squibb. Honoraria from UpToDate. | genetic and genomic medicine |
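The hazard ratios reported above come from survival models of irAE onset; the study itself used time-dependent Cox regression on genome-wide data, so the sketch below is only a simplified illustration of how a hazard ratio for carriers of a candidate germline variant could be estimated with lifelines. All column names and values are placeholders.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(5)
n = 500
df = pd.DataFrame({
    "days_to_irae_or_censor": rng.exponential(180, n),  # follow-up time on ICI therapy (placeholder)
    "irae": rng.integers(0, 2, n),                       # 1 = irAE observed, 0 = censored (placeholder)
    "carrier": rng.integers(0, 2, n),                    # carrier of the candidate germline variant (placeholder)
    "age": rng.normal(62, 10, n),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="days_to_irae_or_censor", event_col="irae")
print(cph.hazard_ratios_)   # hazard ratio for each covariate, e.g. carrier vs non-carrier
```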
10.1101/2022.04.09.22273625 | Evaluating the efficacy and mechanism of metformin targets on reducing Alzheimers disease risk in the general population: a Mendelian randomization study | Aims/hypothesisMetformin use has been associated with reduced incident dementia in diabetic patients in observational studies. However, the causality between the two in the general population is unclear. This study uses Mendelian randomization (MR) to investigate the causal effect of metformin targets on Alzheimers disease (AD) and potential causal mechanisms in the brain linking the two.
MethodsGenetic proxies for the effects of metformin drug targets were identified as variants in the gene for the corresponding target that associated with HbA1c level (N=344,182) and expression level of the corresponding gene (N≤31,684). The cognitive outcomes were derived from genome-wide association studies comprising 527,138 middle-aged Europeans, including 71,880 AD or AD-by-proxy patients. MR estimates representing lifelong metformin use on AD and cognitive function in the general population were generated. The effect of the expression level of 22 metformin-related genes in brain cortex (N=6,601 donors) on AD was further estimated.
ResultsGenetically proxied metformin use equivalent to a 6.75 mmol/mol (1.09%) reduction of HbA1c was associated with 4% lower odds of AD (odds ratio [OR]=0.964, 95%CI=0.946-0.982, P=1.06x10-4) in non-diabetic individuals. One metformin target, mitochondrial complex 1 (MCI), showed a robust effect on AD (OR=0.88, P=4.73x10-4) that was independent of AMPK. MR of expression in brain cortex tissue showed that decreased expression of the MCI-related gene NDUFA2 was associated with reduced AD risk (OR=0.95, P=4.64x10-4) and less cognitive decline.
Conclusion/interpretationMetformin use is likely to cause reduced AD risk in the general population. Mitochondrial function and the NDUFA2 gene are likely mechanisms of action in dementia protection.
Research in context
What is already known about this subject?
- Metformin is an anti-diabetic drug with repurposing potential for dementia prevention.
- In a search of PubMed, Embase and clinicaltrials.gov, a few observational studies suggested the association of metformin use with reduced dementia incidence in diabetic patients.
What is the key question?
- What is the effect of genetically proxied metformin use on Alzheimers disease (AD) and cognitive function in the general population, especially for those without diabetes? Is the causal role between the two at least partly influenced by mechanisms in the brain?
What are the new findings?
- In a Mendelian randomization analysis of over 527,138 individuals (71,880 AD or AD-by-proxy cases), genetically proxied metformin use equivalent to a 6.75 mmol/mol (1.09%) reduction of HbA1c was associated with 14% lower odds of AD (odds ratio=0.86), where mitochondrial complex I is a key effect modifier.
- Expression level of a mitochondrial complex I related gene, NDUFA2, showed an effect on reducing AD risk and less cognitive decline in brain.
How might this impact on clinical practice in the foreseeable future?
- Our study predicts the efficacy of metformin on reducing AD risk and reducing cognitive decline in the general population, especially for those without diabetes.
- Mitochondrial function and a mitochondrial related gene, NDUFA2, could be considered as a novel drug target for dementia prevention.
Graphical abstract (figure not reproduced): effect of metformin targets reduced Alzheimers disease risk by 4% in non-diabetic individuals. | epidemiology
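A small worked sketch of the Wald-ratio logic that underlies this kind of drug-target Mendelian randomization: the variant's effect on the outcome is scaled by its effect on HbA1c and then re-expressed for a 6.75 mmol/mol reduction. The summary statistics below are illustrative placeholders, not the study's estimates.

```python
import numpy as np

# Illustrative per-allele summary statistics (placeholders, not the study's data).
beta_exposure = -0.05    # variant effect on HbA1c in mmol/mol (proxy for metformin's glycaemic action)
beta_outcome = -0.0003   # variant effect on the log-odds of Alzheimer's disease

# Wald ratio: outcome effect per unit change in the exposure.
wald = beta_outcome / beta_exposure   # log-odds of AD per 1 mmol/mol increase in HbA1c
log_or = -6.75 * wald                 # re-express for a 6.75 mmol/mol (1.09%) HbA1c reduction
print(f"OR for AD per 6.75 mmol/mol HbA1c reduction: {np.exp(log_or):.3f}")
```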
10.1101/2022.04.08.22273322 | Correlations between kidney and heart function bioindicators and the expressions of Toll-Like, ACE2, and NRP-1 receptors in COVID-19 | BackgroundCOVID-19 impacts the cardiovascular system, resulting in myocardial damage, and also affects the kidneys, leading to renal dysfunction. This effect is mediated mostly through binding to angiotensin-converting enzyme-2 (ACE2) and neuropilin-1 (NRP-1) receptors. Toll-like receptors (TLRs) typically recognize microbial pathogens and provoke an inflammatory response.
AimThis work aims to compare the changes in kidney and heart function bioindicators and the expressions of TLRs (TLR2 and TLR4) as well as ACE2 and NRP-1 receptors in moderate and severe COVID-19 patients. The correlations between kidney and heart function bioindicators and expressions of these receptors are also studied.
Patients and MethodsIn this study, 50 healthy controls and 100 COVID-19 patients (55 male and 45 female) were enrolled. According to WHO guidelines, these patients were divided into severe (50 cases) and moderate (50 cases) groups. Serum creatinine, blood urea, CKMB, LDH, and Troponin I were estimated. We measured the gene expression for Toll-like receptors (TLR2, TLR4), ACE2, and NRP-1 in the blood samples using quantitative real-time PCR (qRT-PCR).
ResultsIn comparison with the healthy group, all patients exhibited a significant elevation in the serum creatinine, blood urea, cardiac enzymes, and CRP. As well, all studied patients revealed a significant elevation in the expression levels of TLR2, TLR4, ACE2, and NRP-1 mRNA. In all patients, CKMB, ACE2, and NRP-1 mRNA expression levels were positively correlated to both TLR2 and TLR4 expression levels. Moreover, serum creatinine and blood urea were positively correlated to both TLR2 and TLR 4 expression levels in the severe group only.
ConclusionsOur study concluded that expression levels for TLR2, TLR4, ACE2, and NRP-1 mRNA in both severe and moderate patients were positively correlated with renal biomarkers and cardiac enzymes. Innate immune markers can be important because they correlate with the severity of illness in COVID-19. | infectious diseases |
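The record above reports relative mRNA expression measured by qRT-PCR; a common way to summarize such data is the 2^-ΔΔCt method, sketched below with placeholder Ct values. The choice of GAPDH as the reference gene and the numbers themselves are assumptions for illustration only, not the authors' stated protocol.

```python
# Placeholder Ct values; the use of GAPDH as the reference gene is an assumption for illustration.
ct = {
    "patient": {"TLR4": 24.1, "GAPDH": 18.0},
    "control": {"TLR4": 26.8, "GAPDH": 18.2},
}

delta_patient = ct["patient"]["TLR4"] - ct["patient"]["GAPDH"]   # ΔCt in the COVID-19 patient
delta_control = ct["control"]["TLR4"] - ct["control"]["GAPDH"]   # ΔCt in the healthy control
delta_delta = delta_patient - delta_control                      # ΔΔCt
fold_change = 2 ** (-delta_delta)                                # relative TLR4 expression
print(f"TLR4 fold change vs control: {fold_change:.2f}")
```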
10.1101/2022.04.06.22273515 | A Method to improve the Reliability of Saliency Scores applied to Graph Neural Network Models in Patient Populations | Graph Neural Networks (GNNs), a novel method for recognizing features in heterogeneous information structures, have recently been used to model patients with similar diagnoses, extract relevant features and thereby predict, for instance, medical procedures and therapies. For applications in the medical field, it is important to leverage the interpretability of GNNs and evaluate which model inputs contribute to the model outputs, information that is useful for analyzing correlations between diagnoses and therapies in large datasets. In this work we present a method to sample the saliency scores of GNN models computed with three different attribution methods: gradient, integrated gradients, and DeepLIFT. The resulting sample of scores is considered reliable if and only if all three methods converge. This approach informs users of the degree of confidence and interpretability of the predictions obtained with GNN models. | health informatics
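A sketch of one way the convergence check described above could be implemented: accept a set of saliency scores as reliable only if the gradient, integrated-gradients and DeepLIFT attributions agree pairwise. The rank-correlation criterion and its threshold are assumptions, and the score vectors are random placeholders rather than outputs of an actual GNN.

```python
import numpy as np
from itertools import combinations
from scipy.stats import spearmanr

def scores_converge(score_sets, min_rho=0.8):
    """Accept the attributions only if every pair of score vectors agrees (rank correlation >= min_rho)."""
    rhos = [spearmanr(a, b)[0] for a, b in combinations(score_sets, 2)]
    return all(r >= min_rho for r in rhos), rhos

# Placeholder saliency profiles standing in for the outputs of the three attribution methods.
rng = np.random.default_rng(6)
base = rng.normal(size=200)
gradient = base + rng.normal(scale=0.1, size=200)
integrated_gradients = base + rng.normal(scale=0.1, size=200)
deeplift = base + rng.normal(scale=0.1, size=200)

reliable, rhos = scores_converge([gradient, integrated_gradients, deeplift])
print("reliable" if reliable else "not reliable", [round(r, 2) for r in rhos])
```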
10.1101/2022.04.05.22273075 | Persistent racial disparities in deep brain stimulation for Parkinson's disease | We sought to determine whether racial and socio-economic disparities in the utilization of deep brain stimulation (DBS) for Parkinson's disease (PD) have improved over time. We examined DBS utilization and analyzed factors associated with placement of DBS. The odds of DBS placement increased across the study period, while White PD patients were 5 times more likely than Black patients to undergo DBS. Individuals with two or more comorbidities, regardless of racial background, were 14 times less likely to undergo DBS. Privately insured patients were 1.6 times more likely to undergo DBS. Despite increasing DBS utilization, significant disparities persist in access to DBS. | health systems and quality improvement
10.1101/2022.04.11.22273327 | Effectiveness of the BNT162b vaccine fourth dose in reducing SARS-CoV-2 infection among healthcare workers in Israel, a multi-center cohort study | During December 2021 the fifth COVID-19 wave started in Israel, caused mostly by the Omicron variant, affecting the unvaccinated and vaccinated population. Ninety percent of the Israeli adults, including most healthcare workers (HCWs), received three doses of the BNT162b2 vaccine until September 2021. Following the success and safety of the 3rd dose in preventing infection and severe disease, on December 30, 2021, the Israeli Ministry of Health recommended a voluntary 4th vaccine dose to adults above 60 years, immunocompromised, and HCWs. We compared breakthrough infections in HCWs, between 3 and 4-dose recipients.
Hospitals collected data on personnel vaccinations and infections dates. The study cohort included all HCWs in eleven hospitals in Israel, who have been vaccinated with three doses up to September 30, 2021, and had not contracted COVID-19 before the vaccination campaign (January 2, 2022).
We calculated breakthrough infection rates in 4-dose recipients (more than six days after vaccination) vs. 3-dose recipients. Rate-ratios were calculated for the entire cohort and for subgroups (hospital, sex, age-groups, and profession). Additionally, we repeated the calculations on 4-dose and 3-dose recipients who received the 3rd dose on the same date and were matched for sex, age group, profession and hospital. We generated time-dependent Cox-regression models to account for 4th dose administration timing (Supplement).
There were 29,612 HCWs who received 3 vaccine doses between August and September 2021; of these, 5,331 (18.0%) received the 4th dose during January 2022 and were not infected by the first week after vaccination. Overall breakthrough infection rates in the 4-dose and 3-dose groups were 368/5331 (6.9%) and 4802/24280 (19.8%), respectively. The RR (95%CI) was 0.35 (0.32 to 0.39) for crude analysis, and 0.61 (0.54 to 0.71) in the matched analysis. The adjusted HR in the Cox-regression model was 0.56 (0.50 to 0.63). In both groups, severe disease and death were not reported.
Our data shows that the 4th BNT162b2 dose resulted in reduced breakthrough infection rates among HCWs. This reduction, similar to the findings in the Israeli elderly population, is lower than that observed after the 3rd dose.
Nevertheless, considering the high infectivity of the Omicron variant, which led to critical medical staff shortages, a 4th vaccine dose should be considered to mitigate the infection rate among HCWs. | infectious diseases |
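A small check that reproduces the crude rate ratio quoted above directly from the reported counts (368/5,331 four-dose vs 4,802/24,280 three-dose breakthrough infections), using the standard log-risk-ratio standard error; the matched and Cox-adjusted estimates require individual-level data and are not reproduced here.

```python
import math

a, n1 = 368, 5331      # breakthrough infections among 4-dose recipients
b, n2 = 4802, 24280    # breakthrough infections among 3-dose recipients

rr = (a / n1) / (b / n2)
se_log_rr = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)
lower = math.exp(math.log(rr) - 1.96 * se_log_rr)
upper = math.exp(math.log(rr) + 1.96 * se_log_rr)
print(f"RR = {rr:.2f} (95% CI {lower:.2f} to {upper:.2f})")   # ~0.35 (0.32 to 0.39)
```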
10.1101/2022.04.11.22273617 | Identifying type 1 and 2 diabetes in population level data: assessing the accuracy of published approaches | AimsPopulation datasets are increasingly used to study type 1 or 2 diabetes, and inform clinical practice. However, correctly classifying diabetes type, when insulin treated, in population datasets is challenging. Many different approaches have been proposed, ranging from simple age or BMI cut offs, to complex algorithms, and the optimal approach is unclear. We aimed to compare the performance of approaches for classifying insulin treated diabetes for research studies, evaluated against two independent biological definitions of diabetes type.
MethodWe compared accuracy of thirteen reported approaches for classifying insulin treated diabetes into type 1 and type 2 diabetes in two population cohorts with diabetes: UK Biobank (UKBB) n=26,399 and DARE n=1,296. Overall accuracy and predictive values for classifying type 1 and 2 diabetes were assessed using: 1) a type 1 diabetes genetic risk score and genetic stratification method (UKBB); 2) C-peptide measured at >3 years diabetes duration (DARE).
ResultsAccuracy of approaches ranged from 71%-88% in UKBB and 68%-88% in DARE. All approaches were improved by combining with requirement for early insulin treatment (<1 year from diagnosis). When classifying all participants, combining early insulin requirement with a type 1 diabetes probability model incorporating continuous clinical features (diagnosis age and BMI only) consistently achieved high accuracy, (UKBB 87%, DARE 85%). Self-reported diabetes type alone had high accuracy (UKBB 87%, DARE 88%) but was available in just 15% of UKBB participants. For identifying type 1 diabetes with minimal misclassification, using models with high thresholds or young age at diagnosis (<20 years) had the highest performance. An online tool developed from all UKBB findings allows the optimum approach of those tested to be selected based on variable availability and the research aim.
ConclusionSelf-reported diagnosis and models combining continuous features with early insulin requirement are the most accurate methods of classifying insulin treated diabetes in research datasets without measured classification biomarkers. | endocrinology |
10.1101/2022.04.12.22273568 | Estimating the epidemiology of chronic Hepatitis B Virus (HBV) infection in the UK: what do we know and what are we missing? | BackgroundHBV is the leading global cause of cirrhosis and primary liver cancer. The virus's attributable disease burden in the UK is concentrated in vulnerable populations, including ethnic minorities, people experiencing homelessness and people born in high-prevalence countries. Despite this, the UK HBV population has not been well characterised, and estimates of UK HBV prevalence and/or positivity rate vary widely across sources. We summarised datasets that are available to represent UK chronic hepatitis B (CHB) epidemiology, consider differences between sources, and discuss deficiencies in current estimates.
MethodsWe searched for estimates of CHB case numbers in the UK (incorporating incidence and/or prevalence-like data) across a range of available sources, including UK-wide reports from government bodies, publications from independent bodies (including medical charities and non-governmental organisations) and articles in peer-reviewed scientific journals. We present positivity rates from each respective data source but caution that estimates may not be representative of the true UK-wide population prevalence.
Results and DiscussionSix CHB case number estimates were identified, with three estimates reporting information concerning population subgroups, including the number of infected individuals across age, sex and ethnicity categories. Estimates among sources reporting prevalence varied from 0.27% to 0.73%. An alternative proxy for population prevalence (obtained via the UK antenatal screening programme, which achieves over 95% coverage of pregnant women) estimated a CHB prevalence of <0.5%. Estimates varied in their sources of error, bias and missingness, data linkage, and substantial "blind spots" in consistent testing and registration of HBV diagnoses. Multi-parameter evidence synthesis and back-calculation modelling methods similar to those used to generate estimates of HCV and HIV population-wide prevalence may be applicable to HBV. | epidemiology
10.1101/2022.04.12.22273752 | Protection conferred by vaccine plus previous infection (hybrid immunity) with vaccines of three different platforms during the Omicron variant period in Brazil | Hybrid immunity (infection plus vaccination) provided high protection against infection and severe disease in the periods of the delta and gamma variants of concern. However, the protection of hybrid immunity in the Omicron era remains unknown. We performed a test-negative study using Brazilian national databases between January 01 and March 22, 2022, a period of predominant circulation of the Omicron variant in Brazil. Hybrid immunity offered low protection against infection, with rapid waning, compared to unvaccinated individuals with or without previous infection. For severe illness (hospitalisation or death), the protection, although already high for unvaccinated previously infected individuals, increased further with vaccination regardless of the type of vaccine (Ad26.COV2.S, BNT162b2, ChAdOx-1 or CoronaVac).
In conclusion, during the Omicron-dominant period in Brazil, hybrid immunity offered high protection against severe illness and low protection against infection. | epidemiology |
10.1101/2022.04.11.22273166 | Evaluation of Outdoor swimming courses as an intervention to refresh and revitalise NHS workers | BackgroundFrontline healthcare staff working in the National Health Service (NHS) have been, and continue to be, under a significant level of work-related stress as a result of the COVID-19 pandemic. Long hours and greater clinical need have impacted negatively on work-life balance. The results of our preliminary studies indicate that outdoor swimming may be an effective treatment for anxiety and depression. We therefore hypothesised that the activity could improve symptoms of work-related burnout and stress in NHS workers. The primary objective of this study was to gather feedback from NHS staff participating in supervised swimming sessions, held in an outdoor pool in London and in the sea in Cornwall, on the perceived value and effectiveness of this initiative.
MethodsFollowing ethical approval (University of Portsmouth Science and Health research ethics committee SHFEC 2021-066), participants who had signed up to outdoor swimming courses provided by NHS Improvement in Cornwall and London were asked to give their consent to participate in an online survey. They were asked to complete them at three time-points: the week prior to, upon completion and six weeks after completion of the outdoor swimming course. As well as being asked for qualitative feedback, participants completed the Short Warwick-Edinburgh Mental Wellbeing Scale and the Copenhagen Burnout Inventory.
Results85 (63.9%) of the 133 Participants who signed up to outdoor swimming courses completed the first survey, 62 (49.6%) the second and 43 (35.5%) the third. 41 (33.8%) completed all three surveys. Overall, there was a 14.8% increase in wellbeing scores when comparing the scores before and after the courses which was statistically significant (p<0.0001, d= 1.02). Compared to scores before the course, the scores at its conclusion were reduced by 25%, 18% and 18% in personal, work-related and client-related burnout respectively. These burnout scores were significantly different for personal (P<0.0001) and work-related burnout (P=0.0018). Qualitative feedback was overwhelmingly positive with the effects being broadly divided into those relating to mood and physical health, the social aspects of the group activity, feelings of achievement and self-care and mindfulness.
ConclusionThis research suggests that the outdoor swimming activity, as a workplace intervention, can be an effective way of promoting staff wellbeing and reducing personal and work-related burnout. Further, formal trials of this intervention are justified. | psychiatry and clinical psychology |
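The record above reports a pre/post comparison of wellbeing scores with an effect size of d=1.02; the sketch below shows how a paired comparison and a paired-samples effect size can be computed. The data are placeholders, and whether the authors used exactly this test and this formula for d is an assumption.

```python
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(7)
before = rng.normal(21, 4, size=62)          # wellbeing scores before the course (placeholder data)
after = before + rng.normal(3, 3, size=62)   # wellbeing scores after the course (placeholder data)

t_stat, p_value = ttest_rel(after, before)
diff = after - before
cohens_d = diff.mean() / diff.std(ddof=1)    # effect size for paired samples
pct_change = 100 * diff.mean() / before.mean()
print(f"t = {t_stat:.2f}, p = {p_value:.2e}, d = {cohens_d:.2f}, change = {pct_change:.1f}%")
```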
10.1101/2022.04.07.22273527 | Extracellular vesicle-bound DNA in urine is indicative of kidney allograft injury | Extracellular vesicle-bound DNA (evDNA) is an understudied extracellular vesicle (EV) cargo, particularly in cancer-unrelated fundamental and biomarker research. Although evDNA has been detected in urine, little is known about its characteristics, localization, and biomarker potential for kidney pathologies. To address this, we enriched EVs from the urine of well-characterized kidney transplant recipients undergoing allograft biopsy, and characterized their evDNA and its association with allograft injury. Using DNase treatment and immunogold labelling transmission electron microscopy (TEM), we show that DNA is bound to the surface of urinary EVs. Although urinary evDNA and cell-free DNA correlated in several characteristics, the DNA integrity index showed that evDNA was less fragmented (P < 0.001). Urinary EVs from patients with rejection and non-rejection allograft injury were significantly larger (mean: P = 0.045, median: P = 0.031) and bound more DNA, as measured by normalized evDNA yield (P = 0.018) and evDNA copy number (P = 0.007), compared to patients with normal histology. Urinary evDNA characteristics were associated with the degree of interstitial inflammation, combined glomerulitis and peritubular capillaritis, and inflammation in areas of fibrosis (all P < 0.050). The normalized dd-evDNA copy numbers differed between antibody- and T cell-mediated rejection (P = 0.036). Our study supports the importance of DNA as a urine EV cargo, especially as a potential non-invasive kidney allograft injury biomarker. | transplantation
10.1101/2022.04.11.22273677 | Germline sequencing of DNA-damage-repair genes in two hereditary prostate cancer cohorts reveals rare risk-associated variants | BackgroundRare, inherited variants in DNA damage repair (DDR) genes play an important role in prostate cancer (PrCa) susceptibility.
ObjectiveTo interrogate two independent high-risk familial PrCa datasets to identify rare DDR variants that contribute to disease risk.
DesignMassively parallel sequencing data from Australian and North American familial PrCa datasets were examined for rare, likely deleterious variants in 35 DDR genes. Putative high-risk variants were prioritised based on frequency (minor allele frequency <1%), mutation type (nonsense, missense, or splice), segregation with disease, and in silico predicted deleteriousness. Six prioritised variants were genotyped in a total of 1,963 individuals (700 familial and 459 sporadic PrCa cases, 482 unaffected relatives and 322 screened controls) and MQLS association analysis performed.
Results and LimitationsStatistically significant associations between PrCa risk and rare variants in ERCC3 (rs145201970, p=2.57x10-4) and BRIP1 (rs4988345, p=0.025) were identified in the combined Australian and North American datasets. A variant in PARP2 (rs200603922, p=0.028) was significantly associated with risk in the Australian dataset alone, while a variant in MUTYH (rs36053993, p=0.031) was significantly associated with risk in the North American dataset. Putative pathogenic variants may have been missed due to their very low frequency in the datasets, which precluded statistical evaluation.
ConclusionsOur study implicates multiple rare germline DDR variants in PrCa risk, whose functional and/or biological effects and role in inherited risk in other PrCa cohorts should be evaluated. These findings will significantly impact the use of gene-based therapies targeting the DDR pathway in PrCa patients.
Patient SummaryHere, we looked at genetic changes in a group of genes involved in DNA repair, as testing for such genetic changes is proving important in PrCa diagnosis and treatment. We report, for the first time, several new genetic changes in these genes associated with PrCa. | urology |
10.1101/2022.04.11.22273639 | Influence of immunosuppressive regimen on diffusivity and oxygenation of kidney transplants: Analysis of functional MRI data from the randomized ZEUS trial | The ZEUS study was a multi-center randomized controlled trial investigating the effect of an early conversion from ciclosporin-based to an everolimus-based regimen on graft function twelve months post-transplantation. In this investigator-initiated sub-study, functional magnetic resonance imaging (fMRI) of kidney grafts was prospectively performed to non-invasively assess differences in graft oxygenation, diffusion and perfusion between groups and time-points using diffusion-weighted imaging (DWI) and blood oxygen level-dependent (BOLD)-MRI. Sixteen patients underwent DWI and BOLD-MRI at months 4.5 and 12 post-transplantation on a 3 Tesla and 1.5 Tesla (n=3) MR scanner. After exclusion due to image quality, outlier values or missing data, DWI was analyzed in ten, BOLD in eight subjects. The diffusion coefficient ADCD decreased in the CsA-group over time, whereas it increased in the EVE-group (p=0.046, medulla). The change in ADCD from month 4.5 to 12 significantly differed between groups in cortex (p=0.033) and medulla (p=0.019). In BOLD, cortico-medullary transverse relaxation rate R2* increased (decreased tissue oxygen) in the CsA-treated and decreased in the EVE-treated group over time. Similarly, R2* values at month 12 were higher in the CsA- vs. EVE-treated group. There was no significant difference for the perfusion fraction FP. In conclusion, this prospective sub-study of the ZEUS trial suggests an impact of immunosuppressive regimen on fMRI parameters of the kidney graft. | nephrology |
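A minimal sketch of the monoexponential fit behind an apparent diffusion coefficient, for orientation only: the study's ADCD and perfusion fraction FP come from a richer intravoxel-incoherent-motion-type analysis of the DWI data, so this simplification and the placeholder b-values and signals should not be read as the authors' method.

```python
import numpy as np

# Placeholder b-values (s/mm^2) and ROI-averaged diffusion-weighted signals.
b_values = np.array([0, 200, 400, 600, 800], dtype=float)
signal = np.array([1000.0, 640.0, 415.0, 270.0, 175.0])

# Monoexponential model S(b) = S0 * exp(-b * ADC): fit log-signal against b with a straight line.
slope, intercept = np.polyfit(b_values, np.log(signal), 1)
adc = -slope   # in mm^2/s when b is in s/mm^2
print(f"ADC ~ {adc:.2e} mm^2/s")
```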
10.1101/2022.04.11.22273736 | Neurodevelopmental, cognitive, behavioural and mental health impairments following childhood malnutrition: a systematic review | BackgroundSevere childhood malnutrition impairs growth and development short-term, but current understanding of long-term outcomes is limited. We aimed to identify studies assessing neurodevelopmental, cognitive, behavioural and mental health outcomes following childhood malnutrition.
MethodsWe systematically searched MEDLINE, EMBASE, Global Health and PsychINFO for studies assessing these outcomes in those exposed to childhood malnutrition in low- and middle-income settings. We included studies assessing undernutrition measured by low mid-upper arm circumference, weight-for-height, weight-for-age or nutritional oedema. We used guidelines for synthesis of results without meta-analysis to analyse three outcome areas: neurodevelopment, cognition/academic achievement, behaviour/mental health.
ResultsWe identified 30 studies, including some long-term cohorts reporting outcomes through to adulthood. There is strong evidence that malnutrition in childhood negatively impacts neurodevelopment based on high-quality studies using validated neurodevelopmental assessment tools. There is also strong evidence that malnutrition impairs academic achievement with agreement across seven studies investigating this outcome. 8 of 11 studies showed association between childhood malnutrition and impaired cognition. This moderate evidence is limited by some studies failing to measure important confounders such as socioeconomic status. 5 of 7 studies found a difference in behavioural assessment scores in those exposed to childhood malnutrition compared to controls but this moderate evidence is similarly limited by unmeasured confounders. Mental health impacts were difficult to ascertain due to few studies with mixed results.
ConclusionsChildhood malnutrition is associated with impaired neurodevelopment, academic achievement and cognition, and with behavioural problems, but evidence regarding possible mental health impacts is inconclusive. Future research should explore the interplay of childhood and later-life adversities on these outcomes. Whilst evidence on improving nutritional and clinical therapies to reduce long-term risks is also needed, preventing and eliminating child malnutrition is likely to be the best way of preventing long-term neurocognitive harms.
PROSPERO registration numberCRD42021260498
KEY QUESTIONS
What is already known?
- High mortality risk and impaired growth are well-recognised short-term risks of childhood malnutrition.
- Whilst there is increasing appreciation of longer-term risks for survivors, notably adult cardiometabolic non-communicable disease, other longer-term risks have been poorly described.
What are the new findings?
- There is strong evidence that malnutrition impairs neurodevelopment and academic achievement in childhood, which has significant implications for the future wellbeing and prospects of those affected.
- Childhood malnutrition is associated with impaired cognition and behavioural problems, with evidence of effects through to adolescence and adulthood, but the effect of nutritional treatment and interplay with childhood adversity, co-existing illness such as HIV and environmental factors in influencing these outcomes is unclear.
What do the new findings imply?
- Study findings imply that there are likely to be long-term effects of childhood malnutrition on cognition and wellbeing lasting through adolescence and adulthood.
- Long-term needs of malnutrition survivors need to be carefully considered in treatment programmes. Further research is needed on the effects of nutritional therapy, adversity and environmental factors to tailor future interventions, particularly with regards to mental health, which has been little researched. | nutrition
10.1101/2022.04.12.22273805 | Declining National Codeine Distribution in United States Hospitals and Pharmacies from 2011 to 2019 | BackgroundPast research has identified pronounced regional disparities in use of different opioids but less is known for codeine. The primary objective of this study was to analyze the trends of distribution of prescriptions containing codeine in the United States (US) from 2010 to 2019. In addition, this study aimed to identify regional disparities in prescribed milligrams of codeine per person in 2019 and identify any unusual states.
MethodsThe distribution of codeine via pharmacies, hospitals, and practitioners in kilograms was obtained from the Drug Enforcement Administration's Automated Reports and Consolidated Ordering System (ARCOS) from 2010 to 2019. In addition, the number of prescriptions of codeine per 1,000 Medicaid enrollees was obtained from the State Drug Utilization Database.
ResultsThe total grams of codeine decreased (-25.0%) across all distributors from 2010 to 2019. The largest increase in total grams of codeine distributed between two consecutive years (2014 to 2015) was +28.9%. For a given distributor type, the largest decrease from 2010 to 2019 was for hospitals (-89.6%). In 2019, the total mg of codeine per person distributed in Texas (11.46) was significantly higher relative to the national average (3.06, 1.88 SD). Codeine prescriptions to Medicaid patients peaked in the third quarter of 2016.
ConclusionThe peak of prescription codeine in 2011 was consistent with the overall peak in prescription opioids, with a subsequent decrease over the decade. This could be explained by relatively recent recommendations regarding the therapeutic use of codeine and how other antitussive agents may be of better use. The precipitous rise of codeine in Texas that we observed has been recognized in prior studies. These state-level disparities warrant further attention by opioid stewardship committees. | pain medicine |
10.1101/2022.04.11.22273727 | White matter hyperintensity load varies depending on definition of subjective cognitive decline | BackgroundIncreased age and cognitive impairment are associated with an increase in cerebrovascular pathology, often seen as increased signal in T2-weighted and FLAIR structural MRIs and known as white matter hyperintensities (WMHs). Whether WMHs differ between healthy older adults with subjective cognitive decline (SCD+) and without subjective cognitive decline (SCD-) remains unclear, with conflicting findings. The goal of this study was to examine if four common methods of defining SCD provide different patterns of WMH accumulation between SCD+ and SCD-.
MethodsA total of 535 cognitively normal older adults with 1353 time points from the Alzheimers Disease Neuroimaging Initiative were included in this study. SCD was operationalized using four different methods: Cognitive Change Index (CCI), Everyday Cognition Scale (ECog), ECog+Worry, and Worry. Linear mixed-effects models were used to investigate the associations between SCD status and overall and regional WMH burden at baseline and over time.
ResultsNo differences in WMH burden were found at baseline for any SCD definition; however, SCD+ vs. SCD- WMH group differences over time varied based on the SCD definition used. Greater change in WMH burden over time was observed in SCD+ compared to SCD- only in the temporal and parietal regions using the CCI (temporal, p=.01; parietal p=.03) and ECog (temporal, p=.03; parietal p=.01). For both the ECog+Worry and Worry definitions, change in WMH burden over time was increased in SCD+ compared to SCD- for overall, frontal, temporal, and parietal WMH burden (p<.05).
ConclusionThe results observed here show that WMH burden differences in healthy older adults with and without SCD depend on the SCD definition used and the approach used to measure WMHs. The various methods used to define SCD may reflect different types of dementia. | geriatric medicine |
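A sketch of the linear mixed-effects structure used above to test whether WMH change over time differs by SCD status: a random intercept per participant and an SCD-by-time interaction term. Column names, the simulated data and the random-effects specification are illustrative assumptions, not the authors' exact model.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(8)
n_subj, n_visits = 100, 3
df = pd.DataFrame({
    "subject": np.repeat(np.arange(n_subj), n_visits),
    "time": np.tile(np.arange(n_visits), n_subj).astype(float),   # years from baseline (placeholder)
    "scd": np.repeat(rng.integers(0, 2, n_subj), n_visits),       # 1 = SCD+, 0 = SCD- (placeholder)
})
df["log_wmh"] = 1.0 + 0.10 * df["time"] + 0.05 * df["scd"] * df["time"] + rng.normal(0, 0.2, len(df))

# Random intercept per participant; the scd:time interaction tests group differences in WMH change over time.
model = smf.mixedlm("log_wmh ~ scd * time", df, groups=df["subject"]).fit()
print(model.summary())
```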
10.1101/2022.04.08.22273439 | Health Economic Burden of COVID-19 in Saudi Arabia | BackgroundThe coronavirus disease 2019 (COVID-19) pandemic has placed a massive economic burden on health care systems worldwide. Saudi Arabia is one of the numerous countries that have been economically affected by this pandemic. The objective of this study was to provide real-world data on the health economic burden of COVID-19 on the Saudi health sector and assess the direct medical costs associated with the management of COVID-19.
MethodsA retrospective cohort study was conducted based on data collected from patients hospitalized with COVID-19 across ten institutions in eight different regions in Saudi Arabia. The study calculated the estimated costs of all cases during the study period by using direct medical costs. These costs included costs directly related to medical services, such as the health care treatment, hospital stays, laboratory investigations, treatment, outcome, and other related care.
ResultsA total of 5,286 adult patients admitted with COVID-19 during the study period were included in the study. The average age of the patients was 54 years, and the majority were male. Among the COVID-19 patients hospitalized in a general ward, the median hospital length of stay was 5.5 days (mean: 9.18 days), while the median ICU stay was 4.26 days (mean: 7.94 days). The total medical costs for general ward and ICU patients were 14,585,640 SAR and 90,776,250 SAR, respectively. Laboratory investigations ranked as the highest-cost service (22,086,296 SAR), followed by treatment (14,574,233.1 SAR). Overall, the total cost of all medical services for patients hospitalized with COVID-19 was 193,394,103.1 SAR.
ConclusionThis national study found that COVID-19 was not only a serious concern for patients but also a serious economic burden on the health care system in Saudi Arabia.
Key points
- The nursing costs and length of stay were lower in the ICU than in the general ward.
- The costs of hospitalization in general medical wards were less than those of admission to the ICU.
- These cost data will be valuable for future researchers evaluating the COVID-19 pandemic's increasing health care economic burden in Saudi Arabia and the implementation of cost-effective models to assess the possible implications of COVID-19 prevention and treatment initiatives. | health economics
10.1101/2022.04.06.22273523 | Privacy Risks of Whole-Slide Image Sharing in Digital Pathology | Access to large volumes of so called whole-slide images, high-resolution scans of complete pathological slides, has become a cornerstone of development of novel artificial intelligence methods in digital pathology, but has broader impact for medical research and education/training. However, a methodology based on risk analysis for sharing such imaging data and applying the principle "as open as possible and as closed as necessary" is still lacking. In this article we develop a model for privacy risk analysis for whole-slide images, which focuses primarily on identity disclosure attacks, as these are the most important from a regulatory perspective. We develop a mathematical model for risk assessment and design a taxonomy of whole-slide images with respect to privacy risks. Based on these risk assessment model and the taxonomy, we design a series of experiments to demonstrate the risks on real-world imaging data. Finally, we develop guidelines for risk assessment and recommendations for data sharing based on identified risks to promote low-risk sharing of the data. | health informatics |
10.1101/2022.04.05.22273443 | Semi-automated socio-anthropologic analysis of the medical discourse on rheumatoid arthritis: potential impact in public health | The debilitating effects of non-communicable diseases (NCDs) and the accompanying chronic inflammation represent a significant obstacle to sustainable development, and efforts to curb the spread of NCDs are underway worldwide, in line with the United Nations Sustainable Development Goals (SDG 3). Yet despite efforts of varying intensity in numerous directions (from innovations in biotechnology to lifestyle modification), the incidence of NCDs remains at pandemic levels.
The present work aims to contribute to this major concern with a specific focus on the fragmentation of medical approaches, via an interdisciplinary analysis of the medical discourse, i.e., the heterogeneous reporting that the biomedical scientific literature uses to describe the anti-inflammatory therapeutic landscape in NCDs. The aim is to better capture the roots of this compartmentalization and the power relations existing among three segregated biomedical approaches (pharmacological, experimental and unstandardized), ultimately to foster collaboration across medical specialties and possibly unlock a broader and more effective reservoir of integrated therapeutic opportunities.
Using rheumatoid arthritis (RA) as an exemplar disease, twenty-eight articles were each manually translated into a nine-dimensional categorical variable of medical socio-anthropological relevance, relating in particular (but not only) to legitimacy, temporality and spatialization. This digitized picture (9 x 28 table) of the medical discourse was further analyzed with simple machine learning approaches to identify differences and highlight commonalities among the biomedical categories.
Interpretation of these results gives original insights, including the suggestions to: empower scientific communication between unstandardized approaches and basic biology; promote the repurposing of non-pharmacological therapies to enhance the robustness of experimental approaches; and align the spatial representation of diseases and therapies in pharmacology to effectively embrace the systemic approach promoted by modern personalized and preventive medicine. We hope this original work may expand and foster interdisciplinarity among public health stakeholders, ultimately contributing to the achievement of SDG 3. | health policy
10.1101/2022.04.09.22273541 | Blood Draw Site and Analytic Device Influence Hemoglobin Measurements | Anemia is a continuing global public health concern and a priority for international action. The prevalence of anemia is estimated from the hemoglobin (Hb) levels within target populations, yet the procedures for measuring Hb are not standardized and different approaches may result in discrepancies. Several analytical variables have been proposed to influence Hb measurements, but it is difficult to understand the impact of specific variables from large population or field studies. Therefore, we designed a highly controlled protocol that minimized most technical parameters to specifically investigate the impact of blood draw site and analytic device on Hb measurements. A diverse cohort of sixty healthy adults each provided a sequential capillary and venous blood sample that were measured for Hb using an automated hematology analyzer (ADVIA-2120) and two point-of-care devices (HemoCue 201+ and HemoCue 301). Comparing blood draw sites, the mean Hb content was 0.32-0.47 g/dL (2-4%) higher in capillary compared to venous blood from the same donors. Comparing different Hb measuring instruments, the mean Hb content was 0.19-0.46 g/dL (1-4%) higher measured with HemoCue devices compared to ADVIA-2120 in both capillary and venous blood from the same donors. The maximum variance in measurement was also higher with HemoCue devices using blood from venous (5-6% CV) and capillary (21-25% CV) sites compared to ADVIA-2120 (0.6-2% CV). Other variables, including blood collection tube manufacturer, did not affect mean Hb content. These results demonstrate that even when most technical variables are minimized, the blood draw site and the analytical device can have a small but statistically significant effect on the mean and dispersion of Hb measurements. Even in this study, the few participants identified as mildly anemic using venous blood measured by ADVIA-2120 would not have been classified as anemic using their capillary blood samples or point-of-care analyzers. Thus, caution is warranted when comparing Hb values between studies having differences in blood draw site and Hb measuring device. Future anemia testing should maintain consistency in these analytical variables. | hematology
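The comparison described above rests on paired differences between draw sites and on the coefficient of variation (CV) per device. A minimal sketch of that arithmetic, using invented Hb values rather than the study's measurements:

```python
# Minimal sketch (not the study's analysis code): paired capillary-venous Hb
# differences and per-device coefficient of variation. Values in g/dL are invented.
import numpy as np

capillary = np.array([13.1, 14.2, 12.8, 15.0, 13.6])
venous    = np.array([12.8, 13.7, 12.5, 14.6, 13.3])

paired_diff = capillary - venous
print("Mean capillary - venous difference (g/dL):", paired_diff.mean())
print("Relative difference (%):", 100 * paired_diff.mean() / venous.mean())

def cv_percent(values):
    """Coefficient of variation, expressed as a percentage."""
    return 100 * values.std(ddof=1) / values.mean()

print("CV capillary (%):", cv_percent(capillary))
print("CV venous (%):", cv_percent(venous))
```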
10.1101/2022.04.12.22273807 | Change in methicillin-resistant Staphylococcus aureus testing in the Intensive Care Unit as an antimicrobial stewardship initiative | BackgroundMethicillin-resistant Staphylococcus aureus (MRSA) associated infections are a cause of morbidity/mortality in the Intensive Care Unit (ICU). Vancomycin is an option for treatment but is not without its own risks.
PurposeTo institute a testing change to decrease time between ordering of MRSA tests and availability of results in patients admitted to the adult ICU.
ProceduresAn MRSA testing change was implemented at two adult (i.e., tertiary and community) ICUs located in a U.S. Midwestern health system. The change was implemented in 2018 and included the switch from culture to polymerase chain reaction (PCR) in ICU-admitted patients. Study data were collected from 2016-2020, and a Bayesian quantile regression model was fit to examine the median-level change in time to results and to calculate a counterfactual estimate.
Main FindingsDuring the 58-month period, 71% of 19,975 patients seen at the two ICUs received MRSA testing. In the pre-change period, 91% and 99% of patients at the tertiary and community hospitals received testing via culture, respectively. Culture was used 1% and [~]0% of the time at the hospitals in the post-change period. The counterfactual analysis estimated 36 (95% CrI: 35, 37) and 32 (95% CrI: 31, 33) fewer hours until results were available at the tertiary and community hospital, respectively.
ConclusionsThe study revealed that MRSA results were available in less time at both facilities after the testing change. This information can aid antimicrobial stewardship by potentially delaying initiation and/or enabling quicker de-escalation of therapy once results are known. | infectious diseases
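The abstract describes estimating a median-level change in time-to-result with a Bayesian quantile regression. The sketch below uses a frequentist median regression from statsmodels as an illustrative stand-in (not the authors' model), on simulated data:

```python
# Illustrative stand-in only: a frequentist median (q=0.5) regression showing how a
# median-level change in time-to-result after a testing change could be estimated.
# Data are simulated, not from the study.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
post_change = rng.integers(0, 2, size=n)          # 0 = culture era, 1 = PCR era
hours = np.where(post_change == 1,
                 rng.gamma(2.0, 6.0, size=n),     # faster results after the change
                 rng.gamma(2.0, 24.0, size=n))    # slower results before the change

df = pd.DataFrame({"hours": hours, "post_change": post_change})
X = sm.add_constant(df["post_change"])
median_fit = sm.QuantReg(df["hours"], X).fit(q=0.5)

# Coefficient on post_change ~ change in the median hours until results are available
print(median_fit.params)
```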
10.1101/2022.04.12.22273802 | Different HMGCR-inhibiting statins vary in their association with increased survival in patients with COVID-19 | BackgroundIn response to the challenge to rapidly identify treatment options for COVID-19, several studies reported that statins, as a drug class, reduce mortality in these patients. Here we explored the possibility that different statins might differ in their ability to exert protective effects based on computational predictions.
MethodsA Bayesian network tool was used to predict drugs that shift the host transcriptomic response to SARS-CoV-2 infection towards a healthy state. Drugs were predicted using 14 RNA-sequencing datasets from 72 autopsy tissues and 465 COVID-19 patient samples or from cultured human cells and organoids infected with SARS-CoV-2, with a total of 2,436 drugs investigated. Top drug predictions included statins, which were tested in Vero E6 cells infected with SARS-CoV-2 and human endothelial cells infected with a related OC43 coronavirus. A database containing over 4,000 COVID-19 patients on statins was also analyzed to determine mortality risk in patients prescribed specific statins versus untreated matched controls.
FindingsSimvastatin was among the most highly predicted compounds (14/14 datasets) and five other statins were predicted to be active in > 50% of analyses. In vitro testing of SARS-CoV-2 infected cells revealed simvastatin to be a potent inhibitor whereas most other statins were less effective. Simvastatin also inhibited OC43 infection and reduced cytokine production in endothelial cells. Analysis of the clinical database revealed that reduced mortality risk was only observed in COVID-19 patients prescribed a subset of statins, including simvastatin and atorvastatin.
InterpretationDifferent statins may differ in their ability to sustain the lives of COVID-19 patients despite having a shared target and lipid-modifying mechanism of action. These findings highlight the value of target-agnostic drug prediction coupled with patient databases to identify and validate non-obvious mechanisms and drug repurposing opportunities.
FundingDARPA, Wyss Institute, Hess Research Fund, UCSF Program for Breakthrough Biomedical Research, and NIH | infectious diseases |
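The abstract's drug predictions come from a Bayesian network tool applied to transcriptomic data. As a loose, hypothetical illustration of the general idea of ranking drugs by how strongly they reverse a disease expression signature (explicitly not the authors' method), one could compute a simple correlation-based reversal score:

```python
# Hypothetical illustration, NOT the authors' Bayesian network tool: rank drugs by a
# "signature reversal" score, i.e., how strongly a drug's expression signature
# anti-correlates with the infection signature. Gene names and values are invented.
import numpy as np
from scipy.stats import spearmanr

genes = ["gene_1", "gene_2", "gene_3", "gene_4", "gene_5"]
disease_signature = np.array([2.1, 1.8, 2.5, -0.4, 0.9])    # infected vs. healthy log2FC
drug_signatures = {
    "drug_A": np.array([-1.9, -1.5, -2.2, 0.3, -0.7]),      # reverses the signature
    "drug_B": np.array([0.4, 0.2, 0.5, -0.1, 0.3]),         # roughly neutral
}

for drug, sig in drug_signatures.items():
    rho, _ = spearmanr(disease_signature, sig)
    # More negative correlation = stronger predicted shift toward the healthy state
    print(drug, "reversal score (negative Spearman rho):", round(-rho, 2))
```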
10.1101/2022.04.13.22273821 | Slight increase in fomite route transmission risk of SARS-CoV-2 Omicron variant compared with the ancestral strain in households | The Omicron SARS-CoV-2 variant has become the dominant lineage worldwide, and experimental studies have shown that the SARS-CoV-2 Omicron variant is more stable on various environmental surfaces than the ancestral strain. However, how these changes in surface stability would influence the role of the fomite route in SARS-CoV-2 transmission is still unknown. In this study, we modeled Omicron and ancestral strain SARS-CoV-2 transmission within a household over a 1-day period via multiple pathways, i.e., the airborne, droplet and contact routes. We assumed a household of 2 adults and 1 child, with one of the adults infected with SARS-CoV-2, and considered two scenarios: pre-/asymptomatic infection, in which SARS-CoV-2 was emitted by breathing and talking, and symptomatic infection, in which SARS-CoV-2 was emitted by breathing, talking, and coughing. In pre-/asymptomatic infection, all three routes played a role: the contact route contributed most (37%-45%), followed by the airborne route (34%-38%) and the droplet route (21%-28%). In symptomatic infection, the droplet route was the dominant pathway (48%-71%), followed by the contact route (25%-42%), while the airborne route played a negligible role (<10%). Within the contact route, indirect contact (fomite) dominated, contributing more than 97%. Compared with the ancestral strain, the contribution of the contact route to Omicron variant transmission increased, but only slightly, from 25%-41% to 30%-45%. | infectious diseases
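The route contributions reported above are percentages of total transmission risk across pathways. A minimal sketch of that bookkeeping, with placeholder per-route risk values rather than model output:

```python
# Minimal sketch, not the authors' multi-pathway model: given per-route infection risk
# estimates over the simulated day, each route's contribution is its share of the total.
# The numbers below are placeholders, not model output.
route_risk = {
    "contact (mostly fomite)": 0.40,
    "airborne": 0.36,
    "droplet": 0.24,
}

total = sum(route_risk.values())
for route, risk in route_risk.items():
    print(f"{route}: {100 * risk / total:.0f}% of total transmission risk")
```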
10.1101/2022.04.12.22273792 | Health behaviours the month prior to COVID-19 infection and the development of self-reported long COVID and specific long COVID symptoms: A longitudinal analysis of 1,811 UK adults | BackgroundDemographic and infection-related characteristics have been identified as risk factors for long COVID, but research on the influence of health behaviours (e.g., exercise, smoking) immediately preceding the index infection is lacking.
Methods1,811 UK adults from the UCL COVID-19 Social Study who had previously been infected with COVID-19 were analysed. Health behaviours in the month before infection were weekly exercise frequency, days of fresh air per week, sleep quality, smoking, consuming more than the recommended number of alcoholic drinks per week (>14), and the number of mental health care behaviours (e.g., online mental health programme). Logistic regressions controlling for covariates (e.g., COVID-19 infection severity and pre-existing health conditions) examined the impact of health behaviours on long COVID and three long COVID symptoms (difficulty with mobility, cognition, and self-care).
ResultsIn the month before infection with COVID-19, poor quality sleep increased the odds of long COVID (odds ratio [OR]: 3.53; 95% confidence interval [CI]: 2.01 to 6.21), as did average quality sleep (OR: 2.44; 95% CI: 1.44 to 4.12). Having smoked (OR: 8.39; 95% CI: 1.86 to 37.91) increased, and meeting recommended weekly physical activity guidelines (3+ hours) (OR: 0.05; 95% CI: 0.01 to 0.39) reduced, the likelihood of difficulty with self-care (e.g., washing all over or dressing) amongst those with long COVID.
ConclusionResults point to the importance of sleep quality for long COVID, potentially helping to explain previously demonstrated links between stress and long COVID. Results also suggest that exercise and smoking may be modifiable risk factors for preventing the development of difficulty with self-care.
FundingThe Nuffield Foundation [WEL/FR-000022583], the MARCH Mental Health Network funded by the Cross-Disciplinary Mental Health Network Plus initiative supported by UK Research and Innovation [ES/S002588/1], and the Wellcome Trust [221400/Z/20/Z and 205407/Z/16/Z].
What is already known on the topicLong COVID is rapidly becoming a public health concern. Although existing evidence to date has identified health characteristics such as obesity as risk factors, hardly any research on modifiable risk factors such as health behaviours has been conducted.
What this study addsThis study addresses the dearth of evidence on modifiable risk factors occurring before COVID-19 infection. Findings suggest a role of poor sleep quality in the development of long COVID, and point to meeting physical activity guidelines (3+ hours per week) and not smoking as modifiable risk factors for self-care difficulties amongst those with long COVID. | epidemiology
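A hedged sketch of the type of logistic regression analysis described above, producing odds ratios with confidence intervals; the data are simulated and the variable names are placeholders, not the UCL COVID-19 Social Study variables:

```python
# Illustrative sketch, assuming a binary long COVID outcome and a categorical sleep
# quality exposure; data are simulated for demonstration only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 1811
df = pd.DataFrame({
    "sleep_quality": rng.choice(["good", "average", "poor"], size=n),
    "age": rng.integers(18, 80, size=n),
})
# Simulate higher long COVID odds with worse sleep, just for demonstration
base = {"good": 0.15, "average": 0.25, "poor": 0.35}
df["long_covid"] = rng.binomial(1, df["sleep_quality"].map(base).to_numpy())

model = smf.logit("long_covid ~ C(sleep_quality, Treatment('good')) + age", data=df).fit()
odds_ratios = np.exp(model.params)
conf_int = np.exp(model.conf_int())
print(pd.concat([odds_ratios.rename("OR"), conf_int], axis=1))
```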
10.1101/2022.04.12.22273761 | Multiplex RT-qPCR assay (N200) to detect and estimate prevalence of multiple SARS-CoV-2 Variants of Concern in wastewater | Wastewater-based surveillance (WBS) has become an effective tool around the globe for indirect monitoring of COVID-19 in communities. Quantities of viral fragments of SARS-CoV-2 in wastewater are related to numbers of clinical cases of COVID-19 reported within the corresponding sewershed. Variants of Concern (VOCs) have been detected in wastewater by use of reverse transcription quantitative polymerase chain reaction (RT-qPCR) or sequencing. A multiplex RT-qPCR assay to detect and estimate the prevalence of multiple VOCs, including Omicron/Alpha, Beta, Gamma, and Delta, in wastewater RNA extracts was developed and validated. The probe-based multiplex assay, named "N200", focuses on amino acids 199-202, a region of the N gene that contains several mutations associated with variants of SARS-CoV-2 within a single amplicon. Each of the probes in the N200 assay is specific to the targeted mutations and worked equally well in single- and multi-plex modes. To estimate the prevalence of each VOC, the abundance of the targeted mutation was compared with a non-mutated region within the same amplified region. The N200 assay was applied to monitor frequencies of VOCs in wastewater extracts from six sewersheds in Ontario, Canada collected between December 1, 2021, and January 4, 2022. Using the N200 assay, the replacement of the Delta variant and the introduction and rapid dominance of the Omicron variant were monitored in near real-time, as they occurred nearly simultaneously at all six locations. The N200 assay is robust and efficient for wastewater surveillance and can be adopted into VOC monitoring programs or replace more laborious assays currently being used to monitor SARS-CoV-2 and its VOCs. | epidemiology
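The prevalence estimation idea in the abstract (comparing the abundance of a variant-defining mutation with a non-mutated region of the same amplicon) reduces to a simple ratio. A minimal sketch with invented copy-number values:

```python
# Minimal sketch of the prevalence idea: variant prevalence is the share of copies
# carrying a variant-defining mutation relative to the non-mutated (universal) target.
# Copy-number values are invented; the real assay derives them from RT-qPCR signals.
mutation_copies = {
    "variant-defining mutation (e.g., Delta-associated)": 1.2e4,         # copies/mL
    "variant-defining mutation (e.g., Omicron/Alpha-associated)": 3.6e4, # copies/mL
}
universal_copies = 5.0e4   # copies/mL from the non-mutated region of the same amplicon

for variant, copies in mutation_copies.items():
    prevalence = 100 * copies / universal_copies
    print(f"Estimated prevalence for {variant}: {prevalence:.0f}%")
```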
10.1101/2022.04.13.22273825 | Effectiveness of COVID-19 vaccines against hospitalization and death in Canada: A multiprovincial test-negative design study | BackgroundA major goal of COVID-19 vaccination is to prevent severe outcomes (hospitalizations and deaths). We estimated the effectiveness of mRNA and ChAdOx1 COVID-19 vaccines against severe outcomes in four Canadian provinces between December 2020 and September 2021.
MethodsWe conducted this multiprovincial retrospective test-negative study among community-dwelling adults aged [≥]18 years in Ontario, Quebec, British Columbia, and Manitoba using linked provincial databases and a common study protocol. Multivariable logistic regression was used to estimate province-specific vaccine effectiveness against COVID-19 hospitalization and/or death. Estimates were pooled using random effects models.
ResultsWe included 2,508,296 tested subjects, with 31,776 COVID-19 hospitalizations and 5,842 deaths. Vaccine effectiveness was 83% after a first dose, and 98% after a second dose, against both hospitalization and death (separately). Against severe outcomes (hospitalization or death), effectiveness was 87% (95%CI: 71%-94%) [≥]84 days after a first dose of mRNA vaccine, increasing to 98% (95%CI: 96%-99%) [≥]112 days after a second dose. Vaccine effectiveness against severe outcomes for ChAdOx1 was 88% (95%CI: 75%-94%) [≥]56 days after a first dose, increasing to 97% (95%CI: 91%-99%) [≥]56 days after a second dose. Lower one-dose effectiveness was observed for adults aged [≥]80 years and those with comorbidities, but effectiveness became comparable after a second dose. Two doses of vaccines provided very high protection for both homologous and heterologous schedules, and against Alpha, Gamma, and Delta variants.
ConclusionsTwo doses of mRNA or ChAdOx1 vaccines provide excellent protection against severe outcomes of hospitalization and death. | public and global health |
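A rough sketch of the test-negative design logic, in which vaccine effectiveness is derived as (1 - adjusted odds ratio) x 100 from a logistic regression; the data are simulated and the covariates are greatly simplified relative to the study protocol:

```python
# Sketch of the test-negative design logic on simulated data: compare vaccination odds
# between severe-outcome cases and test-negative controls, then VE = (1 - OR) * 100.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 20000
df = pd.DataFrame({
    "vaccinated": rng.binomial(1, 0.6, size=n),
    "age": rng.integers(18, 95, size=n),
})
# Simulate a strong protective effect for illustration only
p_case = np.where(df["vaccinated"] == 1, 0.01, 0.08)
df["severe_case"] = rng.binomial(1, p_case)

fit = smf.logit("severe_case ~ vaccinated + age", data=df).fit(disp=False)
odds_ratio = np.exp(fit.params["vaccinated"])
print(f"Adjusted OR: {odds_ratio:.2f}  ->  VE = {100 * (1 - odds_ratio):.0f}%")
```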
10.1101/2022.04.13.22273823 | A Systematic Approach to Identify Neuroprotective Interventions for Motor Neuron Disease | BackgroundMotor neuron disease (MND) is an incurable progressive neurodegenerative disease with limited treatment options. There is a pressing need for innovation in identifying therapies to take to clinical trial.
ObjectivesHere we detail a systematic, structured, and unbiased evidence-based approach to guide selection of drugs for clinical evaluation in the Motor Neuron Disease - Systematic Multi-arm Adaptive Randomised Trial (MND-SMART, clinicaltrials.gov registration number: NCT04302870), an adaptive platform trial.
MethodsWe conducted a two-stage systematic review and meta-analysis to identify potential neuroprotective interventions. In stage one, we identified drugs from the clinical literature tested in at least one study in MND or in two or more cognate diseases with potential shared pivotal pathways (Alzheimer's disease, Huntington's disease, Parkinson's disease, or multiple sclerosis). We scored and ranked the 66 drugs thus identified using a predefined framework evaluating safety, efficacy, study size and quality of studies. In stage two, we conducted a systematic review of the MND preclinical literature describing efficacy of these drugs in animal models, multicellular eukaryotic models and human induced pluripotent stem cell studies; 17 of these drugs were reported to improve survival in at least one preclinical study. An expert panel then shortlisted and ranked 22 drugs considering stage one and stage two findings, mechanistic plausibility, safety and tolerability, findings from previous clinical trials in MND, and feasibility for use in clinical trials.
ResultsBased on this process, the panel selected memantine and trazodone for testing in MND-SMART.
DiscussionFor future drug selection, we will incorporate automation tools, text-mining and machine learning techniques to the systematic reviews and consider data generated from other domains, including high-throughput phenotypic screening of human induced pluripotent stem cells.
STRENGTHS AND LIMITATIONS OF THIS STUDYO_LIWe described a systematic, evidence-based approach towards drug repurposing in motor neuron disease (MND), specifically for Motor Neuron Disease - Systematic Multi-arm Adaptive Randomised Trial (MND-SMART), a phase III multi-arm multi-stage clinical trial in MND.
C_LIO_LISystematic reviews of clinical studies in neurodegenerative diseases and MND preclinical studies provided a robust evidence base to inform expert panel decisions on drug selection for clinical trials.
C_LIO_LIProviding a contemporary evidence base using traditional systematic reviews is challenging given their time-consuming and labour-intensive nature.
C_LIO_LIIncorporation of machine learning and automation tools for systematic reviews, and data from experimental drug screening can be helpful for future drug selection.
C_LI | neurology |
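A hypothetical sketch of how a predefined scoring framework might rank candidate drugs by weighted criteria; the criteria, weights and scores below are invented for illustration and are not the MND-SMART panel's actual framework:

```python
# Hypothetical weighted-scoring sketch for ranking candidate drugs.
# Criteria names, weights, and scores are invented placeholders.
drugs = {
    "drug_A": {"safety": 4, "efficacy": 3, "study_size": 2, "study_quality": 3},
    "drug_B": {"safety": 3, "efficacy": 4, "study_size": 3, "study_quality": 2},
    "drug_C": {"safety": 2, "efficacy": 2, "study_size": 4, "study_quality": 4},
}
weights = {"safety": 0.3, "efficacy": 0.4, "study_size": 0.1, "study_quality": 0.2}

def weighted_score(criteria_scores):
    """Weighted sum of per-criterion scores (higher is better)."""
    return sum(weights[c] * s for c, s in criteria_scores.items())

ranked = sorted(drugs.items(), key=lambda kv: weighted_score(kv[1]), reverse=True)
for name, scores in ranked:
    print(name, round(weighted_score(scores), 2))
```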
10.1101/2022.04.07.22273556 | Non-intuitive trends of fetal fraction development related to gestational age and fetal gender, and their practical implications for non-invasive prenatal testing | ObjectiveDiscovery of fetal cell-free DNA fragments in maternal blood revolutionized prenatal diagnostics. Although non-invasive prenatal testing (NIPT) is already a mature screening test with high specificity and sensitivity, accurate estimation of the proportion of fetal fragments, called the fetal fraction, is crucial to avoid false-negative results.
MethodsWe collected 6999 samples from women undergoing NIPT testing with a single male fetus to demonstrate the influence of maternal and fetal characteristics on the fetal fraction.
ResultsWe show several fetal fraction discrepancies that contradict the conventionally presented view. First, the fetal fraction does not rise consistently with fetal maturity, owing to a drop at 15 weeks of gestation. Second, samples from male fetuses have a lower fetal fraction than those from female fetuses, arguably due to the smaller gonosomal chromosomes. Finally, we discuss possible reasons why this inconsistency exists and outline why these differences have not previously been identified and published.
ConclusionWe demonstrate two non-intuitive trends that allow better comprehension of fetal fraction development and more precise selection of patients with sufficient fetal fraction for accurate testing.
Bulleted statementsO_ST_ABSWhat is already known about this topic?C_ST_ABSO_LINon-invasive prenatal testing has become a well-known mature screening test, and the fetal fraction is studied in detail by research teams worldwide.
C_LI
What does this study add?O_LIHere we demonstrate two non-intuitive trends that improve comprehension of fetal fraction development and can further increase the sensitivity of routine testing through proper timing of blood sampling according to gestational age and fetal gender.
C_LI | obstetrics and gynecology |
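A minimal sketch of how per-sample fetal fraction could be summarized by gestational week and fetal sex to expose the trends described above; the data are simulated and the column names are assumptions:

```python
# Minimal sketch: summarizing fetal fraction by gestational week and fetal sex.
# Data are simulated; column names are placeholders, not the study's variables.
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
n = 1000
df = pd.DataFrame({
    "gestational_week": rng.integers(11, 21, size=n),
    "fetal_sex": rng.choice(["male", "female"], size=n),
    "fetal_fraction": rng.normal(0.10, 0.03, size=n).clip(0.01, 0.30),
})

# Mean fetal fraction per gestational week (would reveal any non-monotonic trend)
print(df.groupby("gestational_week")["fetal_fraction"].mean())

# Mean fetal fraction by fetal sex (would reveal a male/female difference)
print(df.groupby("fetal_sex")["fetal_fraction"].mean())
```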
10.1101/2022.04.13.22273812 | Associations of genetic and infectious risk factors with coronary heart disease | Background and PurposeCoronary heart disease (CHD) is one of the most pressing health problems of our time and a major cause of preventable death. CHD results from complex interactions between genetic and environmental factors. Using multiplex serological testing for persistent or frequently recurring infections and genome-wide analysis in a prospective population study, we delineate the respective and combined influences of genetic variation, infections, and low-grade inflammation on the risk of incident CHD.
Participants and MethodsStudy participants are enrolled in the CoLaus|PsyCoLaus study, a longitudinal, population-based cohort with baseline assessments from 2003 through 2008 and follow-up visits every five years. We analyzed a subgroup of 3459 individuals with available genome-wide genotyping data and immunoglobulin G levels for 22 persistent or frequently recurring pathogens. All reported CHD events were evaluated by a panel of specialists. We identified independent associations with incident CHD using univariable and multivariable stepwise Cox proportional hazards regression analyses.
ResultsOf the 3459 study participants, 210 (6.07%) had at least one CHD event during the 12 years of follow-up. Multivariable stepwise Cox regression analysis, adjusted for known cardiovascular risk factors, socioeconomic status and statin intake, revealed that high polygenic risk (hazard ratio (HR) 1.31, 95% CI 1.10-1.56, P = 2.64e-03) and infection with Fusobacterium nucleatum (HR 1.63, 95% CI 1.08-2.45, P = 1.99e-02) were independently associated with incident CHD.
ConclusionIn a prospective, population-based cohort, high polygenic risk and infection with Fusobacterium nucleatum have a small, yet independent impact on CHD risk. | genetic and genomic medicine |
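An illustrative sketch of a Cox proportional hazards analysis in the spirit of the one described above, using the lifelines package on simulated data with simplified covariates (not the CoLaus|PsyCoLaus variables):

```python
# Illustrative sketch only: Cox proportional hazards model relating incident CHD to a
# polygenic risk score and a binary serology result. Data are simulated.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(4)
n = 3459
df = pd.DataFrame({
    "followup_years": rng.uniform(1, 12, size=n),
    "chd_event": rng.binomial(1, 0.06, size=n),
    "polygenic_risk_score": rng.normal(0, 1, size=n),
    "f_nucleatum_seropositive": rng.binomial(1, 0.3, size=n),
    "age": rng.integers(35, 75, size=n),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_years", event_col="chd_event")
cph.print_summary()   # hazard ratios (exp(coef)) with confidence intervals
```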
10.1101/2022.04.11.22273755 | Localization of Abnormal Brain Regions in Parkinsonian Disorders: An ALE Meta-Analysis | Parkinsonism is a feature of several neurodegenerative disorders, including Parkinson's disease (PD), progressive supranuclear palsy (PSP), corticobasal degeneration syndrome (CBS) and multiple system atrophy (MSA). Neuroimaging studies have yielded insights into parkinsonism; however, it remains unclear whether there is a common neural substrate amongst these disorders. The aim of the present meta-analysis was to identify consistent brain alterations in parkinsonian disorders (PD, PSP, CBS, MSA), both individually and combined, to elucidate the shared substrate of parkinsonism. 33,505 studies were systematically screened following searches of the MEDLINE Complete and Embase databases. A series of whole-brain activation likelihood estimation meta-analyses were performed on 126 neuroimaging studies (64 PD; 25 PSP; 18 CBS; 19 MSA) utilizing anatomical MRI, perfusion or metabolism positron emission tomography and single photon emission computed tomography. Abnormality of the caudate, thalamus, middle frontal and temporal gyri was common to all parkinsonian disorders. Localizations of commonly affected brain regions in individual disorders aligned with current diagnostic imaging markers, localizing the midbrain in PSP, the putamen in MSA-parkinsonian variant and the brainstem in MSA-cerebellar variant. Regions of the basal ganglia and precuneus were most commonly affected in PD, while CBS was characterized by caudate abnormality. To our knowledge, this is the largest meta-analysis of neuroimaging studies in parkinsonian disorders. Findings support the notion that parkinsonism may share a common neural substrate, independent of the underlying disease process, while also highlighting characteristic patterns of brain abnormality in each disorder. | neurology
10.1101/2022.04.12.22273806 | Low Intensity Vibration Protects the Weight Bearing Skeleton and Suppresses Fracture Incidence in Boys with Duchenne Muscular Dystrophy | Introduction/AimsThe ability of low intensity vibration (LIV) to combat skeletal decline in Duchenne Muscular Dystrophy (DMD) was evaluated in a randomized controlled trial.
MethodsTwenty DMD boys were enrolled, all ambulant and treated with glucocorticoids (mean age 7.6, height-adjusted Z-scores (HAZ) of hip BMD -2.3). Ten DMD boys were assigned to stand for 10min/d on an Active LIV platform (0.4g @ 30Hz), while 10 stood on a Placebo device. Baseline and 14-month BMC and BMD of spine, hip and total body were measured with DXA, and trabecular bone density (TBD) of tibia with QCT.
ResultsAll children tolerated the LIV intervention well, with daily compliance averaging 78%. At 14 months, TBD in the proximal and distal tibia remained unchanged in Placebo (-0.7% & -0.8%), while rising 4.1% and 4.5% in LIV. HAZ for hip BMD and BMC in Placebo declined 22% and 13% respectively, contrasting with no change from baseline (0.9% and 1.4%) in LIV. Fat mass in the leg increased 33% in Placebo, contrasting with 20% in LIV subjects. Across the 14-month study, there were four incident fractures in three placebo patients (30%), with no new fractures identified in LIV subjects.
ConclusionsThese data suggest that non-invasive LIV can help protect the skeleton of DMD children against the disease progression, the consequences of diminished load bearing, and the complications of chronic steroid use. | neurology |
10.1101/2022.04.12.22273810 | The effect of population migration on the diffusion of cholera outbreaks in metapopulations | In this study, an improved Susceptible-Infected-Recovered (SIR) epidemic diffusion model for cholera is extended by including migration of susceptible individuals. The model is applied to a metapopulation consisting of two otherwise isolated cities between which only susceptible individuals can migrate. The disease-free equilibrium, the endemic equilibrium points, and the basic reproductive number with unequal migration rates are analyzed for this metapopulation. First, the study shows that the basic reproductive number depends on the migration rates between the cities. Second, it shows that when the epidemic SIR system is stable, the number of infected cases in a cholera outbreak can reach zero in one city while remaining positive in the other city. Finally, three scenarios that depend on the population sizes and the migration rates of susceptible people between the cities are discussed and visualized, demonstrating how important the migration rates are to the diffusion of a cholera outbreak.
Mathematics Subject ClassificationPrimary: 92B05; Secondary: 92D40 | infectious diseases |
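A sketch of a two-city SIR metapopulation in which only susceptible individuals migrate, following the structure described in the abstract; the parameter values and the exact form of the migration terms are assumptions for illustration, not the paper's formulation:

```python
# Sketch of a two-city SIR metapopulation with migration of susceptibles only.
# Parameters and migration terms are illustrative assumptions, not the paper's model.
import numpy as np
from scipy.integrate import solve_ivp

beta, gamma = 0.4, 0.1     # transmission and recovery rates (per day)
m12, m21 = 0.02, 0.05      # migration rates of susceptibles: city1->city2, city2->city1

def two_city_sir(t, y):
    S1, I1, R1, S2, I2, R2 = y
    N1, N2 = S1 + I1 + R1, S2 + I2 + R2
    dS1 = -beta * S1 * I1 / N1 - m12 * S1 + m21 * S2
    dI1 = beta * S1 * I1 / N1 - gamma * I1
    dR1 = gamma * I1
    dS2 = -beta * S2 * I2 / N2 - m21 * S2 + m12 * S1
    dI2 = beta * S2 * I2 / N2 - gamma * I2
    dR2 = gamma * I2
    return [dS1, dI1, dR1, dS2, dI2, dR2]

# Outbreak seeded only in city 1; with susceptible-only migration, city 2's infections
# stay at zero while city 1's epidemic is fed by in-migrating susceptibles.
y0 = [9990, 10, 0, 20000, 0, 0]
sol = solve_ivp(two_city_sir, (0, 365), y0, t_eval=np.linspace(0, 365, 366))
print("Peak infections, city 1:", sol.y[1].max())
print("Peak infections, city 2:", sol.y[4].max())
```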
10.1101/2022.04.13.22273796 | Genetic and potential antigenic evolution of influenza A(H1N1)pdm09 viruses circulating in Kenya during 2009-2018 influenza seasons | BackgroundInfluenza viruses undergo rapid evolutionary changes, which requires continuous surveillance to monitor for genetic and potential antigenic changes in circulating viruses that can guide control and prevention decision making.
MethodsWe sequenced and phylogenetically analyzed A(H1N1)pdm09 virus genome sequences obtained from specimens collected from hospitalized patients of all ages with or without pneumonia between 2009 and 2018 from seven sentinel surveillance sites across Kenya. We compared these sequences with recommended vaccine strains during the study period to infer genetic and potential antigenic changes in circulating viruses and determinants of clinical outcome.
ResultsWe generated and analyzed a total of 383 A(H1N1)pdm09 virus genome sequences. Phylogenetic analyses revealed that multiple genetic groups (clades, subclades, and subgroups) of A(H1N1)pdm09 virus circulated in Kenya over the study period; these evolved away from their vaccine strain, forming clades 7 and 6, subclades 6C, 6B, and 6B.1, and subgroups 6B.1A and 6B.1A1. Several amino acid substitutions among circulating viruses were associated with continued evolution, especially in the antigenic epitopes and receptor binding sites (RBS). Disease severity decreased with increasing age among children aged <5 years.
ConclusionOur study highlights the utility of genomic surveillance to monitor the evolutionary changes of influenza viruses. Routine influenza surveillance with broad geographic representation and whole genome sequencing capacity to inform on the severity of circulating strains could improve selection of influenza strains for inclusion in vaccines. | infectious diseases |
10.1101/2022.04.04.22271864 | Real-time RT-PCR for Venezuelan equine encephalitis complex, Madariaga and Eastern equine encephalitis viruses: application in clinical diagnostic and mosquito surveillance | BackgroundVenezuelan equine encephalitis (VEE) complex viruses, eastern equine encephalitis virus (EEEV), and Madariaga virus (MADV) are associated with severe neurological disease in human and equine hosts. We aimed to overcome the lack of available molecular diagnostics for these pathogens by designing and clinically evaluating real-time reverse transcription PCRs (rRT-PCRs) for 1) VEE complex and 2) MADV/EEEV.
Methodology/Principal FindingsPrimers and probes were designed from alignments of all publicly available complete genome sequences for VEE complex viruses, EEEV and MADV. The linear range for the VEE complex and MADV/EEEV rRT-PCRs extended from 2.0 to 8.0 log10 copies/{micro}L with exclusive detection of the respective viruses. Fifteen serum samples from febrile patients collected during 2015 and 2017 outbreaks in Panama were tested to evaluate the new assays. Ten samples (66.7%) were positive for VEEV, and in one of these ten samples, rRT-PCR detected both VEEV and MADV. The remaining five samples were negative for both MADV and VEEV. Three acute samples ([≤] 3 days since onset) with coincidental anti-VEEV IgM and IgG antibodies were also found positive for VEEV viral RNA. Of 150 mosquito pools, 3 tested positive for VEEV by rRT-PCR. Two of these yielded VEEV isolates.
ConclusionsThe VEE complex and MADV/EEEV rRT-PCRs provide accurate detection while yielding significant benefits over currently available molecular methods. Outcomes from the clinical evaluation provide further evidence for VEE complex and MADV co-infections and suggest that re-infection with VEE complex viruses is possible.
Author SummaryThe alphavirus encephalitis viruses include species of the Venezuelan equine encephalitis (VEE) complex, Madariaga virus (MADV), and Eastern equine encephalitis virus (EEEV). They persist in sylvatic-enzootic cycles throughout the Americas and are transmitted to humans by Culex spp. mosquitoes. Human and equine outbreaks are reported sporadically in endemic regions. Detection of VEE complex, MADV and EEEV infections in the acute phase is complicated by their non-specific clinical manifestations and the lack of widely available diagnostic tools. Current molecular tests lack the optimal performance characteristics necessary for routine diagnostic testing. Here we design two real-time RT-PCRs for the VEE complex and MADV/EEEV, which were then evaluated by testing a set of clinical and mosquito samples collected during two outbreaks of VEEV and MADV in Panama. The VEE complex and MADV/EEEV rRT-PCRs provide accurate detection while yielding significant benefits over currently available molecular methods. | infectious diseases
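The reported linear range (2.0-8.0 log10 copies/{micro}L) implies a standard curve relating Cq to input copies. A hedged sketch of how such a curve and the implied amplification efficiency could be checked, with invented Cq values:

```python
# Hedged sketch of an rRT-PCR standard-curve check over the reported linear range:
# fit Cq against log10 copies and derive amplification efficiency from the slope.
# The Cq values below are invented, not assay data.
import numpy as np
from scipy.stats import linregress

log10_copies = np.array([2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0])
cq = np.array([34.8, 31.5, 28.1, 24.7, 21.4, 18.0, 14.7])   # hypothetical measurements

fit = linregress(log10_copies, cq)
efficiency = 10 ** (-1 / fit.slope) - 1     # ~1.0 (100%) for an ideal assay
print(f"slope = {fit.slope:.2f}, R^2 = {fit.rvalue**2:.3f}, "
      f"efficiency = {100 * efficiency:.0f}%")
```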