id | title | abstract | category |
---|---|---|---|
10.1101/2022.03.20.22272663 | Administration approaches of nursing assistants in hospitals: a scoping review | Objectives: Administration of nursing assistants is closely associated with patient outcomes, but current practice is challenging and needs improvement. Research evaluating the intrahospital administration of nursing assistants is limited, and no systematic review of the area is available. The aim of this article was to identify and synthesize the literature on intrahospital nursing assistant administration approaches.
Design: Scoping review.
Search strategy: We searched PubMed, Embase, APA PsycInfo, Wanfang Med, SinoMed, CINAHL, Ovid Emcare, Scopus, ProQuest, CNKI, NICE, AHRQ, CADTH, JBI EBP and Cochrane DSR for English and Chinese language articles published between January 2011 and March 2022. Publications on administration approaches, models and appraisal tools of nursing assistants in hospitals, including qualitative, quantitative, mixed-methods studies and evidence syntheses, were considered eligible.
Results: Thirty-six eligible studies of acceptable quality were included in the review. We identified one administration model, nine administration methods, fifteen educational programmes and seven appraisal tools from the included studies. The frequency effect size analysis yielded 15 main topics of focus at four levels, suggesting that previous articles mostly (33%) focused on the competency of nursing assistants, and that lectures were the most frequently used strategy (80%) in quality improvement projects. The evidence-based quality of the original studies was considerably low, indicating substantial gaps between evidence-based research and management practice.
Conclusions: A series of practical intrahospital administration approaches was revealed, and the fifteen most-studied topics were identified. This area should be probed more thoroughly, drawing on effective management theories and frameworks and employing methods of higher quality. This scoping review will help managers find more effective methods to improve the quality of care. Researchers may focus more on evidence-based nursing skills and methods in nursing assistant administration, using the 15 topics as breakthrough points.
Strengths and limitations:
- First scoping review of practical administration approaches for nursing assistants in hospitals.
- Presents the main topics and focus of related articles.
- Development of nursing assistant administration varied widely among countries.
- Most of the included studies were of moderate-to-low methodological quality, and a substantial gap exists between evidence-based research and management practice. | nursing |
10.1101/2022.03.21.22272728 | Where the truth really lies: Listening to voices from African American communities in the Southern States about COVID-19 vaccine information and communication | Background: High uptake of the COVID-19 vaccine is one of the most promising measures to control the pandemic. However, some African American (AA) communities exhibit vaccine hesitancy due to mis- or disinformation. It is important to understand the challenges in accessing reliable COVID-19 vaccine information and to develop feasible health communication interventions based on voices from AA communities.
Methods: We conducted two focus group discussions (FGDs) among 18 community leaders recruited from three counties in South Carolina on October 8 and October 29, 2021. The FGDs were conducted online via Zoom meetings. The FGD data were managed and thematically analyzed using QSR NVivo 12 software.
Results: Participants (73% female and 61% between the ages of 18 and 30) worked primarily in colleges (55.5%), churches (39%), and health agencies (5.5%). We found that challenges of accessing reliable COVID-19 vaccine information in AA communities primarily included structural barriers, information barriers, and lack of trust. Community leaders recommended recruiting trusted messengers, using homecoming events, football games, and other social events to reach target populations, and conducting health communication campaigns through open dialogue among stakeholders.
Conclusion: Health communication interventions on COVID-19 vaccine uptake should be grounded in ongoing community engagement, trust-building activities, and transparent communication about vaccine development. Tailoring health communication interventions to different groups may help reduce misinformation spread and thus promote vaccination in AA communities in the Southern States. | infectious diseases |
10.1101/2022.03.22.22272769 | Vaccine effectiveness of two and three doses of BNT162b2 and CoronaVac against COVID-19 in Hong Kong | Background: Hong Kong maintained extremely low circulation of SARS-CoV-2 until a major community epidemic of Omicron BA.2 began in January 2022. Both the mRNA BNT162b2 (BioNTech/Fosun Pharma) and inactivated CoronaVac (Sinovac) vaccines are widely available; however, coverage has remained low in older adults. Vaccine effectiveness in this predominantly infection-naive population is unknown.
Methods: We used individual-level case data on mild/moderate, severe/fatal and fatal hospitalized COVID-19 from December 31, 2021 to March 8, 2022, along with census information and coverage data for BNT162b2 and CoronaVac. We used a negative binomial model, adjusting for age and calendar day, to estimate vaccine effectiveness of one-, two- and three-dose schedules of both vaccines, and relative effectiveness by number of doses and vaccine type.
Findings: A total of 12.7 million vaccine doses were administered in Hong Kong's 7.3 million population, and we analyzed data from confirmed cases with mild/moderate (N=5,474), severe/fatal (N=5,294) and fatal (N=4,093) COVID-19. Two doses of either vaccine protected against severe disease and death, with higher effectiveness among adults ≥60 years with BNT162b2 (VE: 88.2%, 95% confidence interval, CI: 84.4%, 91.1%) compared to CoronaVac (VE: 74.1%, 95% CI: 67.8%, 79.2%). Three doses of either vaccine offered very high levels of protection against severe outcomes (VE: 98.1%, 95% CI: 97.1%, 98.8%).
Interpretation: Third doses of either BNT162b2 or CoronaVac provide substantial additional protection against severe COVID-19 and should be prioritized, particularly in older adults who received CoronaVac primary schedules. Longer follow-up is needed to assess the persistence of protection from different vaccine platforms and schedules.
Funding: COVID-19 Vaccines Evaluation Program, Chinese Center for Disease Control and Prevention | infectious diseases |
10.1101/2022.03.21.22272721 | Socioeconomic status and immune aging in older US Adults in the Health and Retirement Study | Background: Life course socioeconomic and demographic factors, including educational attainment, race and ethnicity, and childhood socioeconomic status (SES), are powerful predictors of large inequalities in aging, morbidity, and mortality. Immune aging, including the accumulation of late-differentiated, senescent-like lymphocytes and lower levels of naive lymphocytes, may play a role in the development of these age-related health inequalities. However, there has been little research investigating the association between socioeconomic status and immune aging, particularly age-related changes in lymphocyte percentages.
Methods: This study used nationally representative data from more than 9,000 US adults in the Health and Retirement Study to investigate associations between educational attainment, race and ethnicity, and childhood SES and lymphocyte percentages.
Results: Respondents with lower educational attainment, Hispanic adults, and those who had a parent with less than a high school education had lymphocyte percentages consistent with more accelerated immune aging compared with those with greater educational attainment, non-Hispanic White adults, and respondents whose parents had a high school education, respectively. Associations between education, Hispanic ethnicity, and parents' education and late-differentiated senescent-like T lymphocytes (TemRA) and B cells were largely driven by cytomegalovirus (CMV) seropositivity.
Conclusions: Results suggest that CMV is a major driver of the observed SES inequalities in immunosenescence and may therefore be an important target for interventions. Naive T lymphocytes may be particularly affected by socioeconomic position and may therefore be of particular interest to research on inequalities in health and aging. | epidemiology |
10.1101/2022.03.22.22272745 | Effect of prior infection, vaccination, and hybrid immunity against symptomatic BA.1 and BA.2 Omicron infections and severe COVID-19 in Qatar | BACKGROUND: Protection offered by five different forms of immunity, combining natural and vaccine immunity, was investigated against SARS-CoV-2 Omicron symptomatic BA.1 infection, symptomatic BA.2 infection, BA.1 hospitalization and death, and BA.2 hospitalization and death in Qatar between December 23, 2021 and February 21, 2022.
METHODS: Six national, matched, test-negative case-control studies were conducted to estimate effectiveness of BNT162b2 (Pfizer-BioNTech) vaccine, mRNA-1273 (Moderna) vaccine, natural immunity due to prior infection with pre-Omicron variants, and hybrid immunity from prior infection and vaccination.
RESULTS: Effectiveness of prior infection alone against symptomatic BA.2 infection was 46.1% (95% CI: 39.5-51.9%). Effectiveness of two-dose BNT162b2 vaccination alone was negligible at -1.1% (95% CI: -7.1-4.6%), but nearly all individuals had received their second dose several months earlier. Effectiveness of three-dose BNT162b2 vaccination alone was 52.2% (95% CI: 48.1-55.9%). Effectiveness of hybrid immunity from prior infection and two-dose BNT162b2 vaccination was 55.1% (95% CI: 50.9-58.9%). Effectiveness of hybrid immunity from prior infection and three-dose BNT162b2 vaccination was 77.3% (95% CI: 72.4-81.4%). Meanwhile, prior infection, BNT162b2 vaccination, and hybrid immunity all showed strong effectiveness (>70%) against any severe, critical, or fatal COVID-19 due to BA.2 infection. Similar levels and patterns of effectiveness were observed for BA.1 and for the mRNA-1273 vaccine.
CONCLUSIONS: There are no discernible differences in the effects of prior infection, vaccination, and hybrid immunity against BA.1 versus BA.2. Hybrid immunity resulting from prior infection and recent booster vaccination confers the strongest protection against either subvariant. Vaccination enhances the protection of those with a prior infection. | epidemiology |
10.1101/2022.03.20.22272675 | Synthesizing evidence from the earliest studies to support decision-making: to what extent could the evidence be reliable? | In evidence-based practice, new topics generally have only a few studies available for synthesis. As a result, the evidence from such meta-analyses has raised substantial concerns. We investigated the robustness of the evidence of meta-analyses based on these earliest studies. Real-world data from the Cochrane Database of Systematic Reviews (CDSR) were collected. We emulated meta-analyses of the earliest 1 to 10 studies through cumulative meta-analysis of eligible meta-analyses. The magnitude and direction of the effects in meta-analyses of the earliest few studies were compared to those of the full meta-analyses. From the CDSR, we identified 20,227 meta-analyses of binary outcomes and 7,683 meta-analyses of continuous outcomes. Under a tolerable difference of 20% in the magnitude of the effects, the convergence proportion ranged from 24.24% (earliest 1 study) to 77.45% (earliest 10 studies) for meta-analyses with binary outcomes. For meta-analyses of continuous outcomes, the convergence proportion ranged from 13.86% to 56.52%. In terms of the direction of the effects, even when only 3 studies were available at the earliest stage, the majority had the same direction as the full meta-analyses; only 19% for binary outcomes and 12% for continuous outcomes changed direction as further evidence accumulated. Synthesizing evidence from the earliest studies is feasible to support urgent decision-making, and in most cases the decisions would be reasonable. Given the potential uncertainties, it is essential to evaluate the confidence of the evidence of these meta-analyses and to update the evidence when necessary. | epidemiology |
10.1101/2022.03.21.22272606 | Assessment of a vancomycin dosing guideline and identification of predictive factors associated with dose and drug trough levels | Background: Guidelines and therapeutic drug monitoring are widely used to optimise vancomycin dosing, but their impact remains unclear.
Objectives: To assess and optimise vancomycin dosing guidelines in a UK teaching hospital group.
Methods: We conducted a retrospective study to evaluate guideline compliance and drug levels following the introduction of a new vancomycin guideline. We used multivariable regression models to investigate factors associated with dosing compliance, drug levels and acute kidney injury (AKI).
Results: 3,767 patients received intravenous vancomycin for ≥24h between 01-January-2016 and 01-June-2021. Compliance with recommended loading and initial maintenance doses increased, reaching 84% and 70% respectively; 72% of subsequent maintenance doses were correctly adjusted. Only 26% of first and 32% of subsequent levels reached the target range. Drug levels were independently higher in older patients (1.14 mg/L per 10 years older with twice-daily dosing [95%CI 1.03,1.25]), with lower eGFR (0.46 mg/L per 10 mL/min/1.73m² lower [0.40,0.52]) and higher Elixhauser scores (0.72 mg/L per unit higher [0.57,0.88]). For patients with ongoing vancomycin treatment, the conditional probability of achieving target levels was >50% at 5 days and close to 90% at 10 days. Incidence of AKI was low (5.7%), with no evidence that AKI risk increased after guideline implementation (OR=1.09 per year [0.97,1.22], p=0.14). Model estimates were used to propose updated age-, weight- and eGFR-specific guidelines.
Conclusion: Despite good compliance with vancomycin dosing guidelines, the proportion of drug levels achieving the target range remained suboptimal. Adjusting existing guidelines is required to ensure appropriate dosing and better attainment of therapeutic drug levels. | pharmacology and therapeutics |
10.1101/2022.03.21.22272706 | What works for whom with telemental health? A rapid realist review | Background: Telemental health (delivering mental health care via video calls, telephone calls or text messages) is increasingly widespread. Telemental health appears to be useful and effective in providing care to some service users in some settings, especially during an emergency restricting face-to-face contact such as the COVID-19 pandemic. However, important limitations have been reported, and telemental health implementation risks reinforcing pre-existing inequalities in service provision. If it is to be widely incorporated in routine care, a clear understanding is needed of when and for whom it is an acceptable and effective approach, and when face-to-face care is needed.
Objective: The aim of this rapid realist review was to develop theory about which telemental health approaches work, or do not work, for whom, in which contexts and through what mechanisms.
Methods: Rapid realist reviewing involves synthesising relevant evidence and stakeholder expertise to allow timely development of context-mechanism-outcome (CMO) configurations in areas where evidence is urgently needed to inform policy and practice. The CMOs encapsulate theories about what works for whom, and by what mechanisms. Sources included eligible papers from (a) two previous systematic reviews conducted by our team on telemental health, (b) an updated search using the strategy from these reviews, (c) a call for relevant evidence, including "grey literature", to the public and key experts, and (d) website searches of relevant voluntary and statutory organisations. CMOs formulated from these sources were iteratively refined, including through (a) discussion with an expert reference group including researchers with relevant lived experience and front-line clinicians and (b) consultation with experts focused on three priority groups: 1) children and young people, 2) users of inpatient and crisis care services, and 3) digitally excluded groups.
Results: A total of 108 scientific and grey literature sources were included. From our initial CMOs, we derived 30 overarching CMOs within four domains: 1) connecting effectively; 2) flexibility and personalisation; 3) safety, privacy and confidentiality; and 4) therapeutic quality and relationship. Reports and stakeholder input emphasised the importance of personal choice, privacy and safety, and therapeutic relationships in telemental health care. The review also identified particular service users likely to be disadvantaged by telemental health implementation, and a need to ensure that face-to-face care of equivalent timeliness remains available. Mechanisms underlying successful and unsuccessful application of telemental health are discussed.
Conclusions: Service user choice, privacy and safety, the ability to connect effectively, and fostering strong therapeutic relationships need to be prioritised in delivering telemental health care. Guidelines and strategies co-produced with service users and frontline staff are needed to optimise telemental health implementation in real-world settings. | psychiatry and clinical psychology |
10.1101/2022.03.21.22272729 | Development and clinimetric testing of the Unified Drug-Induced Movement Disorders Scale (UDIMS) | Objective: Several movement disorders develop secondary to the use of psychotropic drugs, for which multiple symptom rating scales are in common use. We aimed to develop the Unified Drug-Induced Movement Disorders Scale (UDIMS) to assess the severity and impact of drug-induced dyskinesia, tremor, drug-induced parkinsonism, akathisia, dystonia and myoclonus with a single instrument.
Methods: Based on a literature review, consultation and pilot work, a 12-item instrument was developed, with each item rated on a 0-4 scale. The clinimetric properties of the UDIMS were examined in 53 psychiatric patients on psychotropic medications, using established rating scales for validation. The factor structure of the scale was examined, and the movement disorder correlates of distress and disability were determined.
Results: The instrument has good inter-rater reliability. Its correspondence with three other scales - the Abnormal Involuntary Movements Scale, the Simpson-Angus Scale and the Prince Henry Hospital Akathisia Scale - was high for the relevant items. A principal components analysis yielded four factors, considered to represent tremor, parkinsonism, akathisia and dyskinesia. Overall movement-disorder-related disability was related to parkinsonism and dyskinesia, while distress was related to all four components.
Conclusions: The UDIMS is a reliable and valid scale for quantifying a range of drug-induced movement disorders (DIMDs), obviating the need for multiple rating scales. Its widespread use by clinicians and researchers, and further refinement based on this use, will help promote the detection and treatment of DIMDs, thereby reducing both distress and disability. | psychiatry and clinical psychology |
10.1101/2022.03.22.22272743 | Association between tobacco exposure, blood pressure, and arterial stiffness in South African adults and children | Introduction: Sociodemographic factors, health status and health behaviour have all been associated with arterial stiffness. We examined the association between tobacco use or exposure and pulse wave velocity (PWV, a marker of arterial stiffness) in black South African adults and children against a background of other known risk factors.
Methods: Two datasets were used: African-PREDICT (A-P; n=587 apparently healthy black adult men and women, 20-30 years) and Birth-to-Twenty-Plus (Bt20; n=95 black adult women, 28-68 years, and n=47 black children, 4-10 years). A cotinine value >10 ng/ml in urine (Bt20) or serum (A-P) was considered tobacco exposed, and carotid-femoral PWV was measured using the SphygmoCor XCEL device. Regression analyses included cotinine and other known risk factors.
Results: One third of adults (32%) and almost half of all children (45%) were tobacco exposed, with the prevalence of elevated blood pressure (BP) approximately twice as high as in their non-exposed counterparts (adults, p=0.014; children, p=0.017). Cotinine was the only variable significantly and positively associated with PWV in both adults and children in univariate analysis (p<0.05), but only MAP remained significant for adults in multivariate analysis (p=0.001).
Conclusions: In this sample, tobacco exposure was adversely associated with vascular health in adults and children. BP was higher in tobacco-exposed adults and children than in their non-exposed counterparts. These findings suggest that tobacco cessation programs for adults should include blood pressure screening and consider the impact of tobacco exposure on children's vascular health. | public and global health |
10.1101/2022.03.21.22272722 | The dramatic surge of excess mortality in the United States between 2017 and 2021 | Importance: Previous studies have shown the pre-pandemic emergence of a mortality gap between the United States and other high-income nations. This gap was estimated to account for nearly one in seven US deaths in 2017.
Objective: The number and proportion of US deaths attributable to this gap are expected to have grown during the pandemic. This study aims to quantify this growth at the end of 2021.
Design: This cross-sectional study uses publicly available 2017 to 2021 data on deaths from all causes and 2020-2021 Covid-19 death counts in the United States and five European countries. Numbers of deaths are combined with publicly available estimates of population size by sex and age group in these European countries to produce sex- and age-specific mortality rates. These rates are averaged and applied to the US population by sex and age group, and the resulting counterfactual death counts are compared with actual US death counts.
Setting: The United States and the five most populous countries in Western Europe.
Participants: All deceased individuals in these countries are included in the death counts. This study does not use individual records.
Intervention: None.
Main Outcome and Measure: Total number of US deaths and the counterfactual numbers obtained by substituting the average sex- and age-specific mortality rates of the European countries for those that prevailed in the United States.
Results: The counterfactuals reveal 711,109 to 747,783 excess deaths in the United States in 2021, amounting to 20.8% to 21.7% of all US deaths that year (3,426,249). Excess deaths increased by 320,436 to 331,477 between 2017 and 2021, 80.0% to 82.7% more than the 2017 estimate. A decomposition of the lower figure showed that population change contributed little to the 2017-2021 increase in excess deaths (33,545 additional deaths). Covid-19 mortality alone contributed 230,672 excess deaths in 2021, more than in 2020, whereas other causes of death contributed 56,219 more excess deaths in 2021 than in 2017.
Discussion: The proportion of US deaths that would be avoided if the United States could achieve the mortality of its West European counterparts surged to more than one in five during the pandemic. The large and still growing contribution of Covid-19 to excess mortality in the United States is consistent with vaccination rates plateauing at lower levels than in European countries. The increase in mortality from other causes during the pandemic further separates the United States from West European countries.
Key Points. Question: How many fewer deaths would the United States experience with the same mortality rates as comparable European countries?
Findings: Using death counts from the United States and five large European countries, this cross-sectional study estimates 711,109 to 747,783 fewer US deaths in 2021 if the United States could achieve the average sex- and age-specific mortality rates of these five countries. This is an 80.0% to 82.7% increase relative to the same figure for 2017.
Meaning: The dramatic surge in excess mortality in the United States during the pandemic demonstrates chronic societal vulnerabilities rooted in the fundamental determinants of mortality. | public and global health |
10.1101/2022.03.20.22272677 | Acceptance of COVID-19 vaccine among healthcare workers in Katsina state, Northwest Nigeria | High acceptance of COVID-19 vaccines is crucial to ending the COVID-19 pandemic. Healthcare workers (HCWs) are frontline responders in the fight against COVID-19 and were prioritized to receive the COVID-19 vaccine in Nigeria. This study assessed the acceptance of the COVID-19 vaccine among HCWs in Katsina State using an online structured questionnaire and used logistic regression analysis to identify variables predicting vaccine acceptance among HCWs.
A total of 793 HCWs were included in this study. Of these, 65.4% (n=519) were male and 36.2% (n=287) were aged 30-39 years. Eighty percent (n=638) of the HCWs had been tested for SARS-CoV-2, of whom 10.8% (n=65) tested positive. The majority of the HCWs (97.3%, n=765) believed that the COVID-19 vaccine was safe, and 90% (n=714) had received the first dose of the COVID-19 vaccine. Our findings showed that age, COVID-19 testing status, and the type of health facility (public or private) in which HCWs work were the main predictors of COVID-19 vaccine acceptance among HCWs in Katsina State. HCWs aged 30-39 years were more likely (OR: 7.06; 95% CI: 2.36, 21.07; p < 0.001) to accept the vaccine than others. Likewise, HCWs who had been tested for COVID-19 were more likely (OR: 7.64; 95% CI: 3.62, 16.16; p < 0.001) to accept the vaccine than those who had not. In addition, HCWs in public health facilities were more likely (OR: 2.91; 95% CI: 1.17, 6.11; p = 0.094) to accept the COVID-19 vaccine than their counterparts in private health facilities.
There was high acceptance of the COVID-19 vaccine among HCWs in Katsina State. More emphasis should be placed on adherence to non-pharmaceutical interventions and on the availability of vaccines for HCWs in private hospitals. | public and global health |
10.1101/2022.03.21.22272600 | Adverse events of iron and/or erythropoiesis-stimulating agent therapy in preoperatively anaemic elective surgery patients: a systematic review | Background: Iron supplementation and erythropoiesis-stimulating agent (ESA) administration represent the hallmark therapies in preoperative anaemia treatment, as reflected in a set of evidence-based treatment recommendations made during the 2018 International Consensus Conference on Patient Blood Management. However, little is known about the safety of these therapies. This systematic review investigated the occurrence of adverse events (AEs) during or after treatment with iron and/or ESAs.
Methods: Five databases (The Cochrane Library, MEDLINE, Embase, Transfusion Evidence Library, Web of Science) and two trial registries (ClinicalTrials.gov, WHO ICTRP) were searched until November 6, 2020. Randomized controlled trials (RCTs), cohort and case-control studies investigating any AE during or after iron and/or ESA administration in adult elective surgery patients with preoperative anaemia were eligible for inclusion and were judged using the Cochrane Risk of Bias tools. The GRADE approach was used to assess the overall certainty of evidence.
Results: 26 RCTs and 16 cohort studies involving a total of 6,062 patients were included, providing data on six comparisons: (1) intravenous (IV) versus oral iron; (2) IV iron versus usual care/no iron; (3) IV ferric carboxymaltose versus IV iron sucrose; (4) ESA+iron versus control (placebo and/or iron, no treatment); (5) ESA+IV iron versus ESA+oral iron; and (6) ESA+IV iron versus ESA+IV iron (different ESA dosing regimens). Most AE data concerned mortality/survival (n=24 studies), thromboembolic (n=22), infectious (n=20), cardiovascular (n=19) and gastrointestinal (n=14) AEs. Very low certainty of evidence was assigned to all but one outcome category. This uncertainty results from both the low quantity and low quality of AE data, owing to a high risk of bias caused by limitations in study design, data collection and reporting.
Conclusions: It remains unclear whether ESA and/or iron therapy is associated with AEs in preoperatively anaemic elective surgery patients. Future trial investigators should pay more attention to the systematic collection, measurement, documentation and reporting of AE data. | surgery |
10.1101/2022.03.20.22272667 | A Standardised Protocol for Neuro-endoscopic Lavage for Post-haemorrhagic Ventricular Dilatation: A Delphi Consensus Approach | Purpose: Neuro-endoscopic lavage (NEL) has shown promise as an emerging procedure for intraventricular haemorrhage (IVH) and post-haemorrhagic ventricular dilatation (PHVD). However, there is considerable variation in the indications, objectives, and surgical technique of NEL. There is currently no randomised trial evidence supporting the use of NEL in the context of PHVD. This study aims to form a consensus on technical variations in the indications and procedural steps of NEL.
Methods: A mixed-methods modified Delphi consensus process was conducted among consultant paediatric neurosurgeons across the United Kingdom. Stages involved literature review, survey, focused online consultation and iterative revisions until >80% consensus was achieved.
Results: Twelve consultant paediatric neurosurgeons from 10 centres participated. A standardised protocol, including indications, a three-phase operative workflow (pre-ventricular, intraventricular, post-ventricular) and post-operative care, was agreed upon by 100% of participants. Case- and surgeon-specific variation was accommodated through the delineation of mandatory, optional, and not-recommended steps.
ConclusionExpert consensus on a standardised protocol for NEL was achieved, delineating the surgical workflow into three phases: pre-ventricular, intraventricular, and post-ventricular, each consisting of mandatory, optional, and not recommended steps. The work provides a platform for future trials, training, and implementation of NEL. | surgery |
10.1101/2022.03.20.22272634 | Presumptive Multidrug-Resistant Escherichia coli Isolated in Drinking Water and Soil Sources from Kadamaian, Sabah | Multidrug-resistant Escherichia coli poses a great threat to human health. E. coli is often used as a fecal indicator bacterium to study the sources and fate of fecal contamination in the environment. This study focused on water and soil samples from tourist parks around Kadamaian, Kota Belud, collected in three villages: Kampung Napungguk, Kampung Pinalabuh and Kampung Tinata. Of the 23 tested presumptive E. coli isolates, screening against a series of antibiotic concentrations (25, 50, 100 and 500 µg/ml) showed resistance towards chloramphenicol (91.3%), penicillin (82.6%), tetracycline (73.9%) and kanamycin (30.4%). These findings suggest the presence of multidrug resistance (MDR) in presumptive E. coli from soil samples of Kampung Napungguk and Kampung Tinata. This study highlights the presence of MDR, which warrants further research to identify the bacterial strains using molecular methods. The findings can then be used to raise awareness of potential disease outbreaks in the tourist areas. | occupational and environmental health
10.1101/2022.03.21.22272544 | Design and methodological considerations for biomarker discovery and validation in the Integrative Analysis of Lung Cancer Etiology and Risk (INTEGRAL) Program | The Integrative Analysis of Lung Cancer Etiology and Risk (INTEGRAL) program is an NCI-funded initiative with an objective to develop tools to optimize lung cancer screening. Here, we describe the rationale and design for the Risk Biomarker and Nodule Malignancy projects within INTEGRAL.
The overarching goal of these projects is to systematically investigate circulating protein markers to include on a panel for use (i) pre-LDCT, to identify people likely to benefit from screening, and (ii) post-LDCT, to differentiate benign versus malignant nodules. To identify informative proteins, the Risk Biomarker project measured 1,161 proteins in a nested case-control study within 2 prospective cohorts (n=252 lung cancer cases and 252 controls) and replicated associations for a subset of proteins in 4 cohorts (n=479 cases and 479 controls). Eligible participants had any history of smoking and cases were diagnosed within 3 years of blood draw. The Nodule Malignancy project measured 1,077 proteins among participants with a heavy smoking history within 4 LDCT screening studies (n=425 cases within 5 years of blood draw, 398 benign-nodule controls, and 430 nodule-free controls).
The INTEGRAL panel will enable absolute quantification of 21 proteins. We will evaluate its lung cancer discriminative performance in the Risk Biomarker project using a case-cohort study including 14 cohorts (n=1,696 cases and 2,926 subcohort representatives), and in the Nodule Malignancy project within 5 LDCT screening studies (n=675 cases, 648 benign-nodule controls, and 680 nodule-free controls). Future progress to advance lung cancer early detection biomarkers will require carefully designed validation, translational, and comparative studies. | oncology |
10.1101/2022.03.22.22272761 | Exploring patient experiences and concerns in the online Cochlear Implant community: a natural language processing approach | ImportanceThere is a paucity of research examining patient experiences of cochlear implants, with existing studies limited by small sample sizes and closed question-answer style formats.
ObjectiveTo use natural language processing methods to explore patient experiences and concerns in the online cochlear implant (CI) community when in conversation with each other.
DesignCross-sectional study of the online Reddit r/CochlearImplants forum. No date restrictions were imposed; data were retrieved on 11 November 2021 and analysed between November 2021 and March 2022. The following details were extracted for each post: 1) original post content, 2) post comments, and 3) metadata for the original post and user.
SettingOnline Reddit forum r/CochlearImplants.
ParticipantsConsecutive sample of all users posting on the r/CochlearImplants forum from 1 March 2015 to 11 November 2021.
Main Outcomes and MeasuresNatural language processing using the BERTopic automated topic modelling technique was employed to cluster posts into semantically similar topics. Topic categorisation was manually validated by two independent reviewers, and Cohen's kappa was calculated to determine inter-rater reliability between machine vs human and human vs human categorisation.
ResultsWe retrieved 987 posts from 588 unique Reddit users on the r/CochlearImplants forum. Posts were initially categorised by BERTopic into 16 different topics, which were increased to 22 topics following manual inspection. The most popular topics related to CI connectivity (n = 112), adults considering getting a CI (n = 107), surgery-related posts (n = 89) and day-to-day living with a CI (n = 85). Topics with the most comments included 'Choosing cochlear implant brand' (mean = 12.9 comments) and 'Adults considering getting a CI' (mean = 12.2 comments). Cohen's kappa among all posts was 0.62 (machine vs human) and 0.72 (human vs human), and among categorised posts was 0.85 (machine vs human) and 0.84 (human vs human).
Conclusions and RelevanceThis cross-sectional study of social media discussions amongst the online cochlear implant community identified common attitudes, experiences and concerns of patients living with, or seeking, a cochlear implant. Our validation of natural language processing methods to categorise topics shows that automated analysis of similar Otolaryngology-related content is a viable and accurate alternative to manual qualitative approaches.
Key PointsQuestionWhat are the main patient experiences and concerns reported by the online cochlear implant (CI) community?
FindingsIn this cross-sectional study of 987 posts from the Reddit forum r/CochlearImplants, we employed and validated a natural language processing approach to identify a variety of common themes discussed by current and prospective CI users. These ranged from CI connectivity and day-to-day living with a CI to queries about choosing a CI brand and post-op expectations.
MeaningAnalysis of social media using automated topic modelling techniques is a viable and insightful method to explore CI user experiences. | otolaryngology |
10.1101/2022.03.22.22272640 | Seasonality of Common Human Coronaviruses in the United States, 2014-2021 | The four common human coronaviruses (HCoVs), including two alpha (HCoV-NL63 and HCoV-229E) and two beta (HCoV-HKU1 and HCoV-OC43) types, generally cause mild, upper respiratory illness. HCoVs are known to have seasonal patterns and variation in predominant types each year, but defined measures of seasonality are needed. We defined seasonality of HCoVs during July 2014 to November 2021 in the United States using a retrospective method applied to National Respiratory and Enteric Virus Surveillance System (NREVSS) data. In the six HCoV seasons prior to 2020-2021, onsets ranged from October to November, peaks from January to February, and offsets from April to June; most (>93%) HCoV detections occurred within the defined seasonal onsets and offsets. The 2020-2021 HCoV season onset was delayed by 11 weeks compared to prior seasons, likely due to COVID-19 mitigation efforts. Better defining HCoV seasonality can inform clinical preparedness and the expected patterns of emerging HCoVs.
Article Summary LineThe typical common HCoV season in the United States starts between October and November, peaks towards the end of January, and ends between April and June, but the 2020-2021 season was markedly delayed compared to prior seasons. | infectious diseases |
10.1101/2022.03.22.22272751 | Predictors of poor outcome in patients diagnosed with drug-resistant tuberculosis in the Torres Strait / Papua New Guinea border region | Drug-resistant tuberculosis (DR-TB) is an ongoing challenge in the Torres Strait / Papua New Guinea border region. Treatment success rates have historically been poor for patients diagnosed with DR-TB, leading to increased transmission and resistance multiplication risk. This study aimed to identify predictors of poor outcome in patients diagnosed with DR-TB to inform programmatic improvements.
A retrospective study of all DR-TB cases who presented to Australian health facilities in the Torres Strait between 1 March 2000 and 31 March 2020 was performed. This time period covers four distinct TB programmatic approaches. Univariate and multivariate predictors of poor outcome were analysed. Poor outcome was defined as treatment default, treatment failure and death (versus cure or completion).
In total, 133 patients with resistance to at least one TB drug were identified. The vast majority (123/133; 92%) of DR-TB patients had pulmonary involvement; and of these, 41% (50/123) had both pulmonary and extrapulmonary TB. Poor outcomes were observed in 29% (39/133) of patients. Patients living with human immunodeficiency virus, renal disease or diabetes (4/133; 4/133; 3/133) had an increased frequency of poor outcome (p <0.05), but numbers were very small. Among all 133 DR-TB patients, 41% had a low lymphocyte count, which was significantly associated with poor outcome (p <0.05). Overall, outcomes improved in recent years, with a 50% increase in the chance of a good outcome per year group over the study period on binary logistic regression analysis. Being a close contact of a known TB case was associated with improved outcome.
While DR-TB treatment outcomes have improved over time, it remains important to prevent DR-TB spread and resistance multiplication resulting from suboptimal treatment. Enhanced surveillance for DR-TB, better cross border collaboration and consistent diagnosis and management of comorbidities and other risk factors should further improve patient care and outcomes. | infectious diseases |
10.1101/2022.03.20.22271891 | Immunogenic superiority and safety of Biological E CORBEVAX vaccine compared to COVISHIELD (ChAdOx1 nCoV-19) vaccine studied in a phase III, single blind, multicenter, randomized clinical trial | BackgroundThe optimum formulation of Biological E's CORBEVAX vaccine, which contains a protein subunit of the Receptor Binding Domain (RBD) from the spike protein of SARS-CoV-2 formulated with aluminum hydroxide (Al3+) and CpG1018 as adjuvants, was selected in phase 1 and 2 studies and proven to be safe, well tolerated and immunogenic in a healthy adult population. In the current study, additional data were generated to determine the immunogenic superiority of the CORBEVAX vaccine over the COVISHIELD vaccine and its safety in a larger and older population.
MethodsThis is a phase III prospective, single-blinded, randomized, active-controlled study (CTRI/2021/08/036074) conducted at 20 sites across India in healthy adults aged 18-80 years. The study has two arms: an immunogenicity arm and a safety arm. Participants in the immunogenicity arm were randomized equally to either the CORBEVAX or the COVISHIELD vaccination group to determine immunogenic superiority. Healthy adults without a history of Covid-19 vaccination or SARS-CoV-2 infection were enrolled.
FindingsThe safety profile of the CORBEVAX vaccine was comparable to that of the comparator vaccine COVISHIELD in terms of overall AE rates, related AE rates and medically attended AEs. The majority of reported AEs were mild in nature, and overall CORBEVAX appeared to cause fewer local and systemic adverse reactions/events. Two grade-3 serious AEs (dengue fever and femur fracture) were reported, and both were unrelated to the study vaccine. Neutralizing antibody titers against both the Ancestral and Delta strains induced by the two-dose vaccination regimen were higher in the CORBEVAX arm than in the COVISHIELD arm, and the analysis of GMT ratios demonstrated the immunogenic superiority of CORBEVAX in comparison with COVISHIELD. Both CORBEVAX and COVISHIELD vaccines showed comparable seroconversion post vaccination when assessed against the anti-RBD IgG response. Subjects in the CORBEVAX cohort also exhibited more interferon-gamma-secreting PBMCs post stimulation with SARS-CoV-2 RBD peptides than subjects in the COVISHIELD cohort.
InterpretationsNeutralizing antibody titers induced by the CORBEVAX vaccine against the Delta and Ancestral strains were protective, indicative of vaccine effectiveness of >90% for prevention of symptomatic infections based on the Correlates of Protection assessment performed during the Moderna and AstraZeneca vaccine phase III studies. Safety findings revealed that the CORBEVAX vaccine has an excellent safety profile when tested in a larger and older population.
FundingBIRAC-division of Department of Biotechnology, Government of India, and the Coalition for Epidemic Preparedness Innovations funded the study. | infectious diseases |
10.1101/2022.03.21.22272672 | Determinants of passive antibody effectiveness in SARS-CoV-2 infection | Neutralising antibodies are an important correlate of protection from SARS-CoV-2 infection. Multiple studies have investigated the effectiveness of passively administered antibodies (either monoclonal antibodies, convalescent plasma or hyperimmune immunoglobulin) in preventing acquisition of or improving the outcome of infection. Comparing the results between studies is challenging due to different study characteristics including disease stage, trial enrolment and outcome criteria, and different product factors, including administration of polyclonal or monoclonal antibody, and antibody targets and doses. Here we integrate data from 37 randomised controlled trials to investigate how the timing and dose of passive antibodies predicts protection from SARS-CoV-2 infection. We find that both prophylactic and early therapeutic administration (to symptomatic ambulant subjects) have significant efficacy in preventing infection or progression to hospitalisation respectively. However, we find that effectiveness of passive antibody therapy in preventing clinical progression is significantly reduced with administration at later clinical stages (p<0.0001). To compare the dose-response relationship between different treatments, we normalise the administered antibody dose to the predicted neutralisation titre (after dilution) compared to the mean titre observed in early convalescent subjects. We use a logistic model to analyse the dose-response curve of passive antibody administration in preventing progression from symptomatic infection to hospitalisation. We estimate a maximal protection from progression to hospitalisation of 70.2% (95% CI: 62.1 - 78.3%). The dose required to achieve 50% of the maximal effect (EC-50) for prevention of progression to hospitalisation was 0.19-fold (95% CI: 0.087 - 0.395) of the mean early convalescent titre. 
This suggests that for current monoclonal antibody regimes, doses between 7- and >1000-fold lower than currently used could still achieve around 90% of the current effectiveness (depending on the variant) and allow much more widespread use at lower cost. For convalescent plasma, most current doses are lower than required for high levels of protection. This work provides a framework for the rational design of future passive antibody prophylaxis and treatment strategies for COVID-19. | infectious diseases |
10.1101/2022.03.21.22272725 | Prevalence of neutralizing antibody to human coronavirus 229E in Taiwan | BackgroundFour members of the coronavirus family circulating in the community, including 229E, are known to cause mild respiratory tract infections in humans. Epidemiologic information on the seasonal human coronaviruses (HCoVs) may help gain insight into the development of the ongoing coronavirus disease 2019 (COVID-19) pandemic.
MethodsA plasma collection of 1558 samples had been obtained in 2010 to estimate the prevalence and severity of 2009 pandemic influenza A H1N1 in Taiwan. Of the 1558 samples, 200 were randomly selected from donors aged < 1 year to > 60 years. Neutralizing antibody titers to HCoV-229E were determined in the sera using live virus ATCC® VR-740 cultivated in the Huh-7 cell line.
ResultsSeroconversion to HCoV-229E (titer ≥ 1:2) was identified as early as less than 5 years of age. Among the 140 subjects aged 40 years or younger, all had uniformly low titers (< 1:10), and the geometric mean titers (GMTs) were not significantly different among those aged 0-5, 6-12, 13-18 and 19-40 years (P > 0.1). Of the 60 subjects older than 40 years, a majority (39, 65%) had high titers (≥ 1:10), and GMTs increased significantly with advancing age (P < 0.0001). Age was the most significant predictor of seropositivity in the multivariate analysis, with an adjusted odds ratio of 1.107 and a 95% adjusted confidence interval of 1.061-1.155 (P < 0.0001).
ConclusionHCoV-229E infection occurred as early as before 5 years of age in the Taiwanese population, and the magnitude of neutralizing titers against HCoV-229E increased with advancing age beyond 40 years. | infectious diseases
10.1101/2022.03.21.22272698 | Assessment of cardiovascular & pulmonary pathobiology in vivo during acute COVID-19 | ImportanceAcute COVID-19-related myocardial, pulmonary and vascular pathology, and how these relate to each other, remains unclear. No studies have used complementary imaging techniques, including molecular imaging, to elucidate this.
ObjectiveWe used multimodality imaging and biochemical sampling in vivo to identify the pathobiology of acute COVID-19.
Design, Setting and ParticipantsConsecutive patients presenting with acute COVID-19 were recruited during hospital admission in a prospective cross-sectional study. Imaging involved computed-tomography coronary-angiography (CTCA - identified coronary disease), cardiac 2-deoxy-2-[fluorine-18]fluoro-D-glucose positron-emission tomography/computed-tomography (18F-FDG-PET/CT - identified vascular, cardiac and pulmonary inflammatory cell infiltration) and cardiac magnetic-resonance (CMR - identified myocardial disease), alongside biomarker sampling.
ResultsOf 33 patients (median age 51 years, 94% male), 24 (73%) had respiratory symptoms, with the remainder having non-specific viral symptoms. Nine patients (35%, n=9/25) had CMR-defined myocarditis. 63% (n=5/8) of these patients had myocardial inflammatory cell infiltration. Two patients (5%) had elevated troponin levels. Cardiac troponin concentrations were not significantly higher in patients with myocarditis (8.4ng/L [IQR 4.0-55.3] vs 3.5ng/L [2.5-5.5], p=0.07) or myocardial cell infiltration (4.4ng/L [3.4-8.3] vs 3.5ng/L [2.8-7.2], p=0.89). No patients had obstructive coronary artery disease or vasculitis. Pulmonary inflammation and consolidation (percentage of total lung volume) were 17% (IQR 5-31%) and 11% (7-18%) respectively. Neither was associated with the presence of myocarditis.
Conclusions and relevanceMyocarditis was present in a third of patients with acute COVID-19, and the majority had inflammatory cell infiltration. Pneumonitis was ubiquitous, but this inflammation was not associated with myocarditis. The mechanism of cardiac pathology is non-ischaemic and not due to a vasculitic process.
Key PointsQuestionWhat is the pathobiology of the cardiac, pulmonary and vascular systems during acute COVID-19 infection?
FindingsOver a third of patients with acute COVID-19 had myocarditis by cardiac MRI criteria. Myocardial inflammatory cell infiltration was present in about two thirds of patients with myocarditis. No associations were observed between the degree of pulmonary involvement and presence of myocarditis. There was no evidence of obstructive coronary artery disease or evidence of large vessel vasculitis.
MeaningMyocarditis is common in acute COVID-19 infection, and may be present in the absence of significant pulmonary involvement. The cause of myocarditis is inflammatory cell infiltration in the majority of cases, but in about a third of cases this is not present. The mechanism of cardiac pathology in acute COVID-19 is non-ischaemic, and vascular thrombosis in acute COVID-19 is not due to significant coronary artery disease or a vasculitic process. | cardiovascular medicine |
10.1101/2022.03.18.22272447 | Child protection contact among children of culturally and linguistically diverse backgrounds: a South Australian linked data study | AimTo determine the cumulative incidence of child protection (CP) system contact, maltreatment type, source of reports up to age 7, and sociodemographic characteristics for Culturally and Linguistically Diverse (CALD) Australian children.
MethodsA South Australian (SA) whole-of-population linked administrative data study of children followed from birth up to age 7, using child protection, education, health, and birth registration data. Participants: SA-born children enrolled in their first year of school from 2009-2015 (N = 76 563). CALD was defined as children who were not Aboriginal or Torres Strait Islander and whose spoken language was other than English, Indigenous, or Sign, or who had at least one parent born in a non-English-speaking country. Outcome measures: For CALD and non-CALD children, the cumulative incidence of CP reports up to age 7, relative risks and risk differences for all CP contact (reporting through to out-of-home care (OOHC)), and age, primary maltreatment type, reporter type, and socioeconomic characteristics were estimated. Sensitivity analyses explored population selection and different CALD definitions.
ResultsBy age 7, 11.2% of CALD children were screened in compared to 18.8% of non-CALD children (RD 7.6 percentage points (95% CI: 6.9-8.3)), and 0.6% of CALD children experienced OOHC compared to 2.2% of non-CALD children (RD 1.6 percentage points (95% CI: 1.3-1.8)). In both groups, the most common abuse type was emotional abuse, and the most common reporter types were police and the education sector. Socioeconomic characteristics were broadly similar. Sensitivity analysis results were consistent with the primary analyses.
ConclusionBy age 7, contact with any level of child protection was lower for CALD compared to non-CALD children. Estimates based on primary and sensitivity analyses suggested CALD children were 5 to 9 percentage points less likely to have a report screened in, and 1.0 to 1.7 percentage points less likely to have experienced OOHC.
What is already known on this topicIn Australia, Aboriginal and/or Torres Strait Islander children are over represented in the child protection system.
In England and the United States, child protection contact varies between ethnicities, with over-representation of Black and African American children and under-representation of Asian children.
Disparities in child protection contact between ethnicities are largely driven by socioeconomic disadvantage.
What this paper addsCALD children were less likely to have contact across all levels of child protection from reporting to OOHC. By age 7, 11.2% of CALD children were screened-in compared to 18.8% of non-CALD.
Sensitivity analysis suggested CALD children were 5 to 9 percentage points less likely to have a report screened-in, and from 1.0 to 1.7 percentage points less likely to have experienced OOHC.
CALD and non-CALD groups did not markedly differ by the type of maltreatment, source of report, or on background socioeconomic factors. | epidemiology |
10.1101/2022.03.22.22272748 | Venous Thromboembolism in Ambulatory Covid-19 patients: Clinical and Genetic Determinants | BackgroundSubstantial evidence suggests that severe Covid-19 leads to an increased risk of Venous Thromboembolism (VTE). We aimed to quantify the risk of VTE associated with ambulatory Covid-19, study the potential protective role of vaccination, and establish key clinical and genetic determinants of post-Covid VTE.
MethodsWe analyzed a cohort of ambulatory Covid-19 patients from UK Biobank, and compared their 30-day VTE risk with propensity-score-matched non-infected participants. We fitted multivariable models to study the associations between age, sex, ethnicity, socio-economic status, obesity, vaccination status and inherited thrombophilia with post-Covid VTE.
ResultsOverall, VTE risk was nearly 20-fold higher in Covid-19 vs matched non-infected participants (hazard ratio [HR] 19.49, 95% confidence interval [CI] 11.50 to 33.05). However, the risk was substantially attenuated amongst the vaccinated (HR: 2.79, 95% CI 0.82 to 9.54). Older age, male sex, and obesity were independently associated with higher risk, with adjusted HRs of 2.00 (1.61 to 2.47) per 10 years, 1.66 (1.28 to 2.15), and 1.85 (1.29 to 2.64), respectively. Further, inherited thrombophilia was associated with an HR of 2.05 (95% CI 1.15 to 3.66).
ConclusionsAmbulatory Covid-19 was associated with a striking 20-fold increase in incident VTE, but no elevated risk after breakthrough infection in the fully vaccinated. Older age, male sex, and obesity were clinical determinants of Covid-19-related VTE. Additionally, inherited thrombophilia doubled risk further, comparable to the effect of 10-year ageing. These findings reinforce the need for vaccination, and call for targeted strategies to prevent VTE during outpatient care of Covid-19. | epidemiology |
10.1101/2022.03.18.22272576 | Biofabrication of multiplexed electrochemical immunosensors for simultaneous detection of clinical biomarkers in complex fluids | Simultaneous detection of multiple disease biomarkers in unprocessed whole blood is considered the gold standard for accurate clinical diagnosis. Here, we report the development of a 4-plex electrochemical (EC) immunosensor with on-chip negative control capable of detecting a range of biomarkers in small volumes (15 {micro}L) of complex biological fluids, including serum, plasma, and whole blood. A framework for fabricating and optimizing multiplexed sandwich immunoassays is presented that is enabled by use of EC sensor chips coated with an ultra-selective, antifouling, nanocomposite coating. Cyclic voltammetry evaluation of sensor performance was carried out by monitoring the local precipitation of an electroactive product generated by horseradish peroxidase linked to a secondary antibody. EC immunosensors demonstrated high sensitivity and specificity without background signal, with a limit of detection in the single-digit pg/mL range in multiple complex biological fluids. These multiplexed immunosensors enabled simultaneous detection of four different biomarkers in plasma and whole blood with excellent sensitivity and selectivity. This rapid and cost-effective biosensor platform can be further adapted for use with different high-affinity probes for any biomarker, thereby creating a new class of highly sensitive and specific multiplexed diagnostics. | neurology
10.1101/2022.03.18.22272564 | Oral injury in Kendo players: a cross-sectional survey | This study evaluated the risk of oral injury in Kendo. We hypothesized that Kendo players in Japan may experience oral injury due to wearing a face protector for Kendo (men) with inaccurate measurements or wearing it incorrectly. The survey included 400 Kendo players (male, 276; female, 174) and covered four areas: participant characteristics, the percentage of oral injury, temporomandibular disorder (TMD), and use of the men. Based on their experience of oral injury, participants were classified into trauma and non-trauma groups. Those who had suffered oral injury numbered 179 (44.8%; males, n=118; females, n=61). In the past month, 32 (8.2%) Kendo players reported that their matches/training were affected by dental/oral problems. For TMD, 50 participants had a total score >8.5 on four screening questions. For 289 (72.3%) participants, the men they currently wear fits well. Of those who reported that their men does not fit properly, 48 (12.0%) felt that their men was too large, and 14 (3.5%) felt that it was too small. Years of experience and clenching one's jaws during an offensive Kendo movement significantly contributed to oral injury. Clenching one's jaws during a defensive Kendo move, current men fit, sleep bruxism, and morning symptoms from sleep bruxism significantly contributed to TMD. A men with accurate sizing allows a proper field of vision, eliminating shifting of the head inside the men during strikes and movements. This suggests that vision can be secured by adjusting the ties on the men, chin movement, and the way one looks through the men. Since Kendo equipment is a traditional Japanese craft, it is highly artistic; thus, properly worn Kendo equipment offers both higher functionality and esthetic appeal. | sports medicine
10.1101/2022.03.19.22272537 | Prognostic accuracy of qSOFA score, SIRS criteria, and EWSs for in-hospital mortality among adult patients presenting with suspected infection to the emergency department (PASSEM): protocol for an international multicentre prospective external validation cohort study | IntroductionEarly identification of a patient with infection who may develop sepsis is of utmost importance. Unfortunately, this remains elusive because no single clinical measure or test can reflect the complex pathophysiological changes in patients with sepsis. However, multiple clinical and laboratory parameters indicate impending sepsis and organ dysfunction. Screening tools using these parameters, such as the SIRS criteria, quick SOFA (qSOFA), National Early Warning Score (NEWS), or Modified Early Warning Score (MEWS), can help identify the condition. We aim to externally validate qSOFA, SIRS, and NEWS/NEWS2/MEWS for in-hospital mortality among adult patients with suspected infection presenting to the emergency department.
Methods and analysisPASSEM is an international prospective external validation cohort study. For 9 months, each participating centre will recruit consecutive adult patients who visit the emergency department with suspected infection and are planned for hospitalisation. We will collect patients' demographics, vital signs measured in triage, initial white blood cell count, and the variables required to calculate the Charlson Comorbidity Index, and will follow patients for 90 days from their inclusion in the study. The primary outcome will be 30-day in-hospital mortality. The secondary outcomes will be intensive care unit (ICU) admission, prolonged stay in the ICU (i.e., >72 hours), and 30- as well as 90-day all-cause mortality. The study started in December 2021 and plans to enrol 2851 patients to reach 200 in-hospital deaths. The sample size is adaptive and will be adjusted based on prespecified consecutive interim analyses.
Ethics and disseminationThe Aseer Regional Committee for Research Ethics in the General Directorate of Health Affairs-Aseer Region has approved this study. Informed consent is not required due to the study's purely observational nature and the absence of interventions or invasive procedures. We will publish the study's results in peer-reviewed journals and may present them at scientific conferences.
Study registration numberClinicalTrials.gov identifier: NCT05172479 | emergency medicine |
10.1101/2022.03.19.22271723 | Support needs of people living with obesity during transition from tertiary obesity treatment to community care | BackgroundAs the number of people living with obesity increases, the maintenance of treatment outcomes is especially pertinent. Treatment at tertiary obesity services has proven successful, but patients need to be transitioned out of these services to community-based care to accommodate the influx of new patients. Little is known about the support needs of patients after transition from acute tertiary obesity services. It is important to establish the support these patients need, especially in the context of maintaining treatment outcomes and ensuring continuity of care.
MethodsA qualitative study was conducted to identify the support needs of people with obesity as they transition to community care. Patients and clinicians recruited from a tertiary obesity clinic participated in semi-structured interviews and focus groups to explore factors influencing transition and the support needed in the community. Data were collected through audio recordings, transcribed verbatim, and analysed thematically.
ResultsA total of 16 patients and 7 clinicians involved in the care of these patients participated between July 2020 and July 2021. Themes identified included the influence of clinic and individual factors on transition, the benefits of phased transition, patient-centred communication, and the role of social support. It was found that dependency and lack of self-efficacy, as well as low social support, hindered transition efforts. It was also identified that patients required substantial integrated professional and social support structures in the community to adequately address their care needs both during and following transition.
ConclusionInterventions are needed to provide social community services following transition to ensure adequate community care that can support the maintenance of treatment outcomes. Such services should be integrated and address the social needs of people living with obesity. | endocrinology |
10.1101/2022.03.21.22272734 | Prevalence and determinants of peripheral arterial disease in children with nephrotic syndrome | Peripheral arterial disease (PAD) is the least studied complication of nephrotic syndrome (NS). Risk factors that predispose children with NS to developing PAD include hyperlipidaemia, hypertension and prolonged use of steroids. The development of PAD significantly increases the morbidity and mortality associated with NS, as such children are prone to sudden cardiac death. The ankle-brachial index (ABI) is a tool proven to have high specificity and sensitivity in detecting PAD, even in asymptomatic individuals. We aimed to determine the prevalence of PAD in children with NS and to identify risk factors that can independently predict its development. A comparative cross-sectional study was conducted involving 200 subjects (100 with NS and 100 apparently healthy comparison subjects matched for age, sex and socioeconomic class). Systolic blood pressures were measured in all limbs using a pocket Doppler machine (Norton Doppler scan machine). ABI was calculated as the ratio of ankle to arm systolic blood pressure, and PAD was defined as an ABI of less than 0.9. The prevalence of PAD was significantly higher in children with NS than in the matched comparison group (44.0% vs 6.0%, p < 0.001). Average values of waist and hip circumference were significantly higher in subjects with PAD than in those without PAD (61.68 ± 9.1 cm and 67.6 ± 11.2 cm vs 57.03 ± 8.3 cm and 65.60 ± 12.5 cm respectively, p < 0.005). Serum lipids (triglyceride, very low-density lipoprotein, total cholesterol and low-density lipoprotein) were also significantly higher in subjects with PAD than in those without PAD [106.65 mg/dl (67.8-136.7) vs 45.72 mg/dl (37.7-61.3), 21.33 mg/dl (13.6-27.3) vs 9.14 mg/dl (7.5-12.3), 164.43 mg/dl (136.1-259.6) vs 120.72 mg/dl (111.1-142.1) and 93.29 mg/dl (63.5-157.3) vs 61.84 mg/dl (32.6-83.1), respectively; p < 0.05].
Increasing duration since diagnosis of NS, having steroid-resistant NS and increasing cumulative steroid dose were independent predictors of PAD in children with NS (p < 0.05 for each). | pediatrics |
10.1101/2022.03.19.22272563 | Translation and validation of the Cultural Competence Self-assessment Checklist of Central Vancouver Island Multicultural Society for Health Professionals | IntroductionThe World Health Organisation emphasizes the importance of training future healthcare practitioners to practice respectful and person-centred health care. The importance of this can be demonstrated in the example of cultural competence, which has been observed to be associated with improved patient satisfaction and concordance with recommended treatment. The aim of this study was to translate the Cultural Competence Self-assessment Checklist of the Central Vancouver Island Multicultural Society into Greek, validate it, and test it in a population of health scientists in Cyprus.
MethodsA cross-sectional study was conducted between October 2021 and January 2022 among 300 health scientists in Cyprus using convenience sampling. The sample consisted of doctors, nurses, psychologists, social workers and physiotherapists. To test the questionnaire's internal consistency reliability, we used Cronbach's coefficient alpha.
ResultsAfter translation of the Cultural Competence Self-assessment Checklist, Cronbach's alpha was 0.7 in all three thematic units of the checklist. A total of 300 participants completed the research tool: 241 women (80.3%) and 59 men (19.6%). Only 2% of the sample had previously attended cultural competence training or had relevant expertise.
ConclusionsThe Greek version of the Cultural Competence Self-assessment Checklist of the Central Vancouver Island Multicultural Society is a valid instrument that can be used with Greek-speaking health scientists in both Cyprus and Greece. | public and global health |
10.1101/2022.03.19.22272419 | Post-intravitreal injection endophthalmitis pattern during the COVID-19 pandemic with implementation of patient masking | PurposeTo evaluate the role of patient face masking in the occurrence of post-intravitreal injection (IVI) endophthalmitis in a real-world setting.
DesignRetrospective cohort.
ParticipantsPatients receiving IVIs between 20 February 2019 and 20 February 2021: a 12-month period before the official beginning of the COVID-19 epidemic in Iran and the 12-month period after it.
InterventionIn the pre-COVID era, patients underwent IVI without a face mask, while in the COVID era patients were treated wearing an untaped face mask. Physicians and staff wore face masks in both periods. IVIs were administered in a dedicated operating room, and no strict no-talking policy was followed.
Main outcome measureThe rate of post-IVI endophthalmitis.
ResultsA total of 53,927 injections were performed during the study period: 34,277 in the pre-COVID and 19,650 in the COVID period, a 42.7% decrease in the number of injections. Endophthalmitis occurred in 7 eyes (0.02%) in the pre-COVID and 7 eyes (0.03%) in the COVID era (p=0.40). In multivariate analysis, after adjustment for intercorrelations between eyes and multiple injections in one patient, there was no statistically significant association between patients' wearing of face masks and risk of endophthalmitis (relative risk = 1.47, 95% confidence interval 0.97-2.22; p=0.071).
ConclusionPatients' face masking is probably not associated with an increased risk of post-injection endophthalmitis. | ophthalmology |
10.1101/2022.03.21.22272720 | Vegetable intake and metabolic risk factors: A Mendelian randomization study | BackgroundThe associations between vegetable intake and metabolic risk factors remain inconsistent. This study aimed to investigate the associations of cooked and raw vegetable intake with serum lipids, body mass index (BMI), blood pressure and glycemic traits.
MethodsThis was a two-sample Mendelian randomization (MR) study. Nine and 19 genetic variants were identified from genome-wide association studies (GWAS) as instrumental variables for cooked and raw vegetable intake, respectively. Summary-level statistics were used from GWAS of total cholesterol, triglycerides, high-density lipoprotein, low-density lipoprotein, systolic blood pressure, diastolic blood pressure, BMI, fasting glucose, fasting insulin, glycated haemoglobin and 2-hour glucose after an oral glucose tolerance test. Multivariable MR with the inverse-variance weighted method was performed as the primary analysis, while the median-based method and MR-Egger method were performed as sensitivity analyses.
ResultsVegetable intake was not associated with total cholesterol (-0.06 [-0.30, 0.18] and -0.02 [-0.20, 0.16] for each one-serving increase of cooked and raw vegetable intake, respectively) or triglycerides (-0.12 [-0.36, 0.13] and 0.03 [-0.15, 0.22]). Null evidence was observed for associations between vegetable intake and lipoproteins, blood pressure, BMI, insulin and glycemic measures. Sensitivity analyses generated similar null associations.
ConclusionsWe found null associations between vegetable intake and metabolic risk factors. This is consistent with previous MR findings of null associations between diet-derived antioxidants and metabolic risk factors. | cardiovascular medicine |
10.1101/2022.03.21.22272719 | Vegetable intake and cardiovascular risk: genetic evidence from Mendelian randomization | BackgroundObservational studies have demonstrated inverse associations between vegetable intake and cardiovascular diseases. However, these results are prone to residual confounding. The separate effects of cooked and raw vegetable intake remain unclear. This study aims to investigate the associations of cooked and raw vegetable intake with cardiovascular outcomes using Mendelian randomization (MR).
MethodsWe identified 15 and 28 genetic variants associated statistically and biologically with cooked and raw vegetable intake, respectively, which were used as instrumental variables to estimate the associations with coronary heart disease (CHD), stroke, heart failure (HF) and atrial fibrillation (AF). In the one-sample analysis, using individual participant data from UK Biobank, we adopted the two-stage least squares approach. In the two-sample analysis, we used summary-level statistics from genome-wide association analyses. The independent effects of cooked and raw vegetable intake were examined with multivariable MR analysis. The one-sample and two-sample estimates were combined via meta-analysis. Bonferroni correction was applied for multiple comparisons.
ResultsIn the meta-analysis of 1.2 million participants on average, we found null evidence for associations of cooked and raw vegetable intake with CHD, HF or AF. Raw vegetable intake was nominally associated with stroke (odds ratio [95% confidence interval] 0.82 [0.69-0.98] per 1-serving daily increase, p = 0.03), but this association did not pass the corrected significance level.
ConclusionsCooked and raw vegetable intake was not associated with CHD, AF or HF. Raw vegetable intake may reduce the risk of stroke, but this warrants more research. Solely increasing vegetable intake may provide limited protection, if any, for cardiovascular health. This calls for more rigorous assessment of the health burden associated with low vegetable consumption. | cardiovascular medicine |
10.1101/2022.03.20.22272549 | Viral load dynamics of SARS-CoV-2 Delta and Omicron variants following multiple vaccine doses and previous infection | An important, and often neglected, aspect of vaccine effectiveness is its impact on pathogen transmissibility, harboring major implications for public health policies. As viral load is a prominent factor affecting infectivity, its laboratory surrogate, the qRT-PCR cycle threshold (Ct), can be used to investigate the infectivity-related component of vaccine effectiveness. While vaccine waning has previously been observed for viral load during the Delta wave, it is yet unknown how Omicron viral load is affected by vaccination status, and whether vaccine-derived and natural infection protection are sustainable. By analyzing results from more than 460,000 individuals, we show that while recent vaccination reduces Omicron viral load, its effect wanes rapidly. In contrast, a significantly slower waning rate is demonstrated for recovered COVID-19 individuals. Thus, while vaccines are effective in decreasing morbidity and mortality, their relatively minute effect on transmissibility and rapid waning call for reassessment of the scientific justification for the "vaccine certificate", as it may promote false reassurance and promiscuous behavior. | epidemiology |
10.1101/2022.03.20.22272571 | Burden of PCR-Confirmed SARS-CoV-2 Reinfection in the U.S. Veterans Administration, March 2020 - January 2022 | An essential precondition for successful "herd immunity" strategies for the control of SARS-CoV-2 is that reinfection with the virus be relatively rare. Some infection control, prioritization, and testing strategies for SARS-CoV-2 were designed on the premise of rare reinfection. The U.S. Veterans Health Administration (VHA) includes 171 medical centers and 1,112 outpatient sites of care, with widespread SARS-CoV-2 test availability. We used the VHA's unified, longitudinal electronic health record to measure the frequency of reinfection with SARS-CoV-2 at least 90 days after initial diagnosis.
We identified 308,051 initial cases of SARS-CoV-2 infection diagnosed in VHA between March 2020 and January 2022; 58,456 (19.0%) were associated with VHA hospitalizations. A second PCR-positive test occurred in 9,203 patients in VHA at least 90 days after their first positive test in VHA; 1,562 (17.0%) were associated with VHA hospitalizations. An additional 189 cases were identified as PCR-positive a third time at least 90 days after their second PCR-positive infection in VHA; 49 (25.9%) were associated with VHA hospitalizations.
The absolute number of reinfections increased from fewer than 500 per month through November 2021 to over 4,000 per month in January 2022. | public and global health |
10.1101/2022.03.20.22272324 | Impact of Transition from Conventional Open Radical Cystectomy to Laparoscopic Radical Cystectomy for Neobladder: A Retrospective Study | BackgroundEarly operative recovery and good quality of life are important goals of radical cystectomy. We compared the pre-, peri- and postoperative data between open radical cystectomy (ORC) and laparoscopic radical cystectomy (LRC) with neobladder.
Patients and MethodsA retrospective analysis of 13 consecutive male patients who underwent radical cystectomy by a single surgeon was performed. All patients had a diagnosis of invasive bladder cancer. Abdominal imaging and preoperative staging were done using computed tomography. None received neoadjuvant chemotherapy. All patients received the same standard-template bilateral pelvic lymphadenectomy. The urinary diversion was an orthotopic neobladder. All patients consented prior to study participation.
ResultsOf the 13 male patients, six underwent ORC with neobladder while seven underwent LRC. Baseline characteristics (age, BMI, comorbidities, tumour grade, lymph node status) were similar in both groups. Incision length was significantly smaller in the LRC group than in the ORC group (p < 0.0001). Although the operative time was longer in the LRC group than in the ORC group, this was offset by a reduced duration of analgesic use and a shorter hospital stay (p < 0.05), as well as an earlier time to liquid intake with immediate removal of the nasogastric tube (p < 0.001). No major complications were observed in the LRC group, unlike the ORC group, where one patient died within 30 days.
ConclusionsBased on the observations of our small study sample, peri- and postoperative outcomes are promising for LRC compared with ORC in patients undergoing neobladder reconstruction, in terms of smaller incision length, less pain, fewer complications and speedier recovery, without jeopardizing oncological outcomes. The surgeon's transition from ORC to LRC was advantageous to patients. | urology |
10.1101/2022.03.21.22271857 | Development and Validation of the Michigan Chronic Disease Simulation Model (MICROSIM) | Strategies to prevent or delay Alzheimer's disease and Alzheimer's disease-related dementias (AD/ADRD) are urgently needed. Blood pressure (BP) management is a promising strategy for AD/ADRD prevention and the key element in the primary and secondary prevention of atherosclerotic cardiovascular disease (ASCVD), yet the effects of different population-level BP control strategies across the life course on AD/ADRD are not known.
Large-scale randomized controlled trials are the least biased approach to identifying the effect of BP control on AD/ADRD, yet trials may be infeasible due to the need for prolonged follow-up and very large sample sizes. Thus, simulation analyses leveraging the best available observational data may be the best and most practical approach to answering these questions.
In this manuscript, we describe the design principles, implementation details, and population-level validation of a novel population health microsimulation framework, the MIchigan ChROnic Disease SIMulation (MICROSIM), for The Effect of Lower Blood Pressure over the Life Course on Late-life Cognition in Blacks, Hispanics, and Whites (BP COG) study of the effect of BP levels over the life course on cognitive decline and dementia.
MICROSIM was designed by applying computer programming best practices to create a novel simulation model. The initial purpose of this extensible, open-source framework is to explore a series of questions related to the impact of different blood pressure management strategies on late-life cognition and all-cause dementia, as well as the effects on race differences in all-cause dementia incidence. Ultimately, though, the framework is designed to be extensible such that a variety of different clinical conditions could be added to the framework. | health informatics |
10.1101/2022.03.21.22270828 | Home care follow-up determines the point of inversion of IL-6 levels in relation to C-reactive protein as the cytokine storm marker in COVID-19 | IL-6 has been used for characterization of the cytokine storm induced by SARS-CoV-2, but so far no one has determined when and in whom the cytokine storm develops. Our study demonstrates how early, longitudinal, clinically based monitoring and measurement of five markers (C-reactive protein, IL-6, fibrinogen, ferritin and D-dimer) helped to identify who had developed the cytokine storm. A peak of IL-6 in pg/mL proportionally higher than the peak of CRP in mg/L was sufficient to define the timing of the evolution of cytokine storm syndrome. The administration of antibiotic therapy, anticoagulant therapy and pulse therapy resolved the infection and prevented the progressive deterioration of lung function in patients with the potential to develop severe COVID-19. | infectious diseases |
10.1101/2022.03.21.22272679 | Mass drug administration campaigns: Comparing two approaches for schistosomiasis and soil-transmitted helminths prevention and control in selected southern Malawi districts | Preventive chemotherapy using mass drug administration (MDA) is one of the key interventions recommended by the WHO to control neglected tropical diseases. In Malawi, health workers distribute anthelminthic drugs annually, mostly with donor support. The mean community coverage of MDA from 2018 to 2020 was high, at 87% for praziquantel and 82% for albendazole; this, however, poses a sustainability challenge once donor support diminishes. This study was conducted to compare the use of the community-directed intervention (CDI) approach with the use of health workers in the delivery of MDA. It was carried out in three districts, where a cross-sectional, mixed-methods approach to data collection was used during baseline and follow-up assessments.
Knowledge levels were high for what schistosomiasis is (65%-88%) and what STH are (32%-83%), and low for the causes of schistosomiasis (32%-58%), the causes of STH (7%-37%), the intermediate organisms for schistosomiasis (13%-33%) and the types of schistosomiasis (2%-26%). At follow-up, increases in praziquantel coverage were registered in control (86% to 89%) and intervention communities (83% to 89%); decreases were recorded for control (86% to 53%) and intervention schools (79% to 59%). Assessment of implementation costs indicated that most resources were used at the community (51%), health centre (29%) and district levels (19%). The intervention arm used more resources at the health centre (27%) and community levels (44%) than the control arm, at 2% and 4% respectively. Health workers and community members perceived the use of the CDI approach as a good initiative, more favourable than the standard practice of delivering MDA.
The use of CDI in the delivery of MDA campaigns against schistosomiasis and STH is feasible, increases coverage, and is acceptable in intervention communities. This could be a way forward in addressing the sustainability concern when donor support wanes.
Trial RegistrationPACTR202102477794401
Author summaryThe World Health Organization recommends mass drug administration (MDA) as a key control measure against neglected tropical diseases. In Malawi, community-based health workers distribute drugs for schistosomiasis and soil-transmitted helminths (STH) annually, using mostly donor support, which raises concerns about the programme's sustainability without such support. This study compared the use of local community people as volunteers in the delivery of effective MDA against schistosomiasis and STH, defined as the community-directed intervention (CDI) approach, with the current standard practice of using community-based health workers. MDA coverage in both groups was high, with community-based health workers, volunteers, community leaders and people welcoming the CDI approach as a good, convenient, acceptable and satisfactory initiative. Therefore, the CDI approach is a positive and sustainable move towards successful delivery of MDA against schistosomiasis and STH in endemic, resource-limited settings, using local community volunteers. | infectious diseases |
10.1101/2022.03.21.22271747 | Clinical validation and the evaluation of a colorimetric SARS-CoV-2 RT-LAMP assay identify its robustness against RT-PCR | The novel coronavirus has infected millions of people around the world and poses a great risk to global health. Rapid and accurate tests are needed to take early precautions and control the disease. The most routinely used method is real-time polymerase chain reaction (RT-PCR), which stands as the gold standard for the detection of SARS-CoV-2 viral RNA. However, robust assays as accurate as RT-PCR have been developed for rapid diagnosis and efficient control of the spread of the disease. Reverse transcriptase loop-mediated isothermal amplification (RT-LAMP) is a time-saving, accurate and cost-effective alternative to RT-PCR. In this study, we evaluate an improved colorimetric RT-LAMP assay (N-Fact) that detects SARS-CoV-2 viral RNA within 30 minutes using a primer set specific to the N gene. Moreover, the colorimetric RT-LAMP assay was subjected to authorized clinical studies to test its ability to detect COVID-19 in its early phases. The results reveal that the colorimetric RT-LAMP assay is an efficient, robust and rapid assay for use as an in vitro diagnostic tool, displaying performance competitive with RT-PCR. | infectious diseases |
10.1101/2022.03.20.22272651 | Dynamics of anti-SARS-CoV-2 seroconversion in individual patients and at the population level | The immune response and specific antibody production in COVID-19 are among the key factors that determine both the prognosis for individual patients and the global perspective for controlling the pandemic. The so-called "dark figure", that is, the part of the population that has been infected but not registered by the health care system, makes it difficult to estimate herd immunity and to predict pandemic trajectories.
Here we present a follow-up study of population screening for hidden herd immunity to SARS-CoV-2 in individuals who had never been positively diagnosed with SARS-CoV-2; the first screening was in May 2021 and the follow-up in December 2021. We found that specific antibodies targeting SARS-CoV-2 detected in May as the "dark figure" could not be considered important 7 months later due to their significant drop. On the other hand, among participants who at the first screening were negative for anti-SARS-CoV-2 IgG, and who had never been diagnosed with SARS-CoV-2 infection nor vaccinated, 26% were found positive for anti-SARS-CoV-2 IgG. This can be attributed to the "dark figure" of the recent, fourth wave of the pandemic that occurred in Poland shortly before the study in December. Participants who were vaccinated between May and December demonstrated, however, higher levels of antibodies than those who had undergone mild or asymptomatic (thus unregistered) infection. Only 7% of these vaccinated participants demonstrated antibodies resulting from infection (anti-NCP). The highest levels of protection were observed in the group that had been infected with SARS-CoV-2 before May 2021 and had also been fully vaccinated between May and December.
These observations demonstrate that the hidden fraction of herd immunity is considerable; however, its potential to suppress the pandemic is limited, highlighting the key role of vaccinations. | infectious diseases |
10.1101/2022.03.21.22270313 | Mitochondrial DAMPs as mechanistic biomarkers of mucosal inflammation in Crohn's disease and Ulcerative Colitis | The MUSIC study is a multi-centre, longitudinal study set in the real-world IBD clinical setting to investigate and develop a new biomarker approach utilising mitochondrial DAMPs and circulating DNA in assessing mucosal healing in Crohn's disease. We present the study protocol for this ongoing project based in Scotland, UK, which was approved by the East of Scotland Research Ethics Service, Scotland, United Kingdom (Reference No. 19/ES/0087) on 17 September 2019 and registered in ClinicalTrials.gov as NCT04760964. | gastroenterology |
10.1101/2022.03.21.22272358 | Undiagnosed COVID-19 in households with a child with mitochondrial disease | BackgroundThe impact of the COVID-19 pandemic on medically fragile populations, who are at higher risk of severe illness and sequelae, has not been well characterized. Viral infection is a major cause of morbidity in children with mitochondrial disease (MtD), and the COVID-19 pandemic represents an opportunity to study this vulnerable population.
MethodsA convenience-sampling cross-sectional serology study was conducted (October 2020 to June 2021) in households (N = 20) containing a child with MtD (N = 22). Samples (N = 83) were collected in the home using a microsampling apparatus and shipped to investigators. Antibodies against SARS-CoV-2 nucleocapsid (IgG), spike protein (IgG, IgM, IgA), and receptor binding domain (IgG, IgM, IgA) were determined by enzyme-linked immunosorbent assay.
ResultsWhile only 4.8% of participants were clinically diagnosed for SARS-CoV-2 infection, 75.9% of study participants were seropositive for SARS-CoV-2 antibodies. Most samples were IgM positive for spike or RBD (70%), indicating that infection was recent. This translated to all 20 families showing evidence of infection in at least one household member. For the children with MtD, 91% had antibodies against SARS-CoV-2 and had not experienced any adverse outcomes at the time of assessment. For children with recent infections (IgM+ only), serologic data suggest household members as a source.
ConclusionsCOVID-19 was highly prevalent and undiagnosed in households with a child with MtD through the 2020-2021 winter wave of the pandemic. In this first major wave, children with MtD tolerated SARS-CoV-2 infection well, potentially due to household adherence to CDC recommendations for risk mitigation.
FundingThis study was funded by the Intramural Research Program of the National Institutes of Health (HG200381-03).
Clinical trial numberNCT04419870 | pediatrics |
10.1101/2022.03.21.22272480 | Altered microRNA expression in severe COVID-19: potential prognostic and pathophysiological role | BackgroundThe severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) pandemic is ongoing. The pathophysiology of SARS-CoV-2 infection is beginning to be elucidated but the role of microRNAs (miRNAs), small non-coding RNAs that regulate gene expression, remains incompletely understood. They play a role in the pathophysiology of viral infections with potential use as biomarkers. The objective of this study was to identify miRNAs as biomarkers of severe COVID-19 and to analyze their role in the pathophysiology of SARS-CoV-2 infection.
MethodsmiRNA expression was measured in nasopharyngeal swabs from 20 patients with severe COVID-19, 21 patients with non-severe COVID-19 and 20 controls. Promising miRNAs to differentiate non-severe from severe COVID-19 patients were identified by differential expression analysis and sparse Partial Least Squares-Discriminant Analysis (sPLS-DA). ROC analysis, target prediction, GO enrichment and pathway analysis were used to analyze the role and the pertinence of these miRNAs in severe COVID-19.
ResultsThe number of expressed miRNAs was lower in severe COVID-19 patients compared to non-severe COVID-19 patients and controls. Among the differentially expressed miRNAs between severe COVID-19 and controls, 5 miRNAs were also differentially expressed between severe and non-severe COVID-19. sPLS-DA analysis highlighted 8 miRNAs that allowed discrimination between severe and non-severe COVID-19 cases. Target and functional analysis revealed enrichment for genes involved in viral infections and the cellular response to infection, as well as one miRNA, hsa-miR-15b-5p, that targeted the SARS-CoV-2 RNA.
The comparison of results of differential expression analysis and discriminant analysis revealed three miRNAs, namely hsa-miR-125a-5p, hsa-miR-491-5p and hsa-miR-200b-3p. These discriminated severe from non-severe cases with areas under the curve ranging from 0.76 to 0.80.
ConclusionsOur analysis of miRNA expression in nasopharyngeal swabs revealed several miRNAs of interest to discriminate severe and non-severe COVID-19. These miRNAs represent promising biomarkers and possibly targets for antiviral or anti-inflammatory treatment strategies. | infectious diseases |
10.1101/2022.03.21.22272673 | Sequential appearance and isolation of a SARS-CoV-2 recombinant between two major SARS-CoV-2 variants in a chronically infected immunocompromised patient | Genetic recombination is a major evolutionary mechanism among RNA viruses, and it is common in coronaviruses, including those infecting humans. A few SARS-CoV-2 recombinants have been reported to date whose genomes harbored combinations of mutations from different mutants or variants, but only a single patient's sample was analyzed, and the virus was not isolated. Here, we report the gradual creation of a hybrid genome of the B.1.160 and Alpha variants in a lymphoma patient chronically infected for 14 months, and we isolated the recombinant virus. The hybrid genome was obtained by next-generation sequencing, and recombination sites were confirmed by PCR. It consisted of a parental B.1.160 backbone interspersed with two fragments, including the spike gene, from an Alpha variant. Analysis of seven sequential samples from the patient decoded the recombination steps, including the initial infection with a B.1.160 variant, then concurrent infection with this variant and an Alpha variant, the generation of hybrid genomes, and eventually the emergence of a predominant recombinant virus isolated at the end of the patient's follow-up. This case exemplifies the recombination process of SARS-CoV-2 in real life, and it calls for intensified genomic surveillance in patients coinfected with different SARS-CoV-2 variants, and more generally with several RNA viruses, as this may lead to the creation of new viruses. | infectious diseases |
10.1101/2022.03.21.22272266 | Protocol for an automated, pragmatic, embedded, adaptive randomised controlled trial: behavioural economics-informed mobile phone-based reminder messages to improve clinic attendance in a Botswanan schools-based vision screening programme | BackgroundClinic non-attendance rates are high across the African continent. Emerging evidence suggests that phone-based reminder messages could make a small but important contribution to reducing non-attendance. We used behavioural economics principles to develop an SMS and voice reminder message to improve attendance rates in a school-based eye screening programme in Botswana.
MethodsWe will test a new theory-informed SMS and voice reminder message in a national school-based eye screening programme in Botswana. The control will be the standard SMS message used to remind parents/guardians to bring their child for ophthalmic assessment. All messages will be sent three times. The primary outcome is attendance for ophthalmic assessment. We will use an automated adaptive approach, starting with a 1:1:1:1 allocation ratio. Patients will not be blinded.
DiscussionAs far as we are aware, only one other study has used behavioural economics to inform the development of reminder messages to be deployed in an African healthcare setting. Our study will use an adaptive trial design, embedded in a national screening programme. Our approach can be used to trial other forms of reminder message in the future.
Trial registrationISRCTN:96528723. Registered 5th January 2022, https://doi.org/10.1186/ISRCTN96528723
Administrative informationNote: the numbers in curly brackets in this protocol refer to SPIRIT checklist item numbers. The order of the items has been modified to group similar items (see http://www.equator-network.org/reporting-guidelines/spirit-2013-statement-defining-standard-protocol-items-for-clinical-trials/).
| public and global health |
10.1101/2022.03.22.22272632 | Psychometric properties of upper limb kinematics during functional tasks in children and adolescents with dyskinetic cerebral palsy | AimDyskinetic cerebral palsy (DCP) is characterised by involuntary movements, and the movement patterns of children with DCP have not been extensively studied during upper limb tasks. The aim of this study is to evaluate psychometric properties of upper limb kinematics in participants with DCP and typically developing (TD) participants.
MethodsTwenty TD participants and 20 participants with DCP performed three functional tasks: reaching forward, reach and grasp vertical and reach sideways during three-dimensional motion analysis. Joint angles at point of task achievement (PTA) and spatio-temporal parameters were evaluated within- and between-sessions using intra-class correlation coefficients (ICC) and standard error of measurement (SEM). Independent t-tests/Mann-Whitney U tests were used to compare all parameters between groups.
ResultsWithin-session ICC values ranged from 0.55 to 0.99 for joint angles at PTA and spatio-temporal parameters for both groups during all tasks. Within-session SEM values ranged from 1.1° to 11.7° for TD participants and from 1.9° to 13.0° for participants with DCP. Eight within-session repetitions resulted in the smallest change in ICC and SEM values for both groups. Within-session variability was higher for participants with DCP in comparison with the TD group for the majority of the joint angles and spatio-temporal parameters. Intrinsic variability over time was small for all angles and spatio-temporal parameters, whereas extrinsic variability was higher for elbow pro/supination and scapula angles. Between-group differences revealed lower shoulder adduction and higher elbow flexion, pronation and wrist flexion, as well as higher trajectory deviation and a lower maximal velocity for participants with DCP.
ConclusionThis is the first study to assess the psychometric properties of upper limb kinematics in children and adolescents with DCP, showing that they exhibit higher variability during task execution. However, their variable movement pattern can be reliably captured within- and between-sessions, if sufficient repetitions are taken into account within one session. | neurology |
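The SEM values quoted in this record are conventionally derived from the measurement standard deviation and the reliability coefficient via SEM = SD·√(1−ICC); a minimal sketch of that relationship, using illustrative numbers rather than values from the study:

```python
import math

def sem_from_icc(sd: float, icc: float) -> float:
    """Standard error of measurement from the observed SD and reliability (ICC)."""
    return sd * math.sqrt(1.0 - icc)

# Hypothetical example: a joint angle with SD 5.0 degrees and within-session ICC 0.90
sem = sem_from_icc(5.0, 0.90)

# Smallest detectable change at the 95% level, the usual companion statistic
sdc95 = 1.96 * math.sqrt(2.0) * sem
```

The SDC95 gives the threshold a within-session change must exceed before it can be distinguished from measurement error.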
10.1101/2022.03.22.22272507 | Application of Digital Twin and Heuristic Computer Reasoning to Workflow Management: Gastroenterology Outpatient Centers Study | The workflow in a large medical procedural suite is characterized by high variability of input and suboptimal throughput. Today, Electronic Health Record systems do not address the problem of workflow efficiency: there is still high frustration among medical staff, who lack real-time awareness and must react to events based on personal experience rather than anticipating them. In a medical procedural suite, there are many nonlinear coupling mechanisms between individual tasks that could go wrong, and it is therefore difficult for any individual to control the workflow in real-time or optimize it in the long run. We propose a systems approach by creating a digital twin of the procedural suite that assimilates Electronic Health Record data and supports the process of making rational, data-driven decisions to optimize the workflow on a continuous basis. In this paper, we focus on long-term improvements of gastroenterology outpatient centers as a prototype example and use six months of data acquisition in two different clinical sites to validate the artificial intelligence algorithms. | health systems and quality improvement |
10.1101/2022.03.22.22271707 | Vitamin D Supplements for Prevention of Covid-19 or other Acute Respiratory Infections: a Phase 3 Randomized Controlled Trial (CORONAVIT) | BACKGROUNDVitamin D metabolites support innate immune responses to severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) and other respiratory pathogens. Randomized controlled trials of vitamin D to prevent coronavirus disease 2019 (Covid-19) have not yet reported.
METHODSWe randomly assigned 6200 U.K. adults to receive an offer of a postal finger-prick 25-hydroxyvitamin D (25[OH]D) test with provision of a 6-month supply of higher-dose vitamin D (3200 IU/d, n=1550) or lower-dose vitamin D (800 IU/d, n=1550) to those with blood 25(OH)D concentration <75 nmol/L, vs. no offer of testing or supplementation (n=3100). The primary outcome was the proportion of participants experiencing at least one swab test- or doctor-confirmed acute respiratory infection (ARI) of any cause at six months. Secondary outcomes included incidence of swab test-confirmed Covid-19.
RESULTSOf 3100 participants offered testing, 2958 (95.4%) accepted, and 2690 (86.8%) had 25(OH)D <75 nmol/L and were sent vitamin D supplements (1356 higher-dose, 1334 lower-dose). 76 (5.0%) vs. 87 (5.7%) vs. 136 (4.6%) participants in higher-dose vs. lower-dose vs. no-offer groups experienced at least one ARI of any cause (odds ratio [OR] for higher-dose vs. no-offer 1.09, 95% CI 0.82-1.46; lower-dose vs. no-offer 1.26, 0.96-1.66). 45 (3.0%) vs. 55 (3.6%) vs. 78 (2.6%) participants in higher-dose vs. lower-dose vs. no-offer groups developed Covid-19 (OR for higher-dose vs. no-offer 1.13, 0.78-1.63; lower-dose vs. no-offer 1.39, 0.98-1.97).
CONCLUSIONSAmong adults with a high baseline prevalence of vitamin D insufficiency, implementation of a test-and-treat approach to vitamin D replacement did not reduce risk of all-cause ARI or Covid-19.
TRIAL REGISTRATIONClinicalTrials.gov no. NCT04579640 | infectious diseases |
10.1101/2022.03.21.22272279 | Accuracy of Rapid Antigen Testing across SARS-CoV-2 Variants | Variants of SARS-CoV-2 have mutations in the viral genome that may alter the accuracy of rapid diagnostic tests. We conducted analytical and clinical accuracy studies of two FDA-approved rapid antigen tests--SCoV-2 Ag Detect Rapid Test (InBios International, Seattle) and BinaxNOW COVID-19 Ag CARD (Abbott Laboratories, Chicago)--using three replication-competent variants or strains, including Omicron (B.1.1.529/BA.1), Delta (B.1.617.2), and a wild-type of SARS-CoV-2 (USA-WA1/2020). Overall, we found non-significant differences in the analytical limit of detection or clinical diagnostic accuracy of rapid antigen testing across SARS-CoV-2 variants. This study provides analytical and clinical performance data to demonstrate the preserved accuracy of rapid antigen testing across SARS-CoV-2 variants among symptomatic adults. | infectious diseases |
10.1101/2022.03.21.22272669 | Pre-exposure prophylaxis with Evusheld™ elicits limited neutralizing activity against the omicron variant in kidney transplant patients | The combination of cilgavimab-tixagevimab (Evusheld, AstraZeneca) became the mainstay for protecting transplant recipients with poor response to vaccination against the omicron variant. Serum neutralizing capacity against SARS-CoV-2 is positively associated with protection against severe forms of Covid-19.
Both anti-RBD IgG titers and neutralizing antibody titers against the omicron BA.1 variant were measured in serum samples collected from 63 adult kidney transplant recipients who received prophylactic injections of Evusheld. Patients who received prophylactic Ronapreve (casirivimab-imdevimab, n = 39) and those who were infected with SARS-CoV-2 during the fifth wave of the pandemic (n = 14) served as negative and positive controls, respectively.
After a median interval from injection of 29 days (interquartile range 29-33 days), only 9.5% of patients who received Evusheld were able to neutralize the omicron variant compared to 71% of patients who were infected with SARS-CoV-2 and 2.6% of those who received Ronapreve. Interestingly, convalescent patients displayed higher levels of neutralizing antibodies than those who received Evusheld™ (median: 2.3 log IC50, IQR: 1.5-2.7 versus 0.00 log IC50, IQR: 0-0.05; p<0.001). A high interindividual variability in anti-RBD IgG titers was observed after Evusheld (range: 262-7032 BAU/mL). This variability was largely explained by the patients' body mass index, which showed an inverse correlation with anti-RBD IgG titers.
These findings suggest that Evusheld given at a dose of 300 mg is not sufficient to elicit an anti-RBD titer that confers in vivo neutralizing activity, and they support recent FDA recommendations, derived from in vitro models, regarding the need to increase the dose of Evusheld. | infectious diseases |
10.1101/2022.03.22.22272789 | Environmental chemical-wide associations with immune biomarkers in the US: A cross-sectional analysis | Exposure to environmental chemicals influences immune system functions, and humans are exposed to a wide range of chemicals, termed the chemical exposome. Thus, a comprehensive analysis of associations between multiple chemical families and immune biomarkers is needed. In this study, we tested the associations between environmental chemicals and immune biomarkers. We analyzed the United States cross-sectional National Health and Nutrition Examination Survey (NHANES 1999-2018). Chemicals were measured in blood or urine (198 chemicals, 17 families). Immune biomarkers included percentages of lymphocytes, neutrophils, monocytes, basophils, and eosinophils, and counts of red blood cells, white blood cells, and mean corpuscular volume. We conducted survey-weighted, multivariable linear regressions of log2-transformed chemicals on immune measures, adjusted for age, sex, race/ethnicity, poverty-income ratio, waist circumference, cotinine concentration, creatinine for urinary chemicals, and survey cycle. We accounted for multiple comparisons using a false discovery rate (FDR). Among 45,528 adult participants, using survey weights, the mean age was 45.7 years, 51.4% were female, and 69.3% were Non-Hispanic White. There were 65 chemicals associated with white blood cell count. For example, a doubling in the concentration of blood lead was associated with a decrease of 61 white blood cells per {micro}L (95% CI: 23-99; FDR=0.005). 122 (61.6%) chemicals were associated with at least one of the eight immune biomarkers. Chemicals in the Metals family were associated with all eight immune measures. Concentrations of a wide variety of biomarkers of exposure to chemicals such as metals and smoking-related compounds, were highly associated with immune system biomarkers, with implications for immune function and toxicology.
This environmental chemical-wide association study identified chemicals from multiple families for further toxicological and epidemiological investigation. | epidemiology |
10.1101/2022.03.23.22272831 | The relationship between wild-type transthyretin amyloid load and ligamentum flavum thickness in lumbar stenosis patients | BackgroundOne key contributor to lumbar stenosis is thickening of the ligamentum flavum (LF), a process still poorly understood. Wild-type transthyretin amyloid (ATTRwt) has been found in the LF of patients undergoing decompression surgery, suggesting that amyloid may play a role. However, it is unclear whether within patients harboring ATTRwt, the amount of amyloid is associated with LF thickness.
MethodsFrom an initial cohort of 324 consecutive lumbar stenosis patients whose LF specimens from decompression surgery were sent for analysis (2018-2019), 33 patients met the following criteria: (1) Congo red-positive amyloid in the LF; (2) ATTRwt by mass spectrometry-based proteomics; and (3) an available preoperative MRI. Histological specimens were digitized, and amyloid load quantified through Trainable Weka Segmentation (TWS) machine learning. LF thicknesses were manually measured on axial T2-weighted preoperative MRI scans at each lumbar level, L1-S1. The sum of thicknesses at every lumbar LF level (L1-S1) equals "lumbar LF burden."
ResultsPatients had a mean age of 72.7 years (range 59-87), were mostly male (61%) and white (82%); and predominantly had surgery at L4-L5 levels (73%). Amyloid load was positively correlated with LF thickness (R=0.345, p=0.0492) at the levels of surgical decompression. Furthermore, amyloid load was positively correlated with lumbar LF burden (R=0.383, p=0.0279).
ConclusionsAmyloid load is positively correlated with LF thickness and lumbar LF burden across all lumbar levels, in a dose-dependent manner. Further studies are needed to validate these findings, uncover the underlying pathophysiology, and pave the way towards using therapies that slow LF thickening. | pathology |
10.1101/2022.03.22.22272038 | Brain iron and mental health symptoms in youth with and without prenatal alcohol exposure | Prenatal alcohol exposure (PAE) negatively affects brain development and increases the risk of poor mental health. We investigated if brain susceptibility, measuring iron, or volume were associated with internalizing or externalizing symptoms in youth with and without PAE. T1-weighted and quantitative susceptibility mapping (QSM) MRI scans were collected for 19 PAE and 40 unexposed participants aged 7.5-15 years. Magnetic susceptibility and volume of basal ganglia and limbic structures were extracted using FreeSurfer. Internalizing and externalizing problems were assessed using the Behavioural Assessment System for Children (BASC-2-PRS). Susceptibility in the nucleus accumbens was negatively associated with internalizing problems, while amygdala susceptibility was positively associated with internalizing problems across groups. In the PAE group, thalamus susceptibility was negatively associated with internalizing problems, and putamen susceptibility was positively associated with externalizing problems. Brain volume was not related to internalizing or externalizing symptoms. These findings highlight that brain iron is related to internalizing and externalizing symptoms differently in some brain regions for youth with and without PAE. Atypical iron levels (high or low) may indicate mental health problems across individuals, and iron in the thalamus and putamen may be particularly important for behaviour in individuals with PAE. | radiology and imaging |
10.1101/2022.03.22.22272696 | Longitudinal changes of white matter hyperintensities in sporadic small vessel disease: a systematic review and meta-analysis | Background and objectivesWhite matter hyperintensities (WMH) are frequent imaging features of small vessel disease (SVD) and related to poor clinical outcomes. WMH progression over time is well described, but regression was also noted recently, although the frequency and associated factors are unknown. This systematic review and meta-analysis aims to assess longitudinal intra-individual WMH volume changes in sporadic SVD.
MethodsFollowing PRISMA guidelines we searched EMBASE and MEDLINE for papers up to 28 January 2022 on WMH volume changes using MRI on ≥2 time-points in adults with sporadic SVD. We classified populations (healthy/community-dwelling, stroke, cognitive, other vascular risk factors, depression) based on study characteristics. We performed random-effects meta-analyses with Knapp-Hartung adjustment to determine mean WMH volume change (change in mL, % of intracranial volume [%ICV], or mL/year), 95%CI and prediction intervals (PI, limits of increase and decrease) using unadjusted data. Risk of Bias Assessment tool for Non-randomised Studies (RoBANS) was used to assess risk of bias.
ResultsForty papers, 10,932 participants, met the inclusion criteria. Mean WMH volume increased over time by: 1.74 mL (95% CI 1.23, 2.26; PI -1.24, 4.73 mL; 27 papers, N=7411, mean time interval 2.7 years, SD=1.65); 0.25%ICV (95% CI 0.14, 0.36; PI -0.06, 0.56; 6 papers, N=1071, mean time interval 3.5 years, SD =1.54); or 0.61 mL/year (95% CI 0.37, 0.85; PI -0.25, 1.46; 7 papers, N=2450). Additionally, 13 papers specifically mentioned and/or provided data on WMH regression, which occurred in asymptomatic, stroke, and cognitive presentations of SVD.
DiscussionNet mean WMH volume increases over time mask wide-ranging change (e.g. a mean increase of 1¾ mL ranging from a 1¼ mL decrease to a 4¾ mL increase), with regression documented explicitly in up to 1/3 of participants. More knowledge on underlying mechanisms, associated factors and clinical correlates is needed, as WMH regression could be an important intervention target. | neurology |
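The pooled WMH estimates in this record come from random-effects meta-analysis with Knapp-Hartung adjustment; a self-contained sketch of that computation, using a DerSimonian-Laird between-study variance and hypothetical study-level data (not the paper's):

```python
import math

def dl_tau2(y, v):
    """DerSimonian-Laird estimate of the between-study variance tau^2."""
    w = [1.0 / vi for vi in v]
    mu_fe = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)       # fixed-effect mean
    q = sum(wi * (yi - mu_fe) ** 2 for wi, yi in zip(w, y))     # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    return max(0.0, (q - (len(y) - 1)) / c)

def random_effects_hk(y, v):
    """Random-effects pooled mean with the Knapp-Hartung standard error."""
    tau2 = dl_tau2(y, v)
    w = [1.0 / (vi + tau2) for vi in v]
    mu = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    k = len(y)
    # Knapp-Hartung variance: weighted residual variance scaled by total weight
    s2 = sum(wi * (yi - mu) ** 2 for wi, yi in zip(w, y)) / ((k - 1) * sum(w))
    return mu, math.sqrt(s2), tau2

# Hypothetical WMH volume changes (mL) and within-study variances from 5 studies
y = [1.2, 2.0, 1.6, 2.5, 0.9]
v = [0.10, 0.25, 0.15, 0.30, 0.20]
mu, se, tau2 = random_effects_hk(y, v)
# A 95% CI would use the t distribution with k-1 df: mu +/- t * se
```

The Knapp-Hartung adjustment replaces the usual normal-based standard error with a t-based one, which widens intervals when few studies are pooled.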
10.1101/2022.03.22.22272449 | Frequency of preclinical Alzheimer's disease in Japanese clinical study settings: a meta-analysis | BackgroundMany earlier studies reporting the frequency of preclinical Alzheimer's disease (AD) among cognitively normal (CN) individuals have been based on clinical study cohorts from Western countries, and it has not been validated whether the frequency of preclinical AD in Asian cohorts differs from that in Western countries.
MethodsWe conducted a meta-analysis of earlier literature and original data from 4 Japanese cohort studies (i.e., AMED-Preclinical, Brain/Minds Aging Imaging Study, J-ADNI, and the A4 study screening conducted in Japan), in which we incorporated cognitively normal (CDR = 0) clinical study participants whose amyloid status was examined by amyloid-PET or CSF.
ResultsIn total, among the 658 Japanese CN participants reviewed from 10 different groups, 103 were amyloid positive, and the estimated overall frequency of preclinical AD (of any stage) was 17.2% (95% CI: 12.0-23.0%) with high statistical heterogeneity (I2 = 64%). In meta-regression, being a Japanese cohort was marginally significantly associated with a lower frequency of preclinical AD (p = 0.048).
ConclusionsOur study suggested that there is currently no robust evidence to support a lower frequency of preclinical AD in clinical study settings in Japan than in Western countries. | neurology |
10.1101/2022.03.23.22272742 | Increased Circulating miR-155 identifies a subtype of preeclamptic patients | Preeclampsia is a heterogeneous disorder which affects maternal and fetal outcomes. The current classifications of preeclampsia, such as "early" and "late" types and "mild" and "severe" forms, are too imprecise to delineate the pathophysiology of preeclampsia. Here we report that roughly one third of preeclampsia patients had high expression of maternal serum miR-155 in both a case-control study and a longitudinal study. Maternal serum miR-155 increased as early as 11-13+6 weeks of gestation. Patients with high serum miR-155 had more severe clinical symptoms, such as higher blood pressure and urine protein, and more adverse maternal and fetal outcomes. Moreover, these patients could be clustered as one group according to clinical manifestation by t-distributed stochastic neighbor embedding analysis. Therefore, these data suggest that preeclamptic patients with high maternal serum miR-155 could be viewed as a subtype of preeclampsia. | obstetrics and gynecology |
10.1101/2022.03.22.22272723 | Identification of deleterious neutrophil states and altered granulopoiesis in sepsis | Sepsis is a condition of high mortality arising from dysregulation of the host immune response. Gene expression studies have identified multiple immune endotypes but gaps remain in our understanding of the underlying biology and heterogeneity. We used single-cell multi-omics to profile 272,993 cells across 48 whole blood samples from 26 sepsis patients (9 with paired convalescent samples), 6 healthy controls and 7 post-surgery patients. We identified immature neutrophil populations specific to sepsis and demonstrated the immunosuppressive nature of sepsis neutrophils in vitro. An IL1R2+ neutrophil state was expanded in a transcriptomic sepsis endotype associated with increased early mortality (sepsis response signature 1, SRS1), together with enrichment of the response to IL-1 pathway in mature neutrophils, marking IL-1 out as a potential target for immunotherapy in SRS1 sepsis patients. We confirmed the expansion of immature neutrophils, specifically IL1R2+ neutrophils, in SRS1 in additional cohorts of patients (n = 906 RNA-sequencing samples, n = 41 CyTOF samples). Neutrophil changes persisted in convalescence, implicating disrupted granulopoiesis. Our findings establish a cellular immunological basis for transcriptomically defined sepsis endotypes and emphasise the relevance of granulopoietic dysfunction in sepsis, identifying opportunities for precision medicine approaches to the condition. | genetic and genomic medicine |
10.1101/2022.03.22.22272749 | Predicting Adverse Drug Effects: A Heterogeneous Graph Convolution Network with a Multi-layer Perceptron Approach | We exploit a heterogeneous graph convolution network (GCN) combined with a multi-layer perceptron (MLP), denoted GCNMLP, to explore potential side effects of drugs. By inferring relationships among similar drugs, our in silico approach shortens the time needed to uncover side effects unobserved in routine drug prescriptions. In addition, it highlights the relevance of exploring the mechanisms of well-documented drugs. Our results predict drug side effects with an area under the precision-recall curve (AUPR) of 0.941, substantially outperforming the non-negative matrix factorization (NMF) method with an AUPR of 0.600. Moreover, new side effects are obtained using the GCNMLP.
Author summaryWe propose the GCNMLP to study drug side effects. By making better drug side effect predictions, our approach improves personalized medicine. | health informatics |
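The AUPR metric used in this record to compare GCNMLP against NMF is the step-interpolated average precision over a ranked list of predictions; a minimal pure-Python sketch:

```python
def average_precision(labels, scores):
    """Area under the precision-recall curve (average precision, step interpolation).

    labels: 0/1 ground-truth indicators; scores: predicted confidence per item.
    """
    ranked = sorted(zip(scores, labels), key=lambda t: -t[0])  # highest score first
    n_pos = sum(labels)
    tp, ap, prev_recall = 0, 0.0, 0.0
    for rank, (_, label) in enumerate(ranked, start=1):
        if label == 1:
            tp += 1
            precision = tp / rank
            recall = tp / n_pos
            ap += (recall - prev_recall) * precision  # rectangle under the PR step
            prev_recall = recall
    return ap
```

For example, labels [1, 0, 1, 0] with scores [0.9, 0.8, 0.7, 0.6] give 1·0.5 + (2/3)·0.5 = 5/6, while a perfect ranking gives 1.0; a random ranker's AUPR approaches the positive-class prevalence, which is why AUPR suits the sparse drug-side-effect setting.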
10.1101/2022.03.23.22272825 | The contribution of malaria and sickle cell disease to anaemia among children aged 6 to 59 months in Nigeria: A secondary analysis using data from the 2018 Demographic and Health Survey | IntroductionAnaemia is a major cause of morbidity and mortality among children in sub-Saharan Africa. Anaemia has many aetiologies best addressed by different treatments, so regional studies of the aetiology of anaemia may be required.
MethodsWe analysed data from Nigeria's 2018 Demographic and Health Survey (DHS) to study predictors of anaemia among children aged 6-59 months. We computed the fraction of anaemia at different degrees of severity attributable to malaria and sickle cell disease (SCD) using a regression model adjusting for demographic and socioeconomic risk factors. We also estimated the contribution of the risk factors to haemoglobin concentration.
ResultsWe found that 63.7% (95% CI: 58.3-69.4) of semi-severe anaemia (<80 g/L) was attributable to malaria compared to 12.4% (95% CI: 11.1-13.7) of mild-to-severe (adjusted haemoglobin concentration <110 g/L) and 29.6% (95% CI: 29.6-31.8) of moderate-to-severe (<100 g/L) anaemia, and that SCD contributed 0.6% (95%CI: 0.4-0.9), 1.3% (95% CI: 1.0-1.7), and 7.3% (95%CI: 5.3-9.4) to mild-to-severe, moderate-to-severe, and semi-severe anaemia, respectively. Sickle trait was protective against anaemia and was associated with higher haemoglobin concentration compared to children with normal haemoglobin (HbAA) among malaria-positive but not malaria-negative children.
ConclusionThe approach used here offers a new tool to estimate the contribution of malaria to anaemia in many settings using widely available DHS data. The fraction of anaemia among young children in Nigeria attributable to malaria and SCD is higher at more severe levels of anaemia. Prevention of malaria and SCD and timely treatment of affected individuals would reduce cases of severe anaemia. | hematology |
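The attributable fractions in this record were estimated from an adjusted regression model; the classical unadjusted version of the same quantity is Levin's population attributable fraction, sketched here with hypothetical numbers (not the study's estimates):

```python
def levin_paf(p_exposed: float, rr: float) -> float:
    """Levin's population attributable fraction for an unadjusted risk ratio.

    p_exposed: prevalence of the exposure in the population; rr: risk ratio.
    """
    x = p_exposed * (rr - 1.0)
    return x / (1.0 + x)

# Hypothetical illustration: 40% of children malaria-positive and a risk ratio
# of 3.0 for semi-severe anaemia gives PAF = 0.8 / 1.8 (about 44%)
paf = levin_paf(0.40, 3.0)
```

Regression-based estimators, as used in the study, adjust for confounders before computing the fraction, so they generally differ from this unadjusted formula.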
10.1101/2022.03.22.22272793 | People with HIV receiving suppressive antiretroviral therapy show typical antibody durability after dual COVID-19 vaccination, and strong third dose responses | BackgroundLonger-term humoral responses to two-dose COVID-19 vaccines remain incompletely characterized in people living with HIV (PLWH), as do initial responses to a third dose.
MethodsWe measured antibodies against the SARS-CoV-2 spike protein receptor-binding domain, ACE2 displacement and viral neutralization against wild-type and Omicron strains up to six months following two-dose vaccination, and one month following the third dose, in 99 PLWH receiving suppressive antiretroviral therapy, and 152 controls.
ResultsThough humoral responses naturally decline following two-dose vaccination, we found no evidence of lower antibody concentrations nor faster rates of antibody decline in PLWH compared to controls after accounting for sociodemographic, health and vaccine-related factors. We also found no evidence of poorer viral neutralization in PLWH after two doses, nor evidence that a low nadir CD4+ T-cell count compromised responses. Post-third-dose humoral responses substantially exceeded post-second-dose levels, though anti-Omicron responses were consistently weaker than against wild-type.
Nevertheless, post-third-dose responses in PLWH were comparable to or higher than controls. An mRNA-1273 third dose was the strongest consistent correlate of higher post-third-dose responses.
ConclusionPLWH receiving suppressive antiretroviral therapy mount strong antibody responses after two- and three-dose COVID-19 vaccination. Results underscore the immune benefits of third doses in light of Omicron. | infectious diseases |
10.1101/2022.03.22.22272773 | Risk of SARS-CoV-2 transmission by fomites: a clinical observational study in highly infectious COVID-19 patients | BackgroundThe contribution of droplet-contaminated surfaces for virus transmission has been discussed controversially in the context of the current Severe Acute Respiratory Syndrome Coronavirus-2 (SARS-CoV-2) pandemic. Importantly, the risk of fomite-based transmission has not been systematically addressed.
MethodsWe initiated this single-center observational study to evaluate whether hospitalized COVID-19 patients can contaminate stainless steel carriers by coughing or intensive moistening with saliva and to assess the risk of SARS-CoV-2 transmission upon detection of viral loads and infectious virus in cell culture. Fifteen hospitalized patients with a high baseline viral load (Ct value ≤ 25) shortly after admission were included. We documented clinical and laboratory parameters and used patient samples to perform virus culture, quantitative PCR and virus sequencing.
ResultsNasopharyngeal and oropharyngeal swabs of all patients were positive for viral RNA on the day of the study. Infectious SARS-CoV-2 could be isolated from 6 patient swabs (46.2 %). While after coughing, no infectious virus could be recovered, intensive moistening with saliva resulted in successful viral recovery from steel carriers of 5 patients (38.5 %).
ConclusionsTransmission of infectious SARS-CoV-2 via fomites is possible upon extensive moistening, but unlikely to occur in real-life scenarios and from droplet-contaminated fomites. | infectious diseases |
10.1101/2022.03.22.22272758 | Mathematical modelling of COVID-19 transmission dynamics with vaccination: A case study in Ethiopia | In this paper, we consider a mathematical model of COVID-19 transmission with vaccination where the total population was subdivided into nine disjoint compartments, namely, Susceptible (S), Vaccinated with the first dose (V1), Vaccinated with the second dose (V2), Exposed (E), Asymptomatic infectious (Ia), Symptomatic infectious (Is), Quarantined (Q), Hospitalized (H) and Recovered (R). We computed a reproduction parameter, Rv, using the next generation matrix. Analytical and numerical approaches are used to investigate the results. In the analytical study of the model, we showed the local and global stability of the disease-free equilibrium, the existence of the endemic equilibrium and its local stability, positivity of the solution, the invariant region of the solution, transcritical bifurcation of equilibria, and conducted sensitivity analysis of the model. From these analyses, we found that the disease-free equilibrium is globally asymptotically stable for Rv < 1 and unstable for Rv > 1. A locally stable endemic equilibrium exists for Rv > 1, which shows persistence of the disease if the reproduction parameter is greater than unity. The model is fitted to cumulative daily infected cases and vaccinated individuals data of Ethiopia from May 01, 2021 to January 31, 2022. The unknown parameters are estimated using the least square method with the built-in MATLAB function lsqcurvefit. Finally, we performed different simulations using MATLAB and predicted the vaccine doses that will be administered at the end of two years. From the simulation results, we found that it is important to reduce the transmission rate and the infectivity factor of asymptomatic cases, and to increase the vaccination rate and quarantine rate, to control the disease transmission.
Predictions show that the vaccination rate has to be increased from the current rate to achieve a reasonable vaccination coverage in the next two years. | infectious diseases |
10.1101/2022.03.23.22272094 | Performance of Gazelle COVID-19 point-of-care test for detection of nucleocapsid antigen from SARS-CoV-2 | SARS-CoV-2 antigen assays offer simplicity and rapidity in diagnosing COVID-19. We assessed the clinical performance of the Gazelle COVID-19 test, a fluorescent lateral flow immunoassay with an accompanying Reader utilizing image-recognition software for detection of nucleocapsid antigen from SARS-CoV-2. We performed a prospective, operator-blinded, observational study at 2 point-of-care (POC) sites. Nasal swab specimens from symptomatic patients were tested with the Gazelle COVID-19 test and a real-time polymerase chain reaction (RT-PCR) assay. Overall, data from 1524 subjects were analyzed, and 133 were positive by RT-PCR. Mean (range) age of participants was 34.7 (2-94) years and 570 (37.4%) were female. The sensitivity and the specificity of the Gazelle COVID-19 test were 96.3% and 99.7%. The PPV of the Gazelle COVID-19 test was 97.0%, NPV 99.6%, and accuracy 99.4%. In POC settings, the Gazelle COVID-19 test had high diagnostic accuracy for detection of SARS-CoV-2 in nasal swab samples of symptomatic subjects suspected of COVID-19. | infectious diseases
10.1101/2022.03.23.22272701 | Explainable machine learning for real-time hypoglycaemia and hyperglycaemia prediction and personalised control recommendations | BackgroundThe occurrences of acute complications arising from hypoglycaemia and hyperglycaemia peak as young adults with type 1 diabetes (T1D) take control of their own care. Continuous glucose monitoring (CGM) devices provide real-time blood glucose readings enabling users to manage their control pro-actively. Machine learning algorithms can use CGM data to make ahead-of-time risk predictions and provide insight into an individual's longer-term control.
MethodsWe introduce explainable machine learning to make predictions of hypoglycaemia (<70mg/dL) and hyperglycaemia (>270mg/dL) 60 minutes ahead-of-time. We train our models using CGM data from 153 people living with T1D in the CITY survey totalling over 28000 days of usage, which we summarise into (short-term, medium-term, and long-term) blood glucose features along with demographic information. We use machine learning explanations (SHAP) to identify which features have been most important in predicting risk per user.
ResultsMachine learning models (XGBoost) show excellent performance at predicting hypoglycaemia (AUROC: 0.998) and hyperglycaemia (AUROC: 0.989) in comparison to a baseline heuristic and logistic regression model.
ConclusionsMaximising model performance for blood glucose risk prediction and management is crucial to reduce the burden of alarm-fatigue on CGM users. Machine learning enables more precise and timely predictions in comparison to baseline models. SHAP helps identify what about a CGM user's blood glucose control has led to predictions of risk, which can be used to reduce their long-term risk of complications. | endocrinology
10.1101/2022.03.22.22272775 | Risk of death following SARS-CoV-2 infection or COVID-19 vaccination in young people in England: a self-controlled case series study | ObjectivesTo assess whether there is a change in the incidence of cardiac and all-cause death in young people following COVID-19 vaccination or SARS-CoV-2 infection in unvaccinated individuals.
DesignSelf-controlled case series.
SettingNational, linked electronic health record data in England.
Study populationIndividuals aged 12-29 who had received at least one dose of COVID-19 vaccine and died within 12 weeks of vaccination, with the death occurring between 8 December 2020 and 2 February 2022 and registered by 16 February 2022; individuals aged 12-29 who died within 12 weeks of testing positive for SARS-CoV-2.
Main outcome measuresCardiac and all-cause deaths occurring within 12 weeks of vaccination or SARS-CoV-2 infection.
ResultsCompared to the baseline period, there was no evidence of a change in the incidence of cardiac death in the six weeks after vaccination, whether for each of weeks 1 to 6 or the whole six-week period. There was a decrease in the risk of all-cause death in the first week after vaccination and no change in each of weeks 2 to 6 after vaccination or whole six-week period after vaccination. Subgroup analyses by sex, age, vaccine type, and last dose also showed no change in the risk of death in the first six weeks after vaccination. There was a large increase in the incidence of cardiac and all-cause death in the overall risk period after SARS-CoV-2 infection among the unvaccinated.
ConclusionThere is no evidence of an association between COVID-19 vaccination and an increased risk of death in young people. By contrast, SARS-CoV-2 infection was associated with substantially higher risk of cardiac related death and all-cause death.
What is already known on this topicSeveral studies have highlighted the association between COVID-19 vaccination and the risk of myocarditis, myopericarditis, and other cardiac problems, especially in young people, but the associated risk of mortality is unclear. Since younger people have a lower risk of COVID-19 hospitalisation and mortality, the mortality risk associated with vaccination is potentially more important to them in balancing the risks and benefits of vaccination.
What this study addsAlthough there is a risk of myocarditis or myopericarditis with COVID-19 vaccination, there is no evidence of increased risk of cardiac or all-cause mortality following COVID-19 vaccination in young people aged 12 to 29. Given the increased risk of mortality following SARS-CoV-2 infection in this group, the risk-benefit analysis favours COVID-19 vaccination for this age group. | epidemiology
10.1101/2022.03.23.22272805 | Calcium Channel Blockers: clinical outcome associations with reported pharmacogenetics variants in 32,000 patients | BackgroundDihydropyridine calcium channel blockers (dCCB) (e.g. amlodipine) are widely used for treating hypertension. Pharmacogenetic variants impact treatment efficacy, yet evidence on clinical outcomes in routine primary care is limited. We aimed to estimate associations between reported pharmacogenetic variants and incident adverse events in a community-based cohort prescribed dCCB, including in high-risk subgroups.
MethodsWe analysed up to 32,360 UK Biobank European-ancestry participants prescribed dCCB in primary care electronic health records (from UK General Practices, 1990 to 2017). We investigated 23 genetic variants in 16 genes reported in PharmGKB, including CYP3A5 and RYR3. Outcomes were incident diagnosis of coronary heart disease (CHD), heart failure (HF), chronic kidney disease (CKD), edema, and switching antihypertensive medication. Secondary analysis in patients with history of heart disease was also performed.
ResultsParticipants were aged 40 to 79 years at first dihydropyridine prescription (treatment duration 1 month to 40 years, mean 5.9 years). Carriers of the rs877087 T allele in the ryanodine receptor 3 gene (RYR3) had increased risk of HF (Hazard Ratio 1.13: 95% Confidence Intervals 1.02 to 1.25, p=0.02). We estimated that if rs877087 T allele carriers were prescribed an alternative treatment, the incidence of HF in patients prescribed dCCB would reduce by 9.2% (95%CI 3.1 to 15.4). In patients with a history of heart disease when first prescribed dCCB (N=2,296), RYR3 rs877087 homozygotes had increased risk of new CHD or HF compared to the CC genotype (HR 1.25, 95%CI 1.09 to 1.44, p=0.002). Two variants increased the likelihood of switching to an alternate antihypertensive medication (rs10898815 in gene NUMA1 HR 1.16: 95%CI 1.07 to 1.27, p=0.0009; rs776746 in CYP3A5 HR 1.59: 95%CI 1.09 to 2.32, p=0.02). rs776746 in CYP3A5 also increased CKD risk (HR 2.12, p=0.002). The remaining previously reported variants were not strongly or consistently associated with the studied clinical outcomes.
ConclusionsIn this large primary care cohort, patients with common genetic variants in NUMA1, CYP3A5 and RYR3 had increased adverse clinical outcomes. Work is needed to establish whether outcomes of dCCB prescribing could be improved by prior knowledge of such pharmacogenetics variants supported by clinical evidence of association with adverse events. | epidemiology |
10.1101/2022.03.21.22272665 | Existential suffering as a motive for assisted suicide: difficulties, acceptability, management and roles from the perspectives of Swiss professionals | BackgroundExistential suffering is often a part of the requests for assisted suicide (AS). Its definitions have gained in clarity recently and refer to a distress arising from an inner realization that life has lost its meaning. There is however a lack of consensus on how to manage existential suffering, especially in a country where AS is legal and little is known about the difficulties faced by professionals confronted with these situations.
ObjectivesTo explore the perspectives of Swiss professionals involved in end-of-life care and assisted suicide on the management of existential suffering when it is part of AS requests, taking into account the question of roles, as well as on the difficulties they encounter along the way and their views on the acceptability of existential suffering as a motive for AS.
MethodsA qualitative study based on face-to-face interviews was performed among twenty-six participants from the fields of palliative and primary care as well as from EXIT right-to-die organization. A semi-structured interview guide exploring four themes was used. Elements from the grounded theory approach were applied.
ResultsAlmost all participants reported experiencing difficulties when facing existential suffering. Two-thirds considered existential suffering a justifiable reason for requesting AS. Concerning the management of existential suffering, participants referred to being present, showing respect, exploring the suffering, giving meaning, working together, providing psychological and spiritual support, relieving physical symptoms, and palliative sedation.
ConclusionThis study offers a unique opportunity to reflect on desirable responses to existential suffering when it is part of AS requests. Existential suffering is plural and certainly implies a multiplicity of responses as well. These situations remain, however, difficult and controversial according to Swiss professionals. Clinicians' education should better address these issues and give them the tools to take care of patients with existential suffering. | palliative medicine
10.1101/2022.03.22.22272786 | Patients' Experiences of Using Smartphone Apps to Manage their Gestational Diabetes Diagnosis: A Systematic Review - protocol (version 1) | BackgroundGestational diabetes is a strong predictor of type 2 diabetes onset. However, women who make healthy dietary and lifestyle choices can significantly reduce their risk. There is a need to understand the factors facilitating or preventing women from using smartphone apps that may encourage these behaviours.
MethodsWe plan to conduct a systematic review of patient experiences when using mobile health applications to manage gestational diabetes. We will include primary studies of qualitative data around patient experiences of using these applications, as well as the barriers and facilitators to using technologies. We will search the following electronic databases: Medline, Embase, PsycINFO, Global Health, Web of Science, Cochrane Central Register of Controlled Trials (CENTRAL), AMED and CINAHL and manually search the reference lists of included studies. We will include primary studies involving direct user interviews and where the results have been analysed using qualitative methods. To assess the quality of included studies, we will use the CASP qualitative research checklist.
Expected resultsWe intend to use summary tables to report the characteristics of the study population, the overarching themes that emerged, and recommendations for research and practice. | primary care research |
10.1101/2022.03.23.22272826 | Statistical and functional convergence of common and rare variant risk for autism spectrum disorders at chromosome 16p | The dominant human genetics paradigm for converting association to mechanism ("variant-to-function") involves iteratively mapping individual associations to specific SNPs and to the proximal genes through which they act. In contrast, here we demonstrate the feasibility of extracting biological insight from a very large (>10Mb) region of the genome, and leverage this approach to derive insight into autism spectrum disorder (ASD). Using a novel statistical framework applied in an unbiased scan of the genome, we identified the 33Mb p-arm of chromosome 16 (16p) as harboring the greatest excess of common polygenic risk for ASD. This region includes the recurrent 16p11.2 copy number variant (CNV) - one of the largest single genetic risk factors for ASD, and whose pathogenic mechanisms are undefined. Analysis of bulk and single-cell RNA-sequencing data from post-mortem human brain samples revealed that common polygenic risk for ASD within 16p associated with decreased average expression of genes throughout this 33-Mb region. Similarly, analysis of isogenic neuronal cell lines with CRISPR/Cas9-mediated deletion of 16p11.2 revealed that the deletion also associated with depressed average gene expression across 16p. The effects of the rare deletion and diffuse common variation were correlated at the level of individual genes. Finally, analysis of chromatin contact patterns by Hi-C revealed patterns which may explain this transcriptional convergence, including elevated contact throughout 16p, and between 16p11.2 and a distal region on 16p (Mb 0-5.2) which showed the greatest gene expression changes in both the common and rare variant analyses. 
These results demonstrate that elevated 3D chromatin contact may coordinate genetic and transcriptional disease liability across large genomic regions, exemplifying a novel approach for extracting biological insight from genetic association data. As applied to ASD, our analyses highlight the 33Mb p-arm of chromosome 16 as a novel locus for ASD liability and provide insight into disease liability originating from the 16p11.2 CNV. | psychiatry and clinical psychology |
10.1101/2022.03.22.22272504 | A hypothesis-free data science approach to identifying novel patterns of clinical interest in clinicians' treatment preferences: clusters of high Pericyazine and Promazine use in England | BackgroundData analysis can be used to identify signals suggestive of variation in treatment choice or clinical outcome. Analyses to date have generally focused on an hypothesis-driven approach.
MethodsHere we report an innovative hypothesis-blind approach (calculating chemical-class proportions for every chemical substance prescribed in each Clinical Commissioning Group and ranking chemicals by (a) their kurtosis and (b) a ratio between inter-centile differences) applied to England's national prescribing data, and demonstrate how this identified unusual prescribing of two antipsychotics.
OutcomesWe identified that, while promazine and pericyazine are barely used by most clinicians, they make up a substantial proportion of all antipsychotic prescribing in two small geographic regions in England.
InterpretationData-driven approaches can be effective at identifying unusual clinical choices. More widespread adoption of such approaches, combined with clinician and decision-maker engagement could lead to better optimised patient care.
FundingNIHR Biomedical Research Centre, Oxford; Health Foundation; National Institute for Health Research (NIHR) School of Primary Care Research and Research for Patient Benefit
Research in contextEvidence before this studyIdentifying variation in clinical activity typically employs a traditional approach whereby measures are prospectively defined, and adherence is then assessed in data. We are aware of no prior work using data science techniques hypothesis-blind to systematically identify outliers for any given treatment choice or clinical outcome (numerators) as a proportion of automatically generated denominators.
Added value of this studyHere we report an innovative hypothesis-blind approach applied to England's national prescribing data, to identify chemical substances with substantially varying prescribing patterns between organisations. As illustrative examples we show that promazine and pericyazine, while rarely used by most clinicians, made up a substantial proportion of all antipsychotic prescribing in two small geographic regions in England.
Implications of all the available evidenceThe choice of antipsychotics between English regions could be further investigated using qualitative methods to explore the implications for patient care. More broadly, data-driven approaches can be effective at identifying unusual clinical choices. More widespread adoption of such approaches, combined with clinician and decision-maker engagement could lead to better optimised patient care. | psychiatry and clinical psychology |
10.1101/2022.03.22.22272790 | The World Health Organization's Disease Outbreak News: a retrospective database | The World Health Organization (WHO) notifies the global community about disease outbreaks through the Disease Outbreak News (DON). These online reports tell important stories about both outbreaks themselves and the high-level decision making that governs information sharing during public health emergencies. However, they have been used only minimally in global health scholarship to date. Here, we collate all 2,789 of these reports from their first use through the start of the Covid-19 pandemic (January 1996 to December 2019), and develop an annotated database of the subjective and often inconsistent information they contain. We find that these reports are dominated by a mix of persistent worldwide threats (particularly influenza and cholera) and persistent epidemics (like Ebola virus disease in Africa or MERS-CoV in the Middle East), but also document important periods in history like the anthrax bioterrorist attacks at the turn of the century, the spread of chikungunya and Zika virus to the Americas, or even recent lapses in progress towards polio elimination. We present three simple vignettes that show how researchers can use these data to answer both qualitative and quantitative questions about global outbreak dynamics and public health response. However, we also find that the retrospective value of these reports is visibly limited by inconsistent reporting (e.g., of disease names, case totals, mortality, and actions taken to curtail spread). We conclude that sharing a transparent rubric for which outbreaks are considered reportable, and adopting more standardized formats for sharing epidemiological metadata, might help make the DON more useful to researchers and policymakers. | public and global health |
10.1101/2022.03.22.22272730 | Research on the Emotions of Uninfected People during the COVID-19 Epidemic in China | BackgroundThe negative emotions induced by the ongoing coronavirus disease 2019 (COVID-19) epidemic are affecting people's health. In order to identify emotional problems and promote early intervention to reduce the risk of disease, we studied the emotional states of Chinese people during the epidemic.
MethodWe adopted the Automated Neuropsychological Assessment Metrics mood scale and prepared an online questionnaire. Then, we conducted an exploratory factor analysis of the effective responses of 567 participants from 31 provinces and cities in China. Finally, we analyzed the characteristics of the distribution of different types of emotions and compared them via several statistical methods.
ResultsThe original scale was modified to have six dimensions that yielded reliable internal consistency values ranging from 0.898 to 0.965 and explained 74.96% of the total variance. We found that a total of 33.9% of respondents felt negative emotions more strongly, were less happy and had less energy than other respondents (p<0.001). People with these traits had relatively serious emotional problems and were typically over 60 years old, doctoral degree holders, enterprise personnel and residents in an outbreak area.
ConclusionThirty-three percent of people without COVID-19 had emotional problems. Psychotherapy should be provided as early as possible for people with emotional problems caused by the epidemic, and the modified scale could be used to survey the public's mood during public health events to detect problems and facilitate early intervention. | public and global health
10.1101/2022.03.22.22272744 | A comparison of paediatric hypertension clinical practice guidelines and their ability to predict adult hypertension in an African birth cohort | It remains unclear which paediatric hypertension clinical practice guideline (CPG) should be applied in an African population. We therefore aimed to compare three commonly used CPGs (2017 AAP, 2016 ESH and 2004 Fourth Report) developed in high-income countries for use in South African children at four paediatric ages (children: 5yrs and 8yrs; adolescents: 13yrs and 17yrs) to determine which best predicts elevated blood pressure (EBP) in young adulthood (22yrs and 28yrs). Moreover, the sensitivity, specificity, positive predictive value (PPV) and negative predictive value (NPV) for each paediatric CPG were calculated. The 2017 AAP definition identified more children and adolescents with hypertension than the 2004 Fourth Report and 2016 ESH guidelines. In computed hazard ratios, from ages 8yrs to 17yrs, all three paediatric CPGs significantly predicted the risk of EBP in young adulthood (p≤0.008). However, sensitivity to predict EBP at age 22yrs for all CPGs was generally low (17.0% - 33.0%) with higher specificity (87.4% - 93.1%). Sensitivity increased at age 28yrs (51.4% - 70.1%), while specificity decreased (52.8% - 65.1%). Both PPV and NPV at both adult age points varied widely (17.9% - 79.9% and 29.3% - 92.5%, respectively). The performance of these paediatric CPGs in terms of AUC was not optimal at either adult age point; however, the AAP definition at 17yrs met an acceptable level of performance (AUC = 0.71). Our results highlight the need for more research to examine whether an African-specific CPG would better identify high-risk children to minimise their trajectory towards adult hypertension. | public and global health
10.1101/2022.03.23.22272809 | Weighting of risk factors for low birth weight: A linked routine data cohort study in Wales, UK. | BackgroundGlobally, 20 million children are born each year with a birth weight below 2,500 grams, which is considered low birth weight (LBW). This study investigates the contribution of modifiable risk factors to inform activities that reduce the rates of LBW in Wales.
MethodThe cohort (N = 693,377) was comprised of children born between 1st January 1998 and 31st December 2018 in Wales. A multivariable logistic regression model and a predictive model using decision tree were used to investigate the associations between the risk factors and LBW.
ResultsThe study found that non-singleton children had the highest risk of LBW (adjusted odds ratio 21.74, 95% confidence interval (CI) 21.09 - 22.40), followed by pregnancy interval less than one year (2.92, 95% CI 2.70 - 3.15), maternal diabetes (2.03, 95% CI 1.81 - 2.28), maternal hospital admission for anaemia (1.26, 95% CI 1.16 - 1.36), depression (1.58, 95% CI 1.43 - 1.75), serious mental illness (1.46, 95% CI 1.04 - 2.05), anxiety (1.22, 95% CI 1.08 - 1.38) and use of anti-depressant medication during pregnancy (1.92, 95% CI 1.20 - 3.07). Additional maternal risk factors include smoking (1.80, 95% CI 1.76 - 1.84), alcohol-related hospital admission (1.60, 95% CI 1.30 - 1.97), substance misuse (1.35, 95% CI 1.29 - 1.41), living in areas of high deprivation, and evidence of domestic abuse.
ConclusionThis work suggests that to address LBW, measures need to focus on improving maternal health, addressing pre-term births, promoting awareness of sufficient pregnancy intervals, and ensuring adequate support and resources for mothers' mental health and wellbeing.
What is already known on this topicEach year 6.9% of live births in the UK are identified as low birth weight (LBW). LBW children are at risk of poor cognitive development, which is associated with developmental disabilities and poor academic achievement in later life. Progress in reducing LBW prevalence in high-income regions (including Europe) is not sufficient to meet the World Health Organization (WHO) LBW target by 2025.
What this study addsThis work built an e-cohort using data linkage across multiple routinely collected administrative datasets to investigate the risk factors for LBW in the population of Wales. Non-singleton children had a 22 times higher risk of LBW than singleton children. Findings suggest that the most important factors to address to reduce the risk of LBW are multiple births, underlying maternal physical (diabetes, anaemia) and mental health, maternal smoking and substance use (alcohol/drugs), adequate pregnancy intervals, high deprivation, and domestic abuse during pregnancy.
How this study might affect research, practice, or policyThese findings suggest that to address LBW, adequate support is needed for mothers and their families when they are planning for and during pregnancy. Multiple agencies (doctors, midwives, security departments) can liaise with one another and take the necessary mitigating action to reduce the prevalence of LBW. | public and global health
10.1101/2022.03.23.22272796 | Hypertension perspectives and health behaviors of African youth: The effect of health education and employment | In South Africa between 1998 and 2016, hypertension rates in young adults (15-34 years) more than doubled, calling for preventive interventions. However, with many youth unemployed, young adults struggle to prioritize health or implement healthy behaviors. We conducted six focus group discussions comparing hypertension-related beliefs and behaviors between NEET youth (n=20; not in employment, education or training) and previously NEET youth on a health-focused learnership (n=20). While all youth viewed hypertension as life-threatening, leading to cardiovascular disease or death, especially if left untreated, only youth undertaking health education felt empowered to implement healthy behaviors for disease prevention. In contrast, NEET youth felt hypertension was inevitable and described negative experiences at clinics and fear of lifelong medication use if diagnosed as reasons not to be screened. Our results suggest that engaging NEET youth in culturally appropriate health education programs can motivate preventive health behavior for chronic diseases such as hypertension. | public and global health
10.1101/2022.03.22.22272705 | Evaluation of different diffusion-weighted image techniques for head and neck radiation treatment: phantom and volunteer studies | PurposeTo quantitatively compare and assess the geometric distortion and apparent diffusion coefficient (ADC) accuracy of conventional single-shot EPI (SSEPI)-DWI, readout segmentation of long variable echo-trains (RESOLVE)-DWI, and BLADE-DWI techniques in both phantom and volunteer imaging studies.
MethodsPhantom measurements were obtained using the QIBA DWI phantom at 0°C and at ambient temperature (~17°C). DW images were acquired using SSEPI-DWI, RESOLVE-DWI, and BLADE-DWI. Image geometric distortion factors (compression and dilation, shear distortion) and image shift factors were measured and compared with computed tomography images. ADC and signal-to-noise ratio (SNR) values were measured for the various DWI techniques. Images were also obtained from three healthy volunteers and three oropharynx cancer patients. Geometric distortion parameters and ADC values were analyzed in the following head and neck regions of interest (ROI): salivary glands, tonsils, and primary tumors (patients only). The metric evaluation included Dice similarity coefficient and mean Hausdorff distance. T2-weighted images were used as a reference to evaluate geometric distortion.
ResultsIn the phantom experiment, the vials were prominently distorted on the SSEPI-DWI and RESOLVE-DWI images but were less distorted on the BLADE-DWI image, as determined by the overall effects of the image geometric distortion factors. The paired t-test showed no significant difference (0.15 < p < 0.88) in ADC values among the three DWI techniques; however, BLADE-DWI led to a lower SNR in the vials. In the volunteer and patient studies, ROI-based overlap metric analysis showed that the salivary glands and gross tumor volumes were less distorted with BLADE-DWI than with EPI-based DWI, as determined by the Wilcoxon paired signed-rank test.
ConclusionBLADE-DWI demonstrated excellent geometric accuracy, with similar quantitative ADC values to those of EPI-based DW-MRI, thus potentially making this technique suitable for target volume delineation and functional assessment for head and neck radiation treatment in the future. | radiology and imaging |
10.1101/2022.03.22.22272736 | The gut microbiome is a significant risk factor for future chronic lung disease | BackgroundThe gut-lung axis is generally recognized, but there are few large studies of the gut microbiome and incident respiratory disease in adults.
ObjectivesTo investigate the associations between gut microbiome and respiratory disease and to construct predictive models from baseline gut microbiome profiles for incident asthma or chronic obstructive pulmonary disease (COPD).
MethodsShallow metagenomic sequencing was performed for stool samples from a prospective, population-based cohort (FINRISK02; N=7,115 adults) with linked national administrative health register derived classifications for incident asthma and COPD up to 15 years after baseline. Generalised linear models and Cox regressions were utilised to assess associations of microbial taxa and diversity with disease occurrence. Predictive models were constructed using machine learning with extreme gradient boosting. Models considered taxa abundances individually and in combination with other risk factors, including sex, age, body mass index and smoking status.
ResultsA total of 695 and 392 significant microbial associations at different taxonomic levels were found with incident asthma and COPD, respectively. Gradient boosting decision trees of the baseline gut microbiome predicted incident asthma and COPD with mean areas under the curve of 0.608 and 0.780, respectively. The baseline gut microbiome had C-indices of 0.623 for incident asthma and 0.817 for incident COPD, which were more predictive than other conventional risk factors. The integration of gut microbiome and conventional risk factors further improved prediction capacities. Subgroup analyses indicated that the gut microbiome was significantly associated with incident COPD in both current smokers and non-smokers, as well as in individuals who reported never smoking.
ConclusionsThe gut microbiome is a significant risk factor for incident asthma and incident COPD and is largely independent of conventional risk factors. | respiratory medicine |
10.1101/2022.03.23.22272699 | Dampened inflammatory signalling and myeloid-derived suppressor-like cell accumulation reduces circulating monocytic HLA-DR density and associates with malignancy risk in long-term renal transplant recipients | BackgroundMalignancy is a major cause of morbidity and mortality in transplant recipients. Identification of those at highest risk could facilitate pre-emptive intervention such as reduction of immunosuppression. Reduced circulating monocytic HLA-DR density is a marker of immune depression in the general population and associates with poorer outcome in critical illness. It has recently been used as a safety marker in adoptive cell therapy trials in renal transplantation. Despite its potential as a marker of dampened immune responses, factors that impact upon monocytic HLA-DR density and the long-term clinical sequelae of this have not been assessed in transplant recipients.
MethodsA cohort study of stable long-term renal transplant recipients was undertaken. Serial circulating monocytic HLA-DR density and other leucocyte populations were quantified by flow cytometry. Gene expression of monocytes was performed using the Nanostring nCounter platform, and 13-plex cytokine bead array used to quantify serum concentrations. The primary outcome was malignancy development during one-year follow-up. Risk of malignancy was calculated by univariate and multivariate proportionate hazards modelling with and without adjustment for competing risks.
ResultsMonocytic HLA-DR density was stable in long-term renal transplant recipients (n=135) and similar to that of non-immunosuppressed controls (n=29), though it was suppressed in recipients receiving prednisolone. Decreased monocytic HLA-DR density (mHLA-DRd) was associated with accumulation of CD14+CD11b+CD33+HLA-DRlo monocytic myeloid-derived suppressor-like cells. Pathway analysis revealed downregulation of pathways relating to cytokine and chemokine signalling in monocytes with low HLA-DR density; however, serum concentrations of major cytokines did not differ between these groups. There was an independent increase in malignancy risk during follow-up with decreased HLA-DR density.
ConclusionsDampened chemokine and cytokine signalling drives a stable reduction in monocytic HLA-DR density in long-term transplant recipients and associates with subsequent malignancy risk. This may function as a novel marker of excess immunosuppression. Further study is needed to understand the mechanism behind this association. | transplantation |
10.1101/2022.03.23.22272801 | Computer-assisted analysis of polysomnographic recordings improves inter-scorer associated agreement and scoring times | Study ObjectivesTo investigate inter-scorer agreement and scoring time differences associated with visual and computer-assisted analysis of polysomnographic (PSG) recordings.
MethodsA group of 12 expert scorers reviewed 5 PSGs that were independently selected in the context of each of the following tasks: (i) sleep staging, (ii) detection of EEG arousals, (iii) analysis of the respiratory activity, and (iv) identification of leg movements. All scorers independently reviewed the same recordings, resulting in 20 scoring exercises from an equal number of different subjects. The procedure was repeated, separately, using the classical visual manual approach and a computer-assisted (semi-automatic) procedure. The resulting inter-scorer agreement and scoring times were examined and compared between the two methods.
ResultsComputer-assisted sleep scoring showed a consistent and statistically significant reduction in the time required for the completion of each of the PSG scoring tasks. Gain factors ranged from 1.26 (EEG arousals) to 2.41 (limb movements). Inter-scorer kappa agreement was also consistently increased with the use of supervised semi-automatic scoring. Specifically, agreement increased from K=0.76 to K=0.80 (sleep stages), K=0.72 to K=0.91 (limb movements), K=0.55 to K=0.66 (respiratory activity), and K=0.58 to K=0.65 (EEG arousals). Inter-scorer agreement on the examined set of diagnostic indices also showed a trend toward higher Intraclass Correlation Coefficient scores when using the semi-automatic scoring approach.
ConclusionsComputer-assisted analysis can improve the inter-scorer agreement and scoring times associated with the review of PSG studies, resulting in higher efficiency and overall quality in the diagnosis of sleep disorders. | neurology |
10.1101/2022.03.23.22272823 | Fat tails, fat earnings, fat mistakes: On the need to disclose distribution parameters of qEEG databases | Neurometry (a.k.a. quantitative EEG or qEEG) is a popular method to assess clinically relevant abnormalities in the electroencephalogram. Neurometry is based on norm values for the distribution of specific EEG parameters and believed to show good psychometric properties such as test-retest reliability. Many psychometric properties only hold under the Gaussian distribution and become problematic when distributions are fat-tailed. EEG signals are typically fat-tailed and do not show fast convergence to a Gaussian distribution. To circumvent this property of EEG, log-transformations have frequently, but not always been employed. In Monte Carlo simulations, we investigated the impact of fat-tails (i.e. deviations from Gaussian) on the cut-off criteria and changeability of what in neurometry is termed "abnormal EEG". Even slight deviations from the Gaussian distribution as measured by skewness and kurtosis lead to large inflation in the number of false positive qEEG findings. The more stringent the cutoff value adopted, the larger the inflation. For these reasons, we argue that distribution properties of qEEG databases should be disclosed in much more detail to avoid questionable research practices and promote diagnostic transparency. Moreover, "abnormal EEG" seems to recover spontaneously at rates not compatible with the alleged test-retest reliability of qEEG. Alternative methods should be employed to determine cut-off values for diagnostics purposes, since a large number of false positive results emerge even when slight deviations from the Gaussian distribution are present. We provide recommendations for the improvement of psychometric properties of existing qEEG databases. | neurology |
10.1101/2022.03.22.22272678 | Age-stratified normative cognitive scores in urban and rural Maharashtra -- a community-based study | Background and objectivesIndian ethnic and educational diversities necessitate obtaining normative cognitive data in different populations. We aimed to evaluate cognitive scores using a Marathi translation of the Kolkata Cognitive Battery (KCB), and to study the association of KCB components with depression and socio-demographic variables.
MethodologyWe studied 2651 individuals aged ≥40 years, without pre-existing neuropsychiatric conditions, from urban (Mumbai) and rural districts of Maharashtra. For each component of KCB, the lowest 10th percentile score was used to define cognitive impairment.
ResultsWe studied 1435 (54%) rural and 1216 (46%) urban residents equally divided by gender (1316 women, 1335 men), average age 54 years. KCB scores were significantly lower with female sex, older age, illiteracy and depression. The largest effect sizes attributable to these factors were in the domains of calculation (gender), visuo-constructional ability (VCA; rurality) and verbal fluency (VF; depression). Scores remained significantly lower in rural residents after controlling for age, sex and education, particularly for VCA, immediate recall and calculation.
ConclusionThis Marathi KCB, having been validated on large urban as well as rural samples, may be used to study cognition in Marathi-speaking populations with appropriate cut-offs tailored to the degree of urbanization of the population. | neurology |
10.1101/2022.03.22.22272780 | Systematic Review and Meta-analysis of Mental Health Impact on BAME Populations with Preterm Birth | BackgroundPreterm birth (PTB) is one of the main causes of neonatal deaths globally, with approximately 15 million infants born preterm. The mental health (MH) impact on mothers experiencing a PTB is therefore important, especially within Black, Asian and Minority Ethnic (BAME) populations.
AimThe aims of the study were to determine the prevalence of MH conditions among BAME women with PTB as well as the MH assessments used to characterise the MH outcomes.
MethodsA systematic methodology was developed and published as a protocol in PROSPERO (CRD42020210863). Multiple databases were used to extract relevant data. I2 and Egger's tests were used to detect heterogeneity and publication bias. A trim-and-fill method was used to demonstrate the influence of publication bias and the credibility of conclusions.
ResultsThirty-nine studies met the eligibility criteria from a possible 3526. The prevalence rates of depression among PTB-BAME mothers were significantly higher than among full-term mothers, with a standardised mean difference (SMD) of 1.5 and a 95% confidence interval (CI) of 29-74%. The subgroup analysis indicated depressive symptoms to be time-sensitive. Women within the very PTB category demonstrated a significantly higher prevalence of depression than those categorised as non-very PTB. The prevalence rates of anxiety and stress among PTB-BAME mothers were significantly higher than among full-term mothers (OR of 88% and 60%, with CIs of 42%-149% and 24%-106%, respectively).
ConclusionBAME women with PTB suffer from MH conditions. Many studies did not report BAME-population-specific MH outcomes. Therefore, the impact of PTB is not accurately represented in this population, which could negatively influence the quality of maternity services.
Core TipThis study demonstrates the mental health impact of preterm birth among Black, Asian and Minority Ethnic women. There is minimal research available at present on this subject and the potential disease sequelae. | obstetrics and gynecology |
10.1101/2022.03.22.22272635 | Patient-Level Clinical Expertise Enhances Prostate Cancer Recurrence Predictions with Machine Learning | With rising access to electronic health record data, application of artificial intelligence to create clinical risk prediction models has grown. A key component in designing these models is feature generation. Methods used to generate features differ in the degree of clinical expertise they deploy (from minimal to population-level to patient-level), and subsequently the extent to which they can extract reliable signals and be automated. In this work, we develop a new process that defines how to systematically implement patient-level clinician feature generation (CFG), which leverages clinical expertise to define concepts relevant to the outcome variable, identify each concept's associated features, and finally extract most features on a per-patient level by manual chart review. We subsequently apply this method to identifying and extracting patient-level features predictive of cancer recurrence from progress notes for a cohort of prostate cancer patients. We evaluate the performance of the CFG process against an automated feature generation (AFG) process via natural language processing techniques. The machine learning outcome prediction model leveraging the CFG process has a mean AUC-ROC of 0.80, in comparison to the AFG model that has a mean AUC-ROC of 0.74. This relationship remains qualitatively unchanged throughout extensive sensitivity analyses. Our analyses illustrate the value of in-depth specialist reasoning in generating features from progress notes and provide a proof of concept that there is a need for new research on efficient integration of in-depth clinical expertise into feature generation for clinical risk prediction. | oncology |
10.1101/2022.03.23.22272807 | SCENTinel 1.1 rapidly screens for COVID-19 related olfactory disorders | The COVID-19 pandemic has increased the prevalence of people suffering from olfactory disorders. In the absence of quick, population-wide olfactory tests, we developed SCENTinel, a rapid, inexpensive smell test to assess odor detection, intensity, and identification ability, which can discriminate anosmia (i.e., total smell loss) from normosmia (i.e., normal sense of smell) using a single odor. A new version, SCENTinel 1.1, extends the original test with one of four possible odors and a hedonic subtest ("how pleasant is the odor"). The purpose of this study was to determine if SCENTinel 1.1 can discriminate other types of olfactory disorders common to COVID-19, such as hyposmia (i.e., reduced sense of smell), parosmia (i.e., distorted odor perception), and phantosmia (i.e., odor sensation without an odor source). Participants (N=381) were divided into three groups based on their self-reported olfactory function: quantitative smell disorder (anosmia or hyposmia, N=135), qualitative smell disorder (parosmia and/or phantosmia; N=86), and normosmia (N=66). SCENTinel 1.1 classifies the anosmia and normosmia groups with high sensitivity (AUC=0.94), similar to SCENTinel 1.0 (AUC=0.95). SCENTinel 1.1 also accurately discriminates the quantitative group from the qualitative (AUC=0.76) and normosmia (AUC=0.84) groups, and the normosmia group from the qualitative group (AUC=0.73). We also considered a subset of participants who reported only one type of olfactory disorder. SCENTinel 1.1 discriminates hyposmia from parosmia (AUC=0.89) and anosmia (AUC=0.78), as well as parosmia from anosmia (AUC=0.82). Participants with parosmia had a significantly lower hedonic score than those without parosmia, indicating that odor distortions are unpleasant.
SCENTinel 1.1 is a rapid smell test that can discriminate quantitative (anosmia, hyposmia) and qualitative (parosmia, phantosmia) olfactory disorders, and it is among the only direct tests to rapidly screen for parosmia. | otolaryngology |
10.1101/2022.03.22.22271657 | Persistence of rare Salmonella Typhi genotypes susceptible to first-line antibiotics in the remote islands of Samoa | For decades, the remote island nation of Samoa (pop. ~200,000) has faced endemic typhoid fever despite improvements in water quality, sanitation, and economic development. We recently described the epidemiology of typhoid fever in Samoa from 2008-2019 by person, place, and time; however, the local Salmonella enterica serovar Typhi (S. Typhi) population structure, evolutionary origins, and genomic features remained unknown. Herein, we report whole genome sequence analyses of 306 S. Typhi isolates from Samoa collected between 1983 and 2020. Phylogenetics revealed a dominant population of rare genotypes 3.5.4 and 3.5.3, together comprising 292/306 (95.4%) of Samoan versus 2/4934 (0.04%) global S. Typhi isolates. Three distinct 3.5.4 genomic sub-lineages were identified and their defining polymorphisms were determined. These dominant Samoan genotypes, which likely emerged in the 1970s, share ancestry with other clade 3.5 isolates from South America, Southeast Asia, and Oceania. Additionally, a 106-kb pHCM2 phenotypically-cryptic plasmid, detected earliest in a 1992 Samoan S. Typhi isolate, was identified in 106/306 (34.6%) of Samoan isolates; this is more than double the observed proportion of pHCM2-containing isolates in the global collection. In stark contrast with global S. Typhi trends, resistance-conferring polymorphisms were detected in only 15/306 (4.9%) of Samoan S. Typhi, indicating overwhelming susceptibility to antibiotics that are no longer effective in most of South and Southeast Asia. This country-level genomic framework can help local health authorities in their ongoing typhoid surveillance and control efforts, as well as to fill a critical knowledge gap in S. Typhi genomic data from Oceania.
IMPORTANCEIn this study we used whole genome sequencing and comparative genomics analyses to characterize the population structure, evolutionary origins, and genomic features of S. Typhi associated with decades of endemic typhoid fever in Samoa. Our analyses of Samoan isolates from 1983 to 2020 identified a rare S. Typhi population in Samoa that likely emerged around the early 1970s and evolved into sub-lineages that presently dominate. The dominance and persistence of these endemic genotypes in Samoa are not readily explained by any apparent genomic competitive advantage or widespread acquisition of antimicrobial resistance. These data establish the necessary framework for future genomic surveillance of S. Typhi in Samoa for public health benefit. | genetic and genomic medicine |
10.1101/2022.03.23.22272818 | CNV-ClinViewer: Enhancing the clinical interpretation of large copy-number variants online | PurposeLarge copy number variants (CNVs) can cause a heterogeneous spectrum of rare and severe disorders. However, most CNVs are benign and are part of natural variation in human genomes. CNV pathogenicity classification, genotype-phenotype analyses, and therapeutic target identification are challenging and time-consuming tasks that require the integration and analysis of information from multiple scattered sources by experts.
MethodsWe developed a web-application combining >250,000 patient and population CNVs together with a large set of biomedical annotations and provide tools for CNV classification based on ACMG/ClinGen guidelines and gene-set enrichment analyses.
ResultsHere, we introduce the CNV-ClinViewer (https://cnv-ClinViewer.broadinstitute.org), an open-source web-application for clinical evaluation and visual exploration of CNVs. The application enables real-time interactive exploration of large CNV datasets in a user-friendly interface.
ConclusionOverall, this resource facilitates semi-automated clinical CNV interpretation and genomic loci exploration and, in combination with clinical judgment, enables clinicians and researchers to formulate novel hypotheses and guide their decision-making process. The CNV-ClinViewer thereby enhances patient care for clinical investigators and translational genomic research for basic scientists. | genetic and genomic medicine |
10.1101/2022.03.22.22272697 | Genetically predicted on-statin LDL response is associated with higher intracerebral hemorrhage risk | Statins lower low-density lipoprotein (LDL) cholesterol and are widely used for the prevention of atherosclerotic cardiovascular disease. Whether statin-induced LDL reduction increases risk of intracerebral hemorrhage (ICH) has been debated for almost two decades. Here, we explored whether genetically predicted on-statin LDL response is associated with ICH risk using Mendelian Randomization. Utilizing genomic data from randomized trials, we derived a polygenic score from 35 single nucleotide polymorphisms (SNPs) of on-statin LDL response and tested it in the population-based UK Biobank (UKB). We extracted statin drug and dose information from primary care data on a subset of 225,195 UKB participants covering a period of 29 years. We validated the effects of the genetic score on longitudinal LDL measurements with generalized mixed models and explored associations with incident ICH using Cox regression analysis. Statins were prescribed at least once to 75,973 (31%) of the study participants (mean age 57 years, 55% female). Among statin users, mean LDL decreased by 3.45 mg/dl per year (95% CI: [-3.47, -3.42]) over follow-up. A higher genetic score of statin response (one SD increment) was associated with significant additional reductions in LDL levels (-0.05 mg/dl per year, [-0.07, -0.02]), showed effects on other lipid traits concordant with those of statin use, and was associated with a lower risk of incident myocardial infarction (HR per SD increment 0.98, 95% CI [0.96, 0.99]) and peripheral artery disease (HR per SD increment 0.92, 95% CI [0.86, 0.98]). Over an 11-year follow-up period, a higher genetically predicted statin response among statin users was associated with higher ICH risk in a model adjusting for statin dose (HR per SD increment 1.15, 95% CI [1.04, 1.27]).
In contrast, there was no association with ICH risk among statin non-users (p=0.89). These results provide further support for the hypothesis that statin-induced LDL reduction may be causally associated with ICH risk. While the net benefit of statins for preventing vascular disease is well-established, these results provide insights into the personalized response to statin intake and the role of pharmacologic LDL-lowering in the pathogenesis of ICH. | genetic and genomic medicine |
10.1101/2022.03.23.22272834 | SynTL: A synthetic-data-based transfer learning approach for multi-center risk prediction | ObjectivesWe propose a synthetic-data-based transfer learning approach (SynTL) to incorporate multi-center healthcare data for improving the risk prediction of a target population, accounting for challenges including heterogeneity, data sharing, and privacy constraints.
MethodsSynTL combines locally trained risk prediction models from each source population with the target population to adjust for the data heterogeneity through a flexible distance-based transfer learning approach. Heterogeneity-adjusted synthetic data are then generated for source populations where individual-level data are not shareable. The synthetic data are then combined with the target and source data for joint training of the target model. We evaluate SynTL via extensive simulation studies and an application to multi-center data from the electronic Medical Records and Genomics (eMERGE) Network to predict extreme obesity.
ResultsSimulation studies show that SynTL outperforms methods that do not adjust for data heterogeneity and methods trained in a single population over a wide spectrum of settings. SynTL has low communication costs: each participating site only needs to share parameter estimates with the target site, requiring only one round of communication. SynTL protects against negative transfer when some source populations are highly different from the target. Using eMERGE data, SynTL achieves an area under the receiver operating characteristic curve (AUC) of around 0.79, which outperforms other benchmark methods (0.50 - 0.67).
ConclusionSynTL improves the risk prediction performance of the target population, and is robust to the level of heterogeneity between the target and source populations. It protects patient-level information and is highly communication efficient. | health informatics |
10.1101/2022.03.22.22272752 | Medication Against Conflict | The consequences of successful public health interventions for social violence and conflict are largely unknown. This paper closes this gap by evaluating the effect of a major health intervention - the successful expansion of anti-retroviral therapy (ART) to combat the HIV/AIDS pandemic - in Africa. To identify the effect, we combine exogenous variation in the scope for treatment and global variation in drug prices. We find that the ART expansion significantly reduced the number of violent events in African countries and sub-national regions. The effect pertains to social violence and unrest, not civil war. The evidence also shows that the effect is not explained by general improvements in economic prosperity, but related to health improvements, greater approval of government policy, and increased trust in political institutions. Results of a counterfactual simulation reveal the largest potential gains in countries with intermediate HIV prevalence where disease control has been given relatively low priority.
JEL-classification: C36, D47, I15, O10 | hiv aids |
10.1101/2022.03.22.22272739 | The efficacy, safety and immunogenicity of Nanocovax: results of a randomized, double-blind, placebo-controlled Phase 3 trial. | BackgroundNanocovax is a recombinant severe acute respiratory syndrome coronavirus 2 subunit vaccine composed of full-length prefusion-stabilized recombinant SARS-CoV-2 spike glycoproteins (S-2P) and aluminum hydroxide adjuvant. In Phase 1 and 2 studies (NCT04683484), the vaccine was found to be safe and to induce a robust immune response in healthy adult participants.
MethodsWe conducted a multicenter, randomized, double-blind, placebo-controlled study to evaluate the safety, immunogenicity, and protective efficacy of the Nanocovax vaccine against Covid-19 in approximately 13,007 volunteers aged 18 years and over. Immunogenicity was assessed based on the anti-S IgG antibody response, surrogate virus neutralization, wild-type SARS-CoV-2 neutralization and the types of helper T-cell response by intracellular staining (ICS) for interferon gamma (IFNg) and interleukin-4 (IL-4). Vaccine efficacy (VE) was calculated based on serologically confirmed cases of Covid-19.
FindingsUp to day 180, incidences of solicited and unsolicited adverse events (AE) were similar between the vaccine and placebo groups. 100 serious adverse events (SAE) were observed across both groups (out of a total of 13,007 participants). 96 of these 100 SAEs were determined to be unrelated to the investigational products; 4 SAEs were possibly related, as determined by the Data and Safety Monitoring Board (DSMB) and investigators. Reactogenicity was absent or mild in the majority of participants and of short duration. These findings highlight the excellent safety profile of Nanocovax.
Regarding immunogenicity, Nanocovax induced robust IgG and neutralizing antibody responses. Importantly, anti-S IgG levels and neutralizing antibody titers on day 42 were higher than those of naturally infected cases. Nanocovax was found to induce Th2 polarization rather than Th1.
Post-hoc analysis showed that the VE against symptomatic disease was 51.5% (95% confidence interval [CI] 34.4%-64.1%). VE against severe illness and death was 93.3% [62.2%-98.1%]. Notably, the dominant strain during the period of this study was the Delta variant.
InterpretationNanocovax 25 microgram (mcg) was found to be safe, with an efficacy against symptomatic infection with the Delta variant of 51.5%.
FundingResearch was funded by Nanogen Pharmaceutical Biotechnology JSC., and the Ministry of Science and Technology of Vietnam; ClinicalTrials.gov number, NCT04922788. | infectious diseases |
10.1101/2022.03.23.22272811 | Association of COVID-19 with risks of hospitalization and mortality from other disorders post-infection: A study of the UK Biobank | BackgroundMore than 0.4 billion cases of COVID-19 have been reported worldwide. The sequelae of COVID-19 remain unclear, especially whether it may be associated with increased hospitalization and mortality from other diseases.
MethodsWe leveraged a large prospective cohort, the UK Biobank (UKBB) (N=412,096; current age 50-87), to identify associations of COVID-19 with hospitalization and mortality due to different diseases post-infection. We conducted a comprehensive survey of disorders from all systems (up to 135 disease categories). Multivariable Cox and Poisson regressions were conducted, controlling for main confounders. For sensitivity analysis, we conducted separate analyses for new-onset and recurrent cases, as well as other sensitivity analyses such as the prior event rate adjustment (PERR) approach to minimize the effects of unmeasured confounders.
ResultsCompared to individuals with no known history of COVID-19, those with severe COVID-19 (requiring hospitalization) exhibited higher hazards of hospitalization and/or mortality due to multiple disorders (median follow-up=608 days), including disorders of respiratory, cardiovascular, neurological, gastrointestinal, genitourinary and musculoskeletal systems. Increased hazards of hospitalizations and/or mortality were also observed for injuries due to fractures, various infections and other non-specific symptoms. These results remained largely consistent after sensitivity analyses. Severe COVID-19 was also associated with increased all-cause mortality (HR=14.700, 95% CI: 13.835-15.619).
Mild (non-hospitalized) COVID-19 was associated with modestly increased risk of all-cause mortality (HR=1.237, 95% CI: 1.037-1.476) and mortality from neurocognitive disorders (HR=9.100, 95% CI: 5.590-14.816), as well as hospital admission for a few disorders such as aspiration pneumonitis, musculoskeletal pain and other general signs/symptoms.
ConclusionsThis study revealed increased risks of hospitalization and mortality from a wide variety of pulmonary and extra-pulmonary diseases after COVID-19, especially after severe infection. Mild disease was also associated with increased all-cause mortality. Further studies are required to replicate our findings. | infectious diseases |
10.1101/2022.03.23.22272804 | Waning effectiveness of BNT162b2 and ChAdOx1 COVID-19 vaccines over six months since second dose: a cohort study using linked electronic health records | BackgroundThe rate at which COVID-19 vaccine effectiveness wanes over time is crucial for vaccination policies, but is incompletely understood, with conflicting results from different studies.
MethodsThis cohort study, using the OpenSAFELY-TPP database and approved by NHS England, included individuals without prior SARS-CoV-2 infection assigned to vaccine priority groups 2-12 defined by the UK Joint Committee on Vaccination and Immunisation. We compared individuals who had received two doses of BNT162b2 or ChAdOx1 with unvaccinated individuals during six 4-week comparison periods, separately for subgroups aged 65+ years; 16-64 years and clinically vulnerable; 40-64 years; and 18-39 years. We used Cox regression, stratified by first-dose eligibility and geographical region and controlled for calendar time, to estimate adjusted hazard ratios (aHRs) comparing vaccinated with unvaccinated individuals, and quantified waning vaccine effectiveness as ratios of aHRs per 4-week period. The outcomes were COVID-19 hospitalisation, COVID-19 death, positive SARS-CoV-2 test, and non-COVID-19 death.
FindingsThe BNT162b2, ChAdOx1 and unvaccinated groups comprised 1,773,970, 2,961,011 and 2,433,988 individuals, respectively. Waning of vaccine effectiveness was similar across outcomes and vaccine brands: e.g. in the 65+ years subgroup ratios of aHRs versus unvaccinated for COVID-19 hospitalisation, COVID-19 death and positive SARS-CoV-2 test ranged from 1.23 (95% CI 1.15-1.32) to 1.27 (1.20-1.34) for BNT162b2 and 1.16 (0.98-1.37) to 1.20 (1.14-1.27) for ChAdOx1. Despite waning, rates of COVID-19 hospitalisation and COVID-19 death were substantially lower among vaccinated individuals compared to unvaccinated individuals up to 26 weeks after second dose, with estimated aHRs <0.20 (>80% vaccine effectiveness) for BNT162b2, and <0.26 (>74%) for ChAdOx1. By weeks 23-26, rates of SARS-CoV-2 infection in fully vaccinated individuals were similar to or higher than those in unvaccinated individuals: aHRs ranged from 0.85 (0.78-0.92) to 1.53 (1.07-2.18) for BNT162b2, and 1.21 (1.13-1.30) to 1.99 (1.94-2.05) for ChAdOx1.
InterpretationThe rate at which estimated vaccine effectiveness waned was strikingly consistent for COVID-19 hospitalisation, COVID-19 death and positive SARS-CoV-2 test, and similar across subgroups defined by age and clinical vulnerability. If sustained to outcomes of infection with the Omicron variant and to booster vaccination, these findings will facilitate scheduling of booster vaccination doses. | infectious diseases |