id | title | abstract | category
---|---|---|---
10.1101/2022.03.29.22273148 | Modeling behavior change and underreporting in the early phase of COVID-19 pandemic in Metro Manila, Philippines | When the Philippine government eased community quarantine restrictions in June 2020, the healthcare system was overwhelmed by the surge in coronavirus disease 2019 (COVID-19) cases. In this study, we developed an SEIQR model incorporating behavior change and unreported cases to examine their impact on COVID-19 case reports in Metro Manila during the early phase of the pandemic. We found that if behavior had changed one to four weeks earlier, the cumulative number of cases could have been reduced by up to 74% and the peak delayed by up to four weeks. Moreover, a two- or threefold increase in the reporting ratio would decrease the cumulative number of cases by 29% or 47%, respectively, by the end of September 2020. Our findings are expected to guide healthcare professionals in mitigating disease spread and minimizing the socioeconomic burden of strict lockdown policies at the start of an epidemic. | public and global health |
10.1101/2022.04.03.22273370 | Automated method to extract and purify RNA from wastewater enables more sensitive detection of SARS-CoV-2 markers in community sewersheds | Wastewater-based epidemiology (WBE) has emerged as a strategy to identify, locate, and manage outbreaks of COVID-19, and thereby possibly prevent surges in cases, which overwhelm local to global health care networks. The WBE process is based on assaying municipal wastewater for molecular markers of the SARS-CoV-2 virus. The standard process for sampling municipal wastewater is time-consuming and requires the handling of large quantities of wastewater, both of which negatively affect throughput and timely reporting and can increase safety risks. We report on a method to assay multiple sub-samples of a bulk wastewater sample. We document the effectiveness of this new approach through a comparison of technologies for automating RNA purification from wastewater samples. We compared processes using the Perkin-Elmer Chemagic 360 to a PEG/NaCl/Qiagen protocol used for detection of N1 and N2 SARS-CoV-2 markers by the majority of the 19 pandemic wastewater testing labs in the State of Michigan. Specifically, we found that the Chemagic 360 lowered handling time, decreased the amount of wastewater required by 10-fold, increased the amount of RNA isolated per µl of final elution product by approximately five-fold, and had no deleterious effect on subsequent ddPCR analysis. Moreover, for markers on the borderline of detectability, we found that use of the Chemagic 360 enabled detection of viral markers in a significant number of samples for which the result with the PEG/NaCl/Qiagen method was below the level of detectability. This improvement in detectability of the viral markers might be particularly important for early warning to public health authorities at the beginning of an outbreak. | public and global health |
10.1101/2022.04.05.22273169 | Changing patterns of SARS-CoV-2 infection through Delta and Omicron waves by vaccination status, previous infection and neighbourhood deprivation: A cohort analysis of 2.7M people. | Objective: To examine whether SARS-CoV-2 infections varied by vaccination status, by whether an individual had previously tested positive, and by neighbourhood socioeconomic deprivation across the Delta and Omicron epidemic waves of SARS-CoV-2.
Design: Cohort study using electronic health records
Setting: Cheshire and Merseyside, England (3rd June 2021 to 1st March 2022)
Participants: 2.7M residents
Main outcome measure: Registered positive test for SARS-CoV-2
Results: Social inequalities in registered positive tests were dynamic during the study. SARS-CoV-2 rates, initially higher in the most socioeconomically deprived neighbourhoods, became higher in the least deprived neighbourhoods from 1st September 2021. While the arrival of Omicron initially reset these inequalities, they remained dynamic and inconsistent. Full vaccination (two doses) was associated with fewer registered positive tests (e.g., between 1st September and 27th November 2021: (i) individuals engaged in testing - hazard ratio (HR) = 0.48, 95% confidence interval (CI) = 0.47-0.50; (ii) individuals engaged with healthcare - HR = 0.34, 95% CI = 0.33-0.34). Individuals with a previous registered positive test were also less likely to have a registered positive test (e.g., between 1st September and 27th November 2021: (i) individuals engaged in testing - HR = 0.16, 95% CI = 0.15-0.18; (ii) individuals engaged with healthcare - HR = 0.14, 95% CI = 0.13-0.16). However, Omicron disrupted these associations through immune escape, resulting in smaller effect sizes for both measures.
Conclusions: The changing patterns of SARS-CoV-2 infection during the Delta and Omicron waves reveal a dynamic pandemic that continues to affect diverse communities in sometimes unexpected ways. | public and global health |
10.1101/2022.04.04.22273427 | Sustained patient use and improved outcomes with digital transformation of a COPD service: RECEIVER trial and DYNAMIC-SCOT COVID-19 scale-up response. | Introduction: LenusCOPD was co-designed to enable digital transformation of COPD services for proactive preventative care. A patient-facing progressive web application, clinician dashboard and support website integrate patient-reported outcomes (PROs), self-management resources, a structured clinical summary, and wearable and home NIV data with asynchronous patient-clinician messaging. We commenced the implementation-effectiveness observational cohort RECEIVER trial in September 2019, with a primary endpoint of sustained patient usage and secondary endpoints including admissions, mortality, exacerbations, service workload and quality of life. We paused recruitment in March 2021 and provided LenusCOPD as routine care in the "DYNAMIC-SCOT" COVID-19 response service scale-up.
Methods: 83 RECEIVER trial participants and 142 DYNAMIC-SCOT participants had completed a minimum of 1 year of follow-up when we censored data on 31st August 2021. We established a control cohort of 5 patients matched per RECEIVER participant from de-identified contemporary routine clinical data.
Results: Sustained patient app utilisation was noted in both cohorts. Median time to admission or death was 43 days in control patients, 338 days in RECEIVER participants and 400 days in DYNAMIC-SCOT participants who had had a respiratory-related admission in the preceding year. The 12-month risk of admission or death was 74% in control patients, 53% in RECEIVER participants and 47% in the DYNAMIC-SCOT sub-cohort. There was a median of 2.5 COPD exacerbations per patient per year, with stable quality of life across follow-up and a manageable workload for clinical users.
Conclusions: A high proportion of people continued to use the co-designed LenusCOPD application during extended follow-up. The outcome data support scale-up of this digital service transformation.
Key messages
What is the key question? Can sustained patient interaction and improved patient outcomes be achieved with digital transformation of a COPD service?
What is the bottom line? Participants continue to use the LenusCOPD patient app, with an average of 3-3.5 interactions per person per week sustained >1 year post-onboarding. COPD-related hospital admissions and occupied bed days were reduced following LenusCOPD onboarding in participants with a history of a severe exacerbation in the previous year, with a median time to readmission of 380 days compared with 50 days in a contemporary matched control cohort.
Why read on? The feasibility and utility results support scale-up adoption of these digital tools to support optimised co-management of COPD and other long-term conditions within a continuous implementation-evaluation framework. This will establish a test-bed infrastructure for additional innovations, including artificial-intelligence insights for MDT decision support. | respiratory medicine |
10.1101/2022.03.29.22272960 | Nonutility of procalcitonin for diagnosing bacterial pneumonia in COVID-19 | Patients hospitalized with COVID-19 are at significant risk for superimposed bacterial pneumonia. However, diagnosing superinfection is challenging due to its clinical resemblance to severe COVID-19. We therefore evaluated whether the immune biomarker, procalcitonin, could facilitate the diagnosis of bacterial superinfection. To do so, we identified 185 patients with severe COVID-19 who underwent lower respiratory culture; 85 had superinfection. Receiver operating characteristic curve analysis showed that procalcitonin at the time of culture was incapable of distinguishing patients with bacterial infection (AUC, 0.52). We conclude that static measurement of procalcitonin does not aid in the diagnosis of superinfection in severe COVID-19. | respiratory medicine |
10.1101/2022.03.29.22272862 | Comparing phrenic nerve stimulation using three rapid coils: implications for mechanical ventilation | Rationale: Rapid magnetic stimulation (RMS) of the phrenic nerves may serve to attenuate diaphragm atrophy during mechanical ventilation. Inspiratory responses and side-effects may differ with coil shape and stimulation location.
Objective: To compare the inspiratory and sensory responses of three different RMS coils applied either bilaterally on the neck or anteriorly on the chest, and to determine whether ventilation over 10 min can be achieved without muscle fatigue or coil overheating.
Methods: Healthy participants underwent 1-s RMS on the neck (RMSBAMPS) (n=14) with three different pairs of magnetic coils (parabolic, D-shape, butterfly) at 15, 20, 25 and 30 Hz stimulator frequency, starting at 20% stimulator output with +10% increments. The D-shape coil with individually optimal stimulation settings was then used to ventilate participants (n=11) for up to 10 min. Anterior RMS on the chest (RMSaMS) (n=8) was conducted on an optional visit. Airflow was assessed via pneumotachograph and transdiaphragmatic pressure via esophageal and gastric balloon catheters. Perceptions of air hunger, pain, discomfort and paresthesia were measured with a numerical scale.
Main results: Inspiration was induced via RMSBAMPS in 86% of participants with all coils, but via RMSaMS in only one participant with the parabolic coil. All coils produced similar inspiratory and sensory responses during RMSBAMPS, with the butterfly coil needing higher stimulator output, which resulted in significantly larger discomfort ratings at maximal inspiratory responses. Ten of 11 participants achieved 10 min of ventilation without decreases in minute ventilation (15.7 ± 4.6 L/min).
Conclusions: RMSBAMPS was more effective than RMSaMS and could temporarily ventilate humans seemingly without development of muscular fatigue. | respiratory medicine |
10.1101/2022.03.30.22269826 | Predicting Clinical Endpoints and Visual Changes with Quality-Weighted Tissue-based Histological Features | Two common obstacles limiting the performance of data-driven algorithms in digital histopathology classification tasks are the lack of expert annotations and the narrow diversity of datasets. Multi-instance learning (MIL) can be used to address the former challenge for the analysis of whole slide images (WSI) but performance is often inferior to full supervision. We show that the inclusion of weak annotations can significantly enhance the effectiveness of MIL while keeping the approach scalable.
An analysis framework was developed to process periodic acid-Schiff (PAS) and Sirius Red (SR) slides of renal biopsies. The workflow segments tissues into coarse tissue classes. Handcrafted and deep features were extracted from these tissues and combined using a soft attention model to predict several slide-level labels: delayed graft function (DGF), acute tubular injury (ATI), and Remuzzi grade components. A tissue segmentation quality metric was also developed to reduce the adverse impact of poorly segmented instances. The soft attention model was trained using 5-fold cross-validation on a mixed dataset and tested on the QUOD dataset containing n=373 PAS and n=195 SR biopsies.
The average ROC-AUC over the different prediction tasks was 0.598 ± 0.011, significantly higher than using only ResNet50 features (0.545 ± 0.012), only handcrafted features (0.542 ± 0.011), or the state-of-the-art baseline (0.532 ± 0.012). Weighting tissues by segmentation quality in conjunction with the soft attention led to a further improvement (AUC = 0.618 ± 0.010). Using an intuitive visualisation scheme, we show that our approach may also support clinical decision-making, as it allows pinpointing the individual tissues relevant to the predictions. | transplantation |
10.1101/2022.04.04.22273410 | Influence of Illustrators' Intentions and Visual Description Techniques of Medical Illustrations on Generating Interest and Boosting Comprehension of Medical Information | In this study, an exploratory online survey (n=1104) was conducted to clarify a question that has not yet been resolved in the medical illustration research field: if several illustrators illustrate the same information, does a match of intentions and techniques affect the impression or comprehension of that information? The aim is to further the utilization of medical illustration in actual practice.
First, we selected autosomal dominant polycystic kidney disease (ADPKD) as the medical condition about which information had to be disseminated. We then asked 32 art professionals to make six illustrations spanning three levels of detail (high, middle and low) and two purposes ("getting interested" and "boosting comprehension"). Thereafter, we selected six different art professionals' illustrations for a questionnaire asking participants about the intentions of the illustrations and their comprehension of the content. We found that when the illustrators' intentions and visual description techniques match those of the recipients, the match of intentions can help generate interest in the content, and the match of visual description techniques can enhance comprehension of the information. | medical education |
10.1101/2022.04.01.22273073 | Trans-border transfer of human biological materials in collaborative biobanking research: Perceptions and experiences of researchers in Uganda | Introduction: The study aimed to explore researchers' perceptions of and experiences with the transfer of human biological materials in international collaborative research.
Methods: This was a descriptive convergent parallel survey that randomly recruited 187 researchers involved in biobanking and/or genetics research. Data were collected using a self-administered tool with both open- and closed-ended questions.
Results: The majority of respondents were male scientists (53.5%) with a mean research experience of 12.2 (SD 6.75) years. About 89% had worked in international collaborative research and 42% had participated in material transfer agreement (MTA) development. There were several areas of agreement regarding the rights of local researchers and institutions in collaborative biobanking research and the details that should be included in material transfer agreements. There was overwhelming support for collaborative partnership in biobanking research, with 90.4% of researchers agreeing that local scientists should be involved in decision-making regarding the future use of samples. A majority (85.6%) opined that research benefits should be shared fairly with local researchers and the populations or country from which the human biological materials were taken. Researchers felt that most MTAs tend to favour international collaborators. Several trust issues in the MTA development and implementation process were also highlighted, including: lack of transparency and dishonesty of receiving scientists; lack of mechanisms to monitor the use of exported samples; ownership and intellectual property disputes; exploitation and inequitable sharing of research benefits; and authorship challenges. Several researchers seemed unfamiliar with the guidance provided by the Ugandan national ethics guidelines on the cross-border exchange of human biological specimens.
Conclusion: Local researchers had a positive attitude towards the export of human biological materials in collaborative research, but there were several governance and trust concerns. There is a need for collaborative partnership in biobanking research. | medical ethics |
10.1101/2022.04.03.22273365 | Classifying REM Sleep Behavior Disorder through CNNs with Image-Based Representations of EEGs | Objective: Rapid eye movement sleep behavior disorder (RBD) is a parasomnia with a high conversion rate to α-synucleinopathies such as Parkinson's disease (PD), dementia with Lewy bodies (DLB), and multiple system atrophy (MSA). The objective of this paper is to classify RBD in patients through a convolutional neural network (CNN) utilizing image-based representations of electroencephalogram (EEG) channels.
Methods: Analysis was conducted on polysomnography data from 22 patients with RBD and 12 healthy controls acquired from the Cyclic Alternating Pattern (CAP) Sleep Database. EEG channels were split into four frequency bands (theta, alpha, beta, and gamma). Power spectral density was calculated through a fast Fourier transform (FFT) and converted into 2D images through bicubic interpolation. RBD classification was accomplished through a pre-trained VGG-16 CNN with layer weights fine-tuned.
Results: The model was successful in classifying RBD patients versus non-RBD patients and achieved 97.92% accuracy over 40 epochs. Accuracy increased dramatically with the increased data generated from FFT and interpolation (62.63% to 97.92%).
Conclusions: This project proposes a novel approach toward automatic classification of RBD and highlights the importance of deep learning models in the field. The proposed transfer learning model outperforms state-of-the-art models and preliminarily accounts for the lack of computational resources in clinical spaces, thereby increasing the accessibility of automatic classification.
Significance: By leveraging transfer learning and raw data, the results demonstrate that a similar model for the classification of RBD patients could easily be translated to the clinical setting, drastically accelerating the classification pipeline. The proposed methods are also applicable to other α-synucleinopathies, including PD, DLB, and MSA. | neurology |
10.1101/2022.03.31.22273072 | Cardiovascular diseases worsen the maternal prognosis of COVID-19 | Cardiovascular diseases (CVD) are a risk factor for severe cases of COVID-19, yet no studies have evaluated whether the presence of CVD in pregnant and postpartum women with COVID-19 is associated with a worse prognosis. From an anonymized open database of the Ministry of Health, we selected cases of pregnant and postpartum women who were hospitalized due to COVID-19 infection. Among the 1,876,953 reported cases, 3,562 confirmed cases of pregnant and postpartum women were included, of which 602 had CVD. Patients with CVD were older (p<0.001) and had a higher incidence of diabetes (p<0.001) and obesity (p<0.001), as well as a higher frequency of systemic (p<0.001) and respiratory symptoms (p<0.001). CVD was a risk factor for ICU admission (p<0.001), ventilatory support (p=0.004) and orotracheal intubation in the third trimester (OR 1.30, 95% CI 1.04-1.62). The CVD group had higher mortality (18.9% vs. 13.5%, p<0.001), with a 32% higher risk of death (OR 1.32, 95% CI 1.16-1.50). Moreover, the risk was increased in the second (OR 1.94, 95% CI 1.43-2.63) and third (OR 1.29, 95% CI 1.04-1.60) trimesters, as well as in the puerperium (OR 1.27, 95% CI 1.03-1.56). Hospitalized obstetric patients with CVD and COVID-19 are more symptomatic, their management demands more ICU admissions and ventilatory support, and their mortality is higher. | obstetrics and gynecology |
10.1101/2022.04.03.22273359 | Automated Enhancement and Curation of Healthcare Data | Social and behavioral aspects of our lives significantly impact our health, yet minimal social determinants of health (SDOH) data elements are collected in the healthcare system. In this study we developed a repeatable SDOH enrichment and integration process to incorporate dynamically evolving SDOH domain concepts from consumer data into clinical data. This process included SDOH mapping, linking compiled consumer data, data quality analysis and preprocessing, storage, and visualization. Our preliminary analysis shows that commercial consumer data can be a viable source of individual-level SDOH factors for clinical data, thus providing a path for clinicians to improve patient treatment and care. | health informatics |
10.1101/2022.04.04.22273232 | Evaluation of Eight Lateral Flow Tests For The Detection Of Anti-SARS-CoV-2 Antibodies In A Vaccinated Population | With the distribution of COVID-19 vaccinations across the globe and the limited access in many countries, quick determination of an individual's antibody status could be beneficial in allocating limited vaccine doses in low- and middle-income countries (LMICs). Antibody lateral flow tests (LFTs) have the potential to address this need as a quick point-of-care test; they also have use cases in identifying sero-negative individuals for novel therapeutics and in epidemiology. Here we present a proof-of-concept evaluation of eight LFT brands using sera from 95 vaccinated individuals to determine sensitivity for detecting vaccination-generated antibodies. All 95 (100%) participants tested positive for anti-spike antibodies by the chemiluminescent microparticle immunoassay (CMIA) reference standard after dose two of their SARS-CoV-2 vaccine: BNT162b2 (Pfizer/BioNTech, n=60), AZD1222 (AstraZeneca, n=31), mRNA-1273 (Moderna, n=2) and undeclared vaccine brand (n=2). Sensitivity increased from dose one to dose two in six of the eight LFTs, with three tests achieving 100% sensitivity at dose two in detecting anti-spike antibodies. These tests are quick, low-cost point-of-care tools that can be used without prior training to establish antibody status, and may prove valuable for allocating limited vaccine doses in LMICs to ensure that those in at-risk groups access the protection they need. Further investigation into their performance in vaccinated people is required before more widespread utilisation is considered. | infectious diseases |
10.1101/2022.04.01.22273117 | Developing a model for predicting impairing physical symptoms in children 3 months after a SARS-CoV-2 PCR-test: The CLoCk Study | Importance: Predictive models can help identify SARS-CoV-2 patients at greatest risk of post-COVID sequelae and direct them towards appropriate care.
Objective: To develop and internally validate a model to predict which children and young people are most likely to experience at least one impairing physical symptom 3 months after a SARS-CoV-2 PCR test, and to determine whether the impact of these predictors differed by SARS-CoV-2 infection status.
Design: Potential pre-specified predictors included: SARS-CoV-2 status, sex, age, ethnicity, deprivation, quality of life/functioning (5 EQ-5D-Y items), physical and mental health, and loneliness (all prior to SARS-CoV-2 testing), and number of physical symptoms at testing. Logistic regression was used to develop the model. Model performance was assessed using calibration and discrimination measures; internal validation was performed via bootstrapping; the final model was adjusted for overfitting.
Setting: National cohort study of SARS-CoV-2 PCR-positive and PCR-negative participants matched according to age, sex, and geographical area.
Participants: Children and young people aged 11-17 years who were tested for SARS-CoV-2 infection in England, January to March 2021.
Main outcome measure: One or more physical symptoms 3 months after initial PCR testing which affected physical, mental or social well-being and interfered with daily living.
Results: A total of 50,836 children and young people were approached; 7,096 (3,227 test-positives, 3,869 test-negatives) who completed a questionnaire 3 months after their PCR test were included. 39.6% (1,279/3,227) of SARS-CoV-2 PCR-positives and 30.6% (1,184/3,869) of SARS-CoV-2 PCR-negatives had at least one impairing physical symptom 3 months post-test. The final model contained the predictors: SARS-CoV-2 status, number of symptoms at testing, sex, age, ethnicity, self-rated physical and mental health, feelings of loneliness and four EQ-5D-Y items before testing. Internal validation showed minimal overfitting, with excellent calibration and discrimination measures (optimism-adjusted calibration slope: 0.97527; C-statistic: 0.83640).
Conclusions and relevance: We developed a risk prediction equation to identify those most at risk of experiencing at least one impairing physical symptom 3 months after a SARS-CoV-2 PCR test, which could serve as a useful triage and management tool for children and young people during the ongoing pandemic. External validation is required before large-scale implementation.
Key Points: Question: Which children have impairing physical symptoms during the COVID-19 pandemic?
FindingsUsing data from a large national matched cohort study in children and young people (CYP) aged 11-17 years (N=7,096), we developed a prediction model for experiencing at least one impairing physical symptom 3 months after testing for SARS-CoV-2. Our model had excellent predictive ability, calibration and discrimination; we used it to produce a risk estimation calculator.
MeaningOur developed risk calculator could serve as a useful tool in the early identification and management of CYP at risk of persisting physical symptoms in the context of the COVID-19 pandemic. | infectious diseases |
10.1101/2022.04.04.22273361 | Long- and short-term effects of cross-immunity in epidemic dynamics | The vertebrate immune system is capable of strong, focused adaptive responses that depend on T-cell specificity in recognizing antigenic sequences of a pathogen. Recognition tolerance and antigenic convergence cause cross-immune reactions that extend prompt, specific responses to rather similar pathogens. This suggests that reaching herd immunity might be facilitated during successive epidemic outbreaks (e.g., SARS-CoV-2 waves with different variants). Qualitative studies play down this possibility because cross-immune protection is seldom sterilizing. We use minimal quantitative models to study how cross-immunity affects epidemic dynamics over short and long timescales. On the short timescale, we investigate models of sterilizing and attenuating immunity, finding equivalences between both mechanisms--thus suggesting a key role of attenuating protection in achieving herd immunity. Our models render maps in epidemic-parameter space that discern threatening variants depending on acquired cross-immunity levels. We illustrate this application with SARS-CoV-2 data, including protection due to vaccination rates across countries. On the long timescale, we model sterilizing cross-immunity between rolling pathogens to characterize statistical properties of successful strains. We find that sustained cross-immune protection alters the regions of epidemic-parameter space where large outbreaks happen. Our results suggest an optimistic revision concerning prospects for herd protection based on cross-immunity, including for the SARS-CoV-2 pandemic. | epidemiology
10.1101/2022.04.04.22273405 | Severity and Correlates of Mental Fog in People with Traumatic Brain Injury | ObjectiveAlongside objective performance declines, self-reported cognitive symptoms after traumatic brain injury (TBI) abound. Mental fog is one symptom that has been underexplored. The current project investigated mental fog across two studies of individuals with mild traumatic brain injury and moderate-to-severe traumatic brain injury to close our knowledge gap about differences in severity. We then explored the cognitive and affective correlates of mental fog within these groups.
MethodsUsing between-groups designs, the first study recruited individuals with acute mild TBI (n = 15) along with a healthy control group (n = 16). Simultaneously, a second study recruited persons with post-acute moderate-to-severe TBI, a stage when self-reports are most reliable (n = 15). Measures across the studies were harmonized and involved measuring mental fog (Mental Clutter Scale), objective cognition (Cogstate(R) and UFOV(R)), and depressive symptoms. In addition to descriptive group difference analyses, nonparametric correlations determined associations between mental fog, objective cognition, and depressive symptoms.
ResultsResults revealed higher self-reported mental fog in acute mild TBI compared to healthy controls. Though exploratory, post-acute moderate-to-severe TBI also appears characterized by greater mental fog. Correlations showed that mental fog in mild TBI corresponded to greater depressive symptoms (r = .66) but was unrelated to objective cognition. By contrast, mental fog in moderate-to-severe TBI corresponded to poorer working memory (r = .68) and slowed processing speed (r = -.55) but was unrelated to depressive symptoms.
ConclusionAs a common symptom in TBI, mental fog distinguishes individuals with acute mild TBI from uninjured peers. Mental fog also appears to reflect challenges in recovery, including depressive symptoms and objective cognitive problems. Screening for mental fog, in addition to other cognitive symptoms, might be worthwhile in these populations. | psychiatry and clinical psychology |
10.1101/2022.04.04.22273416 | Domain specific need assessment for Disaster Preparedness: a systematic review and critical interpretative synthesis | BackgroundDuring disasters, the most pressing demands are those related to health, and hospital preparedness is an area that requires special attention. Hospitals are viewed as resources that must be proactively utilized in the event of a disaster. If national and local systems, particularly health systems, are unprepared to deal with disasters, the vulnerability of both individuals and communities is amplified. The unexpected surge in demand for important health services caused by disasters frequently overwhelms health systems and institutions, leaving them unable to perform the life-saving measures that are required. This study aims to understand various domains of hospital disaster preparedness by critically synthesizing qualitative evidence from selected research on the topic.
MethodsElectronic data base from PubMed, Google Scholar, key hospital disaster related journals was explored with search syntax focusing on hospital related disaster preparedness. Peer reviewed English articles published from January 2011 were systematically selected and critical interpretative qualitative synthesis was done to have comprehensive understanding of the said phenomenon.
ResultsA total of 29 articles were included in the systematic review. The major resultant domains describing disaster preparedness were human resources, logistics and finance, response, communication, coordination, patient care, evacuation, and personal protection. Some domains were more emphasised than others; this information can help prioritize action based on need, especially in times of disaster.
ConclusionDisaster preparedness needs a comprehensive approach including context specific optimization with the effective use of available resources. | public and global health |
10.1101/2022.04.01.22273213 | Physical activity-correlated changes in plasma enzyme concentrations in fragile sarcolemmal muscular dystrophies | Background and ObjectivesMuscular dystrophies associated with decreased sarcolemma integrity lack validated clinical measures of sarcolemma fragility that can be used to assess disease progression and the effects of therapies designed to reduce sarcolemma fragility. We conducted a pilot study to test the hypothesis that physical activity leads to significant changes in muscle-derived plasma enzymes in participants with "fragile sarcolemmal muscular dystrophies" (FSMD).
MethodsWe enrolled ambulatory individuals clinically affected with genetically confirmed FSMD neither taking anti-inflammatory medications nor having relevant co-morbidities for an inpatient study. Over five days, blood samples at 20 time points were obtained. Plasma enzymes alanine and aspartate aminotransferase (ALT, AST), creatine kinase (CK), and lactate dehydrogenase (LDH), all found in muscle, were measured before and after routine morning activities and motor function testing. Analysis of Z-transformed time series data led to feature and kinetic models that revealed activity-dependent feature and kinetic parameters.
ResultsAmong the 11 enrolled participants, (LGMD Type 2B/R2 Dysferlin-related (4F/1M), LGMD Type 2L/R12 Anoctamin-5-related (3F/2M), LGMD Type 2I/R9 FKRP-related (1M)), plasma enzymes increased with activity. The average % change +/- SEM with morning activity across all participants was ALT 12.8 {+/-} 2.8%, AST 11.6 {+/-} 2.9%, CK 12.9 {+/-} 2.8%, and LDH 12.2 {+/-} 3.9%, suggesting the increases originate from the same stimulated source, presumably skeletal muscle. For ALT, AST, CK, and LDH, characteristic kinetic features include (a) elevated enzyme activities on arrival that decreased overnight; (b) a longer decay trend observed over the week, and (c) for ALT, AST, and CK, a similar decay trend observed with post-morning activity blood draws.
DiscussionControlled activity-dependent changes in plasma ALT, AST, and CK on time scales of days to weeks can serve as common outcome measures for sarcolemma integrity and may be efficient and effective tools for monitoring disease progression and treatment efficacy for both individuals and patient populations. In addition, this study provides data that may benefit patient management as it can inform guidance on duration and type of activity that minimizes muscle damage. | neurology |
10.1101/2022.04.02.22272628 | Effectiveness of mHEALTH Application at Primary Health Care to Improve Maternal and New-born Health Services in Rural Ethiopia: Comparative study | Ethiopia has recently implemented mHealth technology on a limited scale to help increase the uptake of health services, including intervention for maternal and new-born health service utilisation. In this study, the effectiveness of the mHealth intervention was assessed by measuring the level of maternal health service utilization in 4 Health Centers in Ethiopia.
The study used a comparative design, comparing maternal and newborn health service utilization before and after initiation of the mHealth implementation. Follow-up data from 800 randomly selected clients were included in the study to determine the magnitude of maternal and new-born health service utilization. Data analysis compared pre-mHealth (baseline) data with mHealth follow-up data, using an independent t-test to compare the magnitude of maternal and new-born health service utilization.
The mean number of antenatal care follow-up visits during the most recent pregnancy was 2.21 (SD{+/-}1.02) at baseline and 3.43 (SD{+/-}0.88) in the intervention group. Four or more antenatal visits were reached by 55 (13.8%) of pregnant women at baseline and 256 (64%) in the mHealth intervention group. Timely initiation of ANC follow-up among pregnant women in the baseline and intervention groups was 44.5% and 77.3%, respectively. Institutional delivery in the baseline and intervention groups was 35.0% and 71.2%, respectively. Of women who gave birth at baseline, 23.8% received first postnatal care within 6 hours, 11.3% within 6 days, and 6.8% within 6 weeks. In the intervention group, 84% of delivered women received first postpartum care within 6 hours after delivery, 70.8% after 6 hours, and 46% made their third postpartum visit within 6 weeks after delivery. Penta-3 vaccination coverage in the baseline and mHealth intervention groups was 61.5% and 70.4%, respectively.
The study results suggest that the introduction of low-cost mHealth technologies contributed to the observed improvement in maternal and new-born health service utilization. This intervention shows promise for scale-up, as well as for application to other health interventions beyond maternal and newborn health services. | health systems and quality improvement
10.1101/2022.04.04.22273418 | Interleukin-6 as a predictor of early weaning from invasive mechanical ventilation in patients with acute respiratory distress syndrome | BackgroundTherapeutic effects of steroids on acute respiratory distress syndrome (ARDS) requiring mechanical ventilation (MV) have been reported. However, predictive indicators of early weaning from MV post-treatment have not yet been defined, making treating established ARDS challenging. Interleukin (IL)-6 has been associated with the pathogenesis of ARDS.
ObjectiveOur aim was to clarify the clinical utility of IL-6 levels in ventilated patients with established ARDS.
MethodsClinical, treatment, and outcome data were evaluated in 119 invasively ventilated patients with severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2)-mediated ARDS. Plasma levels of IL-6 and C-reactive protein (CRP) were measured on days 1, 4, and 7 after intubation.
ResultsFifty-two patients were treated with dexamethasone (steroid group), while the remaining 67 patients were not (non-steroid group). Duration of MV use was significantly shorter in the steroid group than in the non-steroid group (11.5{+/-}0.6 vs. 16.1{+/-}1.0 days, P = 0.0005), along with significantly decreased levels of IL-6 and CRP. Even when restricted to the steroid group, among post-MV variables, IL-6 level on day 7 was most closely correlated with duration of MV use (Spearman's rank correlation coefficient [{rho}] = 0.73, P < 0.0001), followed by CRP level on day 7 and the percentage change in IL-6 or CRP levels between day 1 and day 7. Moreover, among these variables, IL-6 level on day 7 showed the highest accuracy for withdrawal from MV within 11 days (AUC: 0.88), with an optimal cutoff value of 20.6 pg/mL. Consistently, the rate of MV weaning increased significantly earlier in patients with low IL-6 ([≤] 20.6 pg/mL) than in those with high IL-6 (> 20.6 pg/mL) (log-rank test P < 0.0001).
ConclusionsIn invasively ventilated patients with established ARDS due to SARS-CoV-2, plasma IL-6 levels served as a predictor of early withdrawal from MV after dexamethasone administration. | intensive care and critical care medicine |
10.1101/2022.03.29.22272937 | Adequacy Of Salt Iodization, Retailers Knowledge And Associated Factors About Iodized Salt In Oromia Special Zone Surrounding Finfinne, Ethiopia | BackgroundIn Ethiopia, there are laws enforcing universal salt iodization to check the consequences of iodine deficiency. These laws are to ensure the production and sale of iodized salt in the country. Yet, the adequacy of iodized salt at the retailer level has not been determined.
ObjectiveThis study aimed to investigate the adequacy of salt iodization, retailers' knowledge about iodized salt, and associated factors among retailers in the Oromia Special Zone surrounding Finfinne, Ethiopia.
MethodsA community-based cross-sectional study was conducted among 202 iodized salt retailers selected using a simple random sampling technique. Data for the assessment of retailers' knowledge were collected using a structured questionnaire, and a titration test method was used to determine the adequacy of salt iodization. Data management and analysis were done using the Statistical Package for the Social Sciences, Version 23. Descriptive statistics such as mean, median, standard deviation, and frequency were computed to describe the data. Binary logistic regression analysis was done to examine the association between the dependent and independent variables. The strength of association was expressed using odds ratios, and statistical significance was accepted at a p-value of less than 0.05. Finally, the results were presented using graphs and tables.
ResultA total of 202 salt retailers were approached in this study, with a mean age of 28.78{+/-}7.548 years. Nearly two-thirds of the study participants were male. Of the retailers' salt samples, 95 (57.2%) were adequately (20-40 ppm) iodized, and the educational level of the retailers was significantly associated with the adequacy of salt iodization. Of all salt retailers, 113 (55.9%) had good knowledge; having heard about iodized salt (AOR=0.043, CI: 0.017-0.109, p<0.001) and the presence of a legal framework/law (AOR=0.24, CI: 0.098-0.61, p=0.003) prohibiting the sale of non-iodized salt were significantly associated with good knowledge.
Conclusion and RecommendationAlthough policies have been implemented to promote the production and consumption of iodized salt, the iodine content of salt in retail shops in the Oromia Special Zone surrounding Finfinne, Oromia regional state, is not encouraging. We recommend the establishment of checkpoints along the production and distribution chain to ensure that salt with adequate iodine reaches the consumer. In addition, traders of iodized salt should receive regular training on ways to preserve salt to maintain its iodine content. | public and global health
10.1101/2022.03.28.22272737 | One-year results of long femoropopliteal lesions stenting with fasciotomy lamina vastoadductoria | ObjectiveFasciotomy can increase the mobility of the superficial femoral artery (SFA) and reduce the incidence of stent breakage. This study aimed to compare the long-term patency of drug-eluting nitinol stents with and without fasciotomy in patients with long SFA occlusions.
MethodsA randomized clinical trial was conducted in 60 patients (1:1 randomization) with long femoropopliteal steno-occlusive lesions of more than 200 mm. Patients in group 1 (Zilver) underwent recanalization of the femoropopliteal occlusion with stenting. In group 2 (ZilverFas), recanalization of the femoropopliteal occlusion with stenting and fasciotomy of Gunter's canal was performed. Patency was evaluated at 6 and 12 months of follow-up.
ResultsTwelve-month primary patency in the Zilver and ZilverFas groups was 51% and 80%, respectively (p = 0.02). Freedom from target lesion revascularization (TLR) in the Zilver and ZilverFas groups was 50% and 76%, respectively (p = 0.04). At one year, primary-assisted and secondary patency for the ZilverFas and Zilver groups were 83% versus 62% (p = 0.07) and 86% versus 65% (p = 0.05), respectively. In the Zilver and ZilverFas groups, the number of stent fractures was 14 and 7, respectively (p = 0.05). Multivariable Cox regression indicated that stent fracture and diabetes mellitus were independent predictors of restenosis and reocclusion. Fasciotomy reduced the odds of reocclusion and restenosis by 2.94 times.
ConclusionsOur study has shown that decompressing the stented segment with fasciotomy significantly improves the patency of the femoropopliteal segment and significantly reduces the number and severity of stent fractures. | surgery
10.1101/2022.03.31.22273092 | Exploring the use of UmbiFlow to assess the impact of heat stress on fetoplacental blood flow in field studies. | ObjectiveTo evaluate the impact of heat stress on umbilical artery resistance index (RI) measured by UmbiFlow in field settings and the implications for pregnancy outcomes.
MethodsThis feasibility study was conducted in West Kiang, The Gambia, West Africa; a rural area with increasing exposure to extreme heat. We recruited women with singleton fetuses who performed manual tasks (such as farming) during pregnancy. The umbilical artery RI was measured at rest, during and at the end of a typical working shift in women [≥] 28 weeks gestation. Adverse pregnancy outcomes (APO) were classified as stillbirth, preterm birth, low birth weight, or small for gestational age, and all other outcomes as normal.
ResultsA total of 40 participants were included; 23 normal births and 17 APO. Umbilical artery RI demonstrated a nonlinear relationship to heat stress, with indication of a potential threshold value for placental insufficiency around 32{o}C by universal thermal climate index. Preliminary evidence suggests the fetoplacental circulation response to heat stress differs in APO versus normal outcome.
ConclusionsThe UmbiFlow device proved to be an effective field method for assessing placental function. Dynamic changes in RI may begin to explain the association between extreme heat and APO.
FundingThe Wellcome Trust (216336/Z/19/Z)
SynopsisExtreme heat exposure is increasing and a low-cost umbilical artery doppler device, UmbiFlow, can aid understanding of fetoplacental function under heat stress conditions. | obstetrics and gynecology |
10.1101/2022.03.31.22273149 | Undernutrition and associated factors among people living with HIV under NACS assessment in Muchinga Province, Zambia, 2019-2020 | IntroductionGood nutrition in People Living with HIV (PLHIV) positively influences treatment outcomes and, in turn, quality of life. Despite the significant role it plays, many patients have limited access to Nutritional Assessment Counselling and Support (NACS). We evaluated undernutrition and associated factors among people living with HIV in Muchinga Province, Zambia, from October 2019 to March 2020.
Material and MethodsThis was a secondary analysis of routine program data of HIV-positive clients on ART enrolled at EQUIP-supported health facilities in Muchinga. Undernutrition was determined using body mass index (BMI) calculations, classified as undernutrition (<18.5 kg/m2), normal (18.5 - 24.9 kg/m2) or over-nutrition (overweight, 25 - 29.9 kg/m2 and obese, [≥]30 kg/m2). Multivariate-adjusted odds ratios (aOR) were used to assess factors associated with undernutrition.
ResultsOf the 506 eligible clients under NACS, the mean age was 34.9 years {+/-} 13.5SD, with 251 (approximately 50%) between the ages of 21 - 39 years. Two-thirds (67%) were female, 284 (56%) were urban residents, and 180 (35.6%) were unemployed. The majority (approximately 71%) were on the TLE regimen, with a median duration on ART of [~]3 years (IQR=1-6). There were 233 (46%) who had a normal BMI, 191 (37.7%) who had under-nutrition, and 82 (16.2%) who had over-nutrition (9.7% overweight; 6.5% obesity). Clients who were urban residents (aOR= 2.0; 95%CI: 1.28 - 3.1), unemployed (aOR= 2.4; 95%CI: 1.18 - 4.69), married (aOR= 2.3; 95%CI: 1.26 - 4.38) or on TLD (aOR= 2.8; 95%CI: 1.23 - 6.23) were more likely to be under-nourished.
ConclusionNACS played a vital role in identifying HIV-positive clients who required more specialized care for improved clinical health outcomes. There is a need to strengthen HIV and nutrition integration in low-resourced countries with high HIV burden for improved treatment outcomes and quality of life. | hiv aids |
10.1101/2022.03.30.22271503 | Combinatorial language parent-report score differs significantly between typically developing children and those with Autism Spectrum Disorders | Prefrontal synthesis (PFS) is a component of constructive imagination. It is defined as the process of mentally juxtaposing objects into novel combinations. For example, to comprehend the instruction "put the cat under the dog and above the monkey," it is necessary to use PFS in order to correctly determine the spatial arrangement of the cat, dog, and monkey with relation to one another. The acquisition of PFS hinges on the use of combinatorial language during early development in childhood. Accordingly, children with developmental delays exhibit a deficit in PFS, and frequent assessments are recommended for these individuals. In 2018, we developed the Mental Synthesis Evaluation Checklist (MSEC), a parent-reported evaluation designed to assess PFS and combinatorial language comprehension. In this manuscript, we use the MSEC to identify differences in combinatorial language acquisition between ASD (N=29138) and neurotypical (N=111) children. Results confirm the utility of the MSEC in distinguishing language deficits in ASD from typical development as early as 2 years of age (p<0.0001). | pediatrics
10.1101/2022.03.30.22273090 | Temporal trends in the incidence of haemophagocytic lymphohistiocytosis: a nationwide cohort study from England 2003-2018 | BackgroundHaemophagocytic lymphohistiocytosis (HLH) is rare, results in high mortality and is increasingly being diagnosed. Little is known about what is driving the apparent rise in the incidence of this disease.
MethodsUsing national linked electronic health data from hospital admissions and death certification, cases of HLH diagnosed in England between 1/1/2003 and 31/12/2018 were identified using a previously validated approach. We calculated incidence rates of diagnosed HLH per million population, using mid-year population estimates, by calendar year, age group, sex and comorbidity associated with the diagnosis of HLH (haematological malignancy, inflammatory rheumatological diseases or inflammatory bowel disease (IBD)). We modelled trends in incidence and the interactions between calendar year, age and associated comorbidity using Poisson regression.
FindingsThere were 1674 people with HLH diagnosed in England between 2003 and 2018. The incidence rate quadrupled (Incidence Rate Ratio (IRR) for 2018 compared to 2003: 3.88, 95% Confidence Interval (CI) 2.91 to 5.28), increasing 11% annually (adjusted IRR 1.11, 95% CI 1.09 to 1.12). There was a rising trend in all age groups except those aged less than 5 years. There was a transition across the age groups: greater increases among those aged 5 to 14 years in HLH associated with rheumatological disease/IBD compared to HLH associated with haematological malignancy, similar increases in HLH associated with both comorbidities for those aged 15-54, and greater increases in haematological malignancy-associated HLH for those 55 years and older.
InterpretationThe incidence of HLH in England has quadrupled between 2003 and 2018, increasing 11% annually. Substantial variation in the incidence occurred by age group and by HLH associated comorbidities with inflammatory rheumatological diseases or IBD associated HLH increasing more among the young and middle age groups, whereas in older age groups the largest increase was seen with haematological malignancy-associated HLH.
Evidence before this studyThere is a paucity of population-based data on the epidemiology of HLH worldwide. The available evidence relies mostly upon a collection of case series published in The Lancet in 2014, which described 2197 cases of HLH in adults reported in the literature to that point. Almost all of these were from tertiary referral specialist centres and/or described in small case series. The incidence of HLH has only been described in a few reports, mainly focused on children with primary HLH. No previous studies have been large enough to examine trends in incidence by age, sex, underlying risk factors and calendar time.
Added value of this studyThis study quantifies the incidence of diagnosed HLH for the first time in a nationwide manner for all age groups. It reports on 1674 patients with HLH from England and shows that there is substantial variation in the incidence by age group, associated disease and calendar time. The results imply reasons for the increase in HLH could be related to the increasing occurrence of haematological cancer, inflammatory rheumatological or bowel diseases and the treatments given for these conditions. This study has been carried out in partnership with the National Congenital Anomalies and Rare Diseases Registration Service and the methodology described can in future be applied to many rare diseases that as yet lack a way of quantifying crucial epidemiological metrics.
Implications of all the available evidenceThe incidence of HLH is rising rapidly in people older than 5 years of age. This could be due to an increase in the biologic, immunomodulation or immunosuppressive therapy being used in people with haematological cancer and inflammatory rheumatological and bowel diseases. Further work should focus on how to minimise the risk of HLH occurring, or to improve treatment of this often fatal disease among those who need treatment for an associated comorbidity. | hematology |
10.1101/2022.03.29.22272857 | Proteasome inhibition enhances myeloma oncolytic reovirus therapy by suppressing monocytic anti-viral immune responses | Reovirus is an oncolytic virus with natural tropism for cancer cells. We previously showed that reovirus intravenous administration in myeloma patients was safe, but disease control associated with viral replication in the cancer cells was not observed. Here we show that ex vivo proteasome inhibitors (PIs) potentiate reovirus replication in circulating classical monocytes, increasing viral delivery to myeloma cells. We found that the anti-viral signals in monocytes primarily rely on the NF-kB activation and that this effect is impaired by the addition of PIs. Conversely, PIs improved reovirus-induced monocyte and T cell activation against cancer cells. Based on these preclinical data, we conducted a phase 1b trial of the reovirus Pelareorep together with the PI carfilzomib in 13 heavily pretreated bortezomib-resistant MM patients. Objective responses associated with reovirus active replication in MM cells, T cell activation and monocytic expansion were noted in 70% of patients. | oncology |
10.1101/2022.04.04.22273334 | Effect of Heat inactivation and bulk lysis on Real-Time Reverse Transcription PCR Detection of the SARS-COV-2: An Experimental Study | The outbreak of severe acute respiratory syndrome coronavirus 2 quickly turned into a global pandemic. Real-time reverse transcription polymerase chain reaction, the "gold standard", is commonly used for diagnosis. Many coronaviruses are sensitive to heat and chemicals. Heat and chemical inactivation of samples is considered a possible method to reduce the risk of transmission, but the effect of heating and chemical treatment on detection of the virus is still unclear. Thus, this study aimed to investigate the effect of heat inactivation and chemical bulk lysis on virus detection. A laboratory-based experimental study was conducted at the Ethiopian Public Health Institute from August to November 2020 on samples referred to the laboratory for coronavirus disease 2019 testing. Tests were performed on eighty nasopharyngeal/oropharyngeal swab samples using the Abbott RealTime SARS-CoV-2 assay, a test for the qualitative detection of the virus in the sample. Data were analyzed and described by mean and standard deviation. Repeated-measures analysis of variance was used to assess mean differences in viral detection between the three temperatures and bulk lysis. Post-hoc analysis was employed to locate significant differences. A p-value of less than 0.05 was used to declare statistical significance. About 6.2% (5/80) of samples became negative after heat inactivation at 60{degrees}C and 8.7% (7/80) became negative after heat inactivation at 100{degrees}C. The cycle threshold values of heat-inactivated samples (at 60{degrees}C, at 100{degrees}C, and bulk lysis) were significantly different from those at 56{degrees}C.
The efficacy of heat-inactivation varies greatly depending on temperature and duration. Therefore, local validation and verification of heat-inactivation are essential. | infectious diseases |
10.1101/2022.03.30.22271928 | Multiplexed biosensor for point-of-care COVID-19 monitoring: CRISPR-powered unamplified RNA diagnostics and protein-based therapeutic drug management | In late 2019, SARS-CoV-2 rapidly spread to become a global pandemic; therefore, measures to attenuate chains of infection, such as high-throughput screenings and isolation of carriers, were taken. A prerequisite for a reasonable and democratic implementation of such measures, however, is the availability of sufficient testing opportunities (beyond reverse transcription PCR, the current gold standard). We therefore propose an electrochemical, microfluidic multiplexed biosensor in combination with CRISPR/Cas-powered assays for point-of-care nucleic acid testing. In this study, we simultaneously screen for and identify SARS-CoV-2 infections (Omicron variant) in clinical specimens (sample-to-result time: ~30 min), employing LbuCas13a, whilst bypassing reverse transcription as well as target amplification of the viral RNA, both of which are necessary for detection via PCR and multiple other methods. In addition, we demonstrate the feasibility of combining assays based on different classes of biomolecules, in this case protein-based antibiotic detection, on the same device. The programmability of the effector and multiplexing capacity (up to six analytes) of our platform, in combination with a miniaturized measurement setup, including a credit-card-sized near field communication (NFC) potentiostat and a microperistaltic pump, provide a promising on-site tool for identifying individuals infected with variants of concern and monitoring their disease progression alongside other potential biomarkers or medication clearance. | infectious diseases |
10.1101/2022.04.05.22273453 | Unsuppressed HIV infection impairs T cell responses to SARS-CoV-2 infection and abrogates T cell cross-recognition | HIV infection has been identified as one of the major risk factors for severe COVID-19 disease, but the mechanisms underpinning this susceptibility are still unclear. Here, we assessed the impact of HIV infection on the quality and epitope specificity of SARS-CoV-2 T cell responses in the first and second waves of the COVID-19 epidemic in South Africa. Flow cytometry was used to measure T cell responses following PBMC stimulation with SARS-CoV-2 peptide pools. Culture expansion was used to determine T cell immunodominance hierarchies and to assess potential SARS-CoV-2 escape from T cell recognition. HIV-seronegative individuals had significantly greater CD4+ and CD8+ T cell responses against the Spike protein compared to viremic PLWH. Absolute CD4 count correlated positively with SARS-CoV-2-specific CD4+ and CD8+ T cell responses (CD4 r=0.5, p=0.03; CD8 r=0.5, p=0.001), whereas T cell activation was negatively correlated with CD4+ T cell responses (CD4 r=-0.7, p=0.04). There was diminished T cell cross-recognition between the two waves, which was more pronounced in individuals with unsuppressed HIV infection. Importantly, we identify four mutations in the Beta variant that resulted in abrogation of T cell recognition. Together, we show that unsuppressed HIV infection markedly impairs T cell responses to SARS-CoV-2 infection and diminishes T cell cross-recognition. These findings may partly explain the increased susceptibility of PLWH to severe COVID-19 and also highlight their vulnerability to emerging SARS-CoV-2 variants of concern.
One sentence summary: Unsuppressed HIV infection is associated with muted SARS-CoV-2 T cell responses and poorer recognition of the Beta variant. | infectious diseases |
10.1101/2022.04.05.22273480 | Immunogenicity of BNT162b2 COVID-19 vaccine in New Zealand adults | Background: Very little is known about SARS-CoV-2 vaccine immune responses in New Zealand populations at greatest risk for serious COVID-19 disease.
Methods: This prospective cohort study assessed immunogenicity in BNT162b2 mRNA vaccine recipients in New Zealand without previous COVID-19, with enrichment for Māori, Pacific peoples, older adults ≥65 years of age, and those with co-morbidities. Serum samples were analysed at baseline and 28 days after the second dose for the presence of quantitative anti-S IgG by chemiluminescent microparticle immunoassay and for neutralizing capacity against Wuhan, Beta, Delta, and Omicron BA.1 strains using a surrogate viral neutralisation assay.
Results: 285 adults with a median age of 52 years were included. 55% were female, 30% were Māori, 28% were Pacific peoples, and 26% were ≥65 years of age. Obesity, cardiac and pulmonary disease, and diabetes were more common than in the general population. All participants received 2 doses of BNT162b2 vaccine. At 28 days after the second vaccination, 99.6% seroconverted to the vaccine, and anti-S IgG and neutralising antibody levels were high across gender and ethnic groups. IgG and neutralising responses declined with age. Lower responses were associated with age ≥75 and diabetes, but not BMI. The ability to neutralise the Omicron BA.1 variant in vitro was severely diminished but was maintained against other variants of concern.
Conclusions: Vaccine antibody responses to BNT162b2 were generally robust and consistent with international data in this COVID-19-naive cohort with representation of key populations at risk for COVID-19 morbidity. Subsequent data on response to boosters, durability of responses, and cellular immune responses should be assessed with attention to elderly adults and people with diabetes. | infectious diseases |
10.1101/2022.04.05.22273483 | SARS-CoV-2 Omicron sublineage BA.2 replaces BA.1.1: genomic surveillance in Japan from September 2021 to March 2022 | Objective: The newly emerging Omicron strain of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) is currently spreading worldwide. We aimed to analyze the genomic evolution of the shifting Omicron virus subtypes.
Methods: The study included 1,297 individuals diagnosed as SARS-CoV-2 positive by PCR test or antigen quantification test from September 2021 to March 2022. Samples were analyzed by whole genome sequencing (n=489) or TaqMan assay (n=808).
Results: After the outbreak of the SARS-CoV-2 Delta strain, the Omicron strain spread rapidly in Yamanashi, Japan. BA.1.1 was the predominant sublineage of the Omicron strain from January to mid-February 2022, but the number of cases of sublineage BA.2 began to increase after mid-February, and this sublineage was shown to have replaced BA.1.1 by the end of March 2022. We observed higher viral and antigen levels of sublineage BA.2 than of sublineage BA.1.1 in nasopharyngeal swab samples. However, no difference in viral load by patient age was apparent between sublineages BA.1.1 and BA.2.
Conclusions: A transition from sublineage BA.1.1 to sublineage BA.2 was clearly observed over approximately one month. Omicron sublineage BA.2 was found to be more transmissible owing to its higher viral load regardless of patient age. | infectious diseases |
10.1101/2022.04.05.22273474 | A comparison of regression discontinuity and propensity score matching to estimate the causal effects of statins using electronic health records | The authors have withdrawn this manuscript because the results are currently in the process of being updated. Therefore, the authors do not wish this work to be cited as reference for the project. If you have any questions, please contact the corresponding author. | epidemiology |
10.1101/2022.04.05.22273478 | Altered brain network topology in children with Auditory Processing Disorder: a resting-state multi-echo fMRI study | Children with auditory processing disorder (APD) experience hearing difficulties, particularly in the presence of competing sounds, despite having normal audiograms. There is considerable debate on whether APD symptoms originate from bottom-up (e.g., auditory sensory processing) and/or top-down processing (e.g., cognitive, language, memory). A related issue is that little is known about whether functional brain network topology is altered in APD. Therefore, we used resting-state functional magnetic resonance imaging data to investigate the functional brain network organization of 57 children aged 8 to 13 years, diagnosed with APD (n=28) and without hearing difficulties (healthy control, HC; n=29). We applied complex network analysis using graph theory to assess the whole-brain integration and segregation of functional networks and brain hub architecture. Our results showed that children with APD and HC have similar global network properties (i.e., averaged across all brain regions) and modular organization. Still, the APD group showed different hub architecture in default mode-ventral attention, somatomotor and frontoparietal-dorsal attention modules. At the nodal level (i.e., single brain regions), we observed decreased participation coefficient (PC, a measure quantifying the diversity of between-network connectivity) in auditory cortical regions in APD, including bilateral superior temporal gyrus and left middle temporal gyrus. Beyond auditory regions, PC was also decreased in APD in bilateral posterior temporo-occipital cortices, left intraparietal sulcus, and right posterior insular cortex. Correlation analysis suggested a positive association between PC in the left parahippocampal gyrus and the listening-in-spatialized-noise sentences task, in which APD children were engaged in auditory perception. In conclusion, our findings provide evidence of altered brain network organization in children with APD, specific to auditory networks, and shed new light on the neural systems underlying children's listening difficulties. | pediatrics |
10.1101/2022.03.29.22272361 | Sustained Minimal Residual Disease Negativity in Multiple Myeloma is Associated with Stool Butyrate and Healthier Plant-Based Diets | Sustained minimal residual disease (MRD) negativity is associated with long-term survival in multiple myeloma (MM). The gut microbiome is affected by diet, and in turn can modulate host immunity, for example through production of short-chain fatty acids including butyrate. We examined the relationship of dietary factors, stool metabolites, and microbial diversity with sustained MRD negativity in patients on lenalidomide maintenance. At 3 months, higher stool butyrate concentration (p=0.037), butyrate producers (p=0.025), and α-diversity (p=0.0035) were associated with sustained MRD negativity. Healthier dietary proteins (from seafood and plants) correlated with butyrate at 3 months (p=0.009) and sustained MRD negativity (p=0.05). Consumption of dietary flavonoids, plant nutrients with antioxidant effects, correlated with stool butyrate concentration (anthocyanidins p=0.01, flavones p=0.01, and flavanols p=0.02). This is the first study to demonstrate an association between a plant-based dietary pattern, stool butyrate production, and sustained MRD negativity in MM, providing rationale to evaluate a prospective dietary intervention.
Statement of Significance: We demonstrate an association between diet, the gut microbiome, and sustained MRD negativity in MM. A healthy diet, with adequate plant and seafood protein and containing flavonoids, associates with stool diversity, butyrate production, and sustained MRD negativity. These findings suggest dietary modification should be studied to enhance myeloma control.
Key Points: (1) In MM patients on lenalidomide maintenance, stool butyrate concentration at 3 months was associated with higher rates of MRD negativity at 12 months. (2) Increased seafood and plant proteins, dietary flavonoids, and diversity of dietary flavonoids correlated with stool butyrate concentrations. | oncology |
10.1101/2022.04.04.22271362 | Loss of CHGA protein as a potential biomarker for colon cancer diagnosis: a study on biomarker discovery by machine learning and confirmation by immunohistochemistry in colorectal cancer tissue microarrays | Background: Incidences of colon and rectal cancers have increased constantly and in parallel; however, mortality has decreased slightly over the last decades due to early diagnosis and better therapies. Precise early diagnosis of colorectal cancer has been a great challenge in order to improve the chances of choosing the best cancer therapies.
Patients and methods: We started by searching for protein biomarkers based on our colorectal biomarker database (CBD), identifying differentially expressed genes (DEGs) and non-DEGs from RNA sequencing (RNA-seq) data, and further predicted new biomarkers on protein-protein interaction (PPI) networks by machine learning (ML) methods. The best-selected biomarker was further verified by receiver operating characteristic (ROC) tests on microarray and RNA-seq data, biological network and functional analysis, and immunohistochemistry in tissue arrays from 198 specimens.
Results: Twelve proteins (MYO5A, CHGA, MAPK13, VDAC1, CCNA2, YWHAZ, CDK5, GNB3, CAMK2G, MAPK10, SDC2, and ADCY5) were predicted by ML as candidate colon cancer diagnostic biomarkers. These predicted biomarkers showed close relationships with reported biomarkers on the PPI network and shared some pathways. ROC tests showed that CHGA protein had the best diagnostic accuracy (AUC=0.9 in microarray data and 0.995 in RNA-seq data) among these candidate protein biomarkers. Furthermore, CHGA performed well in the immunohistochemistry test.
Conclusions: Expression of CHGA protein in the normal colorectal mucosa was lost in colorectal cancers, and the loss of CHGA protein might be a potential candidate biomarker for colon, or even colorectal, cancer diagnosis.
Topic: CHGA expression, colon cancer diagnosis
Key Message: The results of this study suggest that loss of CHGA expression from the normal colon and adjacent mucosa to colon cancer may be used as a valuable biomarker for early diagnosis of colon adenocarcinoma. | oncology |
10.1101/2022.04.05.22273294 | Impulse dispersion of aerosols during playing the recorder and evaluation of safety measures | Introduction: Group musical activities using wind instruments have been restricted during the COVID-19 pandemic due to a suspected higher risk of virus transmission. It was presumed that the aerosols exhaled through the tubes while playing would be ejected over larger distances and spread into the room due to jet stream effects. In particular, the soprano recorder is widely used as an instrument in school classes and for beginners of all age groups in their musical education, as well as in contexts of leisure activities and in professional concert performances. Understanding the aerosol impulse dispersion characteristics of playing the soprano recorder could assist with the establishment of concepts for safe music-making.
Methods: Five adult professionally trained soprano recorder players (4 female, 1 male) played four bars of the main theme of L. van Beethoven's "Ode to Joy" in low and in high octaves, as well as with 3 different potential protection devices in the high octave. For comparison, they spoke the corresponding text by F. Schiller. Before each task, they inhaled 0.5 L of vapor from an e-cigarette filled with base liquid. The vapor cloud escaping during speaking or playing was recorded by cameras, and its spread was measured as a function of time in the three spatial dimensions. The potential safety devices were rated for practicability with a questionnaire, and their influence on the sound was compared by generating a long-term average spectrum from the audio data.
Results: When playing in the high octave, at the end of the task the clouds showed a median distance of 1.06 m to the front and 0.57 m diameter laterally (maxima: x: 1.35 m and y: 0.97 m). The clouds' expansion values when playing the recorder, with and without safety measures, were mostly lower compared to the ordinary, raised speaking voice of the same subjects. The safety devices which covered the instrument did not show clear advantages and were rated as impractical by the subjects. The most effective reduction of the cloud was reached when playing into a suction funnel.
Conclusion: The aerosol dispersion characteristics of soprano recorders seem comparable to clarinets. The tested safety devices which covered holes of the instrument did not show clear benefits. | otolaryngology |
10.1101/2022.04.04.22273309 | Bi-allelic loss-of-function variants in PPFIBP1 cause a neurodevelopmental disorder with microcephaly, epilepsy and periventricular calcifications | PPFIBP1 encodes the liprin-β1 protein, which has been shown to play a role in neuronal outgrowth and synapse formation in Drosophila melanogaster. By exome sequencing, we detected nine ultra-rare homozygous loss-of-function variants in 14 individuals from 10 unrelated families. The individuals presented with moderate to profound developmental delay, often refractory early-onset epilepsy, and progressive microcephaly. Further common clinical findings included muscular hypertonia, spasticity, failure to thrive and short stature, feeding difficulties, impaired hearing and vision, and congenital heart defects. Neuroimaging revealed abnormalities of brain morphology with leukoencephalopathy, cortical abnormalities, and intracranial periventricular calcifications as major features. In a fetus with intracranial calcifications, we identified a rare homozygous missense variant that by structural analysis was predicted to disturb the topology of the SAM-domain region that is essential for protein-protein interaction. For further insight into the effects of PPFIBP1 loss of function, we performed automated behavioural phenotyping of a Caenorhabditis elegans PPFIBP1/hlb-1 knockout model, which revealed defects in spontaneous and light-induced behaviour and confirmed resistance to the acetylcholinesterase inhibitor aldicarb, suggesting a defect in the neuronal presynaptic zone. In conclusion, we present bi-allelic loss-of-function variants in PPFIBP1 as a novel cause of an autosomal recessive neurodevelopmental disorder. | genetic and genomic medicine |
10.1101/2022.04.05.22273371 | Antibody responses to SARS-CoV-2 vaccination in patients with acute leukaemia and high risk MDS on active anti-cancer therapies | Patients with haematological malignancies, such as acute leukaemia and high-risk MDS (HR-MDS), have significantly increased mortality and morbidity from COVID-19. However, vaccine efficacy in these patients and the impact of systemic anti-cancer therapy (SACT) on vaccine response remain to be fully established. SARS-CoV-2 antibody responses in 53 patients with ALL, AML or HR-MDS receiving SACT were characterised following two doses of either BNT162b2 or ChAdOx1 nCoV-19. All patients were tested for anti-S antibodies after 2 doses, 60% after the first dose, and anti-N antibody testing was performed on 46 patients (87%). Seropositivity rates after 2 vaccine doses were 95% in AML/HR-MDS patients and 79% in ALL. After stratification by prior SARS-CoV-2 infection, naive patients with AML/HR-MDS had higher seroconversion rates and median anti-S antibody titres compared to those with ALL (median 291 U/mL versus 5.06 U/mL), and significant increases in anti-S titres with consecutive vaccine doses, not seen in ALL. No difference in serological response was seen between patients receiving intensive chemotherapy or non-intensive therapies (HMA), but significantly reduced titres were present in AML/HR-MDS patients who received venetoclax-based regimens compared to other therapies. All ALL patients received intensive chemotherapy, with no further impact of anti-CD20 immunotherapy on serological response. Understanding the impact of disease subtypes and therapy on vaccine response is essential to enable decisions on modifying or delaying treatment in the context of either SARS-CoV-2 infection or vaccination. | hematology |
10.1101/2022.04.06.22273503 | High incidence of sotrovimab resistance and viral persistence after treatment of immunocompromised patients infected with the SARS-CoV-2 Omicron variant | Background: Sotrovimab is a monoclonal antibody that neutralizes SARS-CoV-2 by binding to a highly conserved epitope in the receptor binding domain. It retains activity against the Omicron BA.1 variant and is used to treat immunocompromised patients, as they are at increased risk for a severe outcome of COVID-19.
Methods: We studied viral evolution in 47 immunocompromised patients infected with Omicron BA.1 or BA.2 and treated with sotrovimab. SARS-CoV-2 PCR was performed at baseline and weekly thereafter until the Ct-value was ≥30. All RNA samples were sequenced to determine the variant and the occurrence of mutations, in particular in the Spike protein, after treatment.
Results: Twenty-four (51%) of the 47 patients were male, and their median age was 63 years. Thirty-one (66%) had undergone a solid organ transplantation and 13 (28%) had received prior B-cell depleting therapy. Despite a history of vaccination, 24 of the 30 patients with available data on anti-SARS-CoV-2 IgG Spike antibodies prior to treatment with sotrovimab had very low or no antibodies. Median time to viral clearance (Ct-value ≥30) after treatment was 15 days (IQR 7-22). However, viral RNA with low Ct-values was continuously detected for at least 28 days after treatment in four patients infected with BA.1. Mutations in the Spike protein at position 337 or 340 were observed in all four patients. Similar mutations were also found after treatment of two patients with a BA.2 infection, but both cleared the virus within two weeks. Thus, following treatment with sotrovimab, Spike mutations associated with reduced in vitro susceptibility were detected in 6 of 47 (13%) patients.
Conclusion: Viral evolution towards resistance against sotrovimab can explain treatment failure in most immunocompromised patients, and these patients can remain infectious after treatment. Therefore, documenting viral clearance after treatment is recommended to avoid these patients unintentionally becoming a source of new, sotrovimab-resistant variants. Research on direct-acting antivirals and possibly combination therapy for the treatment of COVID-19 in immunocompromised patients is needed. | infectious diseases |
10.1101/2022.04.05.22273450 | Predictors for Reactogenicity and Humoral Immunity to SARS-CoV-2 Following Infection and mRNA Vaccination: A Regularized Mixed-Effects Modelling Approach | Background: The influence of pre-existing humoral immunity, inter-individual demographic factors, and vaccine-associated reactogenicity on immunogenicity following COVID-19 vaccination remains poorly understood.
Methods: Ten-fold cross-validated least absolute shrinkage and selection operator (LASSO) and linear mixed-effects models were used to evaluate symptoms experienced during natural infection and following SARS-CoV-2 mRNA vaccination, along with demographics, as predictors of antibody (AB) responses in COVID-19-positive participants in a longitudinal cohort study.
Results: In previously infected individuals, AB were more durable and robust following vaccination than after natural infection alone. Higher AB were associated with experiencing dyspnea during natural infection, as was the total number of symptoms reported during the COVID-19 disease course. Both local and systemic symptoms following the 1st and 2nd doses of SARS-CoV-2 mRNA vaccines were predictive of higher AB after vaccination, as were the demographic factors of age and Hispanic ethnicity. Lastly, there was a significant temporal relationship between AB and days since infection or vaccination.
Conclusion: Vaccination in COVID-19-positive individuals ensures a more robust immune response. Experiencing systemic and local symptoms post-vaccine is suggestive of higher AB, which may confer greater protection. Age and Hispanic ethnicity are predictive of higher AB. | infectious diseases |
10.1101/2022.04.04.22273320 | Monitoring of SARS-CoV-2 variant dynamics in wastewater by digital RT-PCR: from Alpha to Omicron BA.2 VOC | Throughout the COVID-19 pandemic, new variants have continuously emerged and spread in populations. Among these, variants of concern (VOC) have been the main culprits of successive epidemic waves, due to their transmissibility, pathogenicity or ability to escape the immune response. Quantification of SARS-CoV-2 genomes in raw wastewater is a reliable approach, well described and widely deployed worldwide, to monitor the spread of SARS-CoV-2 in human populations connected to sewage systems. Discrimination of VOCs in wastewater is also a major issue and can be achieved by genome sequencing or by detection of specific mutations suggesting the presence of VOCs. This study aimed to date the emergence of these VOCs (from Alpha to Omicron BA.2) by monitoring wastewater from the greater Paris area, France, but also to model the propagation dynamics of these VOCs and to characterize the replacement kinetics of the majority populations. These dynamics were compared to various individual-centered public health data, such as regional incidence and the proportions of VOCs identified by sequencing of strains isolated from patients. The viral dynamics in wastewater highlighted the impact of the vaccination strategy on viral circulation in human populations and also suggested its potential effect on the selection of the variants most likely to propagate in immunized populations. Normalization of concentrations to capture population movements appeared statistically more reliable using variations in local drinking water consumption rather than PMMoV concentrations, because PMMoV fecal shedding was subject to variability and was not sufficiently relevant in this study. The dynamics of viral spread were observed earlier in raw wastewater (about 13 days earlier for the wave related to the Omicron VOC) than in the regional incidence, alerting to a possible risk of decorrelation between incidence and actual virus circulation, probably resulting from a lower severity of infection in vaccinated populations. | infectious diseases |
10.1101/2022.04.05.22273434 | Analysis of immunization, adverse events, and efficacy of a fourth dose of BNT162b2 vaccine | Importance: Scarce information exists concerning seroconversion and adverse events following immunization (AEFI) after a fourth dose of a SARS-CoV-2 vaccine.
Objective: To correlate the magnitude of the antibody response to vaccination with previous clinical conditions and AEFI of the fourth dose of BNT162b2 mRNA.
Design: Observational study in which SARS-CoV-2 spike 1-2 IgG antibody titers were measured 21-28 days after the first and second doses, three months after the second dose, 1-7 days after the third dose, before the fourth dose, and 21-28 days after the fourth dose of BNT162b2 mRNA.
Setting: The study was conducted on healthcare workers of a private hospital in Northern Mexico.
Participants: Inclusion criteria were healthcare workers of either gender and any age who planned to complete the immunization regimen. The exclusion criterion was receipt of any SARS-CoV-2 vaccine prior to study entry.
Intervention: Subjects received four doses of the BNT162b2 mRNA vaccine.
Main Outcomes and Measures: Anti-S1 and anti-S2 IgG antibodies against SARS-CoV-2 in plasma samples were measured with a chemiluminescence immunoassay developed by DiaSorin.
Results: We recruited 112 subjects [aged 43 (SD 9) years, 74% women]. After the first dose, subjects had a median (IQR) IgG level of 122 (1904) AU/ml, which increased to 1875 (2095) after the second dose, 3020 (2330) after the third dose, and 4230 (3393) at 21-28 days after the fourth dose (p<0.01). The number (%) of subjects with any AEFI was 90 (80.4), 89 (79), 65 (58), and 69 (61.5) after the first, second, third, and fourth doses, respectively (p<0.001). After the fourth dose, the most frequent AEFI was pain at the injection site (87%). Fever was slightly more frequent after the third and fourth doses, with 9 (13.8%) and 8 (11.4%) cases, respectively, and adenopathy was more frequent after the fourth dose [11 (15.7%) cases]. There was a correlation between AEFI after the fourth dose and both gender and antibody levels (p<0.05). The highest proportion of AEFI after the fourth dose were considered mild. During the Omicron outbreak, 6 (5.3%) subjects had mild SARS-CoV-2 infection 8-28 days after the fourth dose.
Conclusions and Relevance: The fourth dose of BNT162b2 mRNA increases S1/S2 IgG 33.6-fold, with mild adverse events.
Registration number: NCT05228912
Key pointsO_ST_ABSQuestionC_ST_ABSWhat is the magnitude of antibody response to vaccination and adverse events after immunization (AEFI) of a fourth dose of BNT162b2 mRNA?
FindingsThis cohort included 112 healthcare workers. We measured S1/S2 IgG vs. SARS-CoV-2 after the first, second, third and fourth dose. Compared to the first dose, antibodies increased 33.6 times the antibody levels after the fourth dose. We found minimal to moderate adverse events. The change in antibodies correlated with AEFI. During the Omicron outbreak 6 (5.3%) had mild SARS-CoV-2.
MeaningA fourth dose of BNT162b2 mRNA increases S1/S2 IgG with mild to moderate adverse events. | infectious diseases |
10.1101/2022.04.05.22273469 | Human Papillomavirus Infection Rate by Genotype and Vaccination Rates in Canada: Canadian Health Measures Survey 2009 to 2013 | BackgroundAn infection with certain HPV genotypes can lead to cancer or genital warts. HPV can be detected with PCR-based tests, and some genotypes can be prevented by vaccines. However, since the infection rates of various HPV genotypes have not been well reported, the present study aims to provide this information.
MethodsThe Canadian Health Measures Survey (CHMS) is an ongoing biannual national survey. Between 2009 and 2011, it sampled a nationally representative sample of females aged 14 to 59 years to determine the infection rates of 46 HPV genotypes. Females aged 9 to 29 years and 9 to 59 years were asked whether they received HPV vaccines between 2009 to 2011 (cycle 2) and 2012 to 2013 (cycle 3), respectively. The reported infection rates and vaccination proportions were weighted and adjusted for the survey design.
ResultsAmong the estimated 10,592,968 females aged 14 to 59 years at cycle 2, the HPV genotypes with the highest infection rates were 16, 62, 74, and 54, with rates of 3.42% (95% CI = 1.67% to 5.17%), 2.14% (95% CI = 0.68% to 3.59%), 2.10% (95% CI = 0.51% to 3.69%), and 2.04% (95% CI = 0.38% to 3.70%), respectively. There were an estimated 6,569,100 and 11,603,752 females aged 9 to 29 and 9 to 59 years at cycles 2 and 3, respectively. The proportions receiving an HPV vaccine were 13.55% (11.18% to 15.92%) and 12.30% (9.80% to 14.79%), respectively. The estimated numbers of females that received HPV vaccines were 890,197 and 1,427,000, respectively.
ConclusionCanada is one of the few countries that conduct national surveys to determine HPV infection rates by genotype, beyond surveillance of carcinogenic genotypes alone. Our study found discrepancies among the HPV genotypes with the most common infections, those detectable by PCR tests, those that are carcinogenic, and those preventable by vaccines. For example, 5 of the 7 genotypes (42, 54, 62, 66, and 74) with infection rates of more than 1% cannot be detected by PCR tests and are not targeted by vaccines. HPV 51 is carcinogenic, associated with genital warts, and detectable by PCR tests, but it is not targeted by vaccines. We recommend better alignment of the genotypes targeted by HPV tests and vaccines with the genotypes with the highest infection rates in Canada. | infectious diseases |
10.1101/2022.04.06.22273497 | Prophylactic and reactive vaccination strategies for healthcare workers against MERS-CoV | Several vaccine candidates are in development against Middle East respiratory syndrome-related coronavirus (MERS-CoV), which remains a major public health concern. Using individual-level data on the 2013-2014 Kingdom of Saudi Arabia epidemic, we employ counterfactual analysis on inferred transmission trees ("who-infected-whom") to assess potential vaccine impact. We investigate the conditions under which prophylactic "proactive" campaigns would outperform "reactive" campaigns (i.e. vaccinating either before or in response to the next outbreak), focussing on healthcare workers. Spatial scale is crucial: if vaccinating healthcare workers in response to outbreaks at their hospital only, proactive campaigns perform better, unless efficacy has waned significantly. However, campaigns that react at regional or national level consistently outperform proactive campaigns. Measures targeting the animal reservoir reduce transmission linearly, albeit with wide uncertainty. Substantial reduction of MERS-CoV morbidity and mortality is possible when vaccinating healthcare workers, underlining the need for at-risk countries to stockpile vaccines when available. | epidemiology |
10.1101/2022.04.05.22273458 | Prepandemic prevalence estimates of fatty liver disease and fibrosis defined by liver elastography in the United States | Background & AimsFatty liver disease is a growing public health burden with serious consequences. We estimated prepandemic prevalence of fatty liver disease determined by transient elastography assessed hepatic steatosis and fibrosis, and examined associations with lifestyle and other factors in a United States population sample.
MethodsLiver stiffness and controlled attenuation parameter (CAP) were assessed on 7,923 non-Hispanic white, non-Hispanic black, non-Hispanic Asian, and Hispanic men and women aged 20 years and over in the National Health and Nutrition Examination Survey (NHANES) 2017-March 2020 prepandemic data.
ResultsThe prevalence of fatty liver disease estimated by CAP >300 dB/m was 28.8% and of fibrosis (liver stiffness >8 kPa) was 10.4%. Only 7.2% of participants with fatty liver disease and 10.9% with fibrosis reported being told by a health care provider that they had liver disease. In addition to known risk factors such as metabolic factors and ALT, persons with fatty liver disease were less likely to meet physical activity guidelines, more likely to be sedentary for 12 or more hours a day, and reported a less healthy diet. Persons with fibrosis were less likely to have a college degree and reported a less healthy diet.
ConclusionIn the U.S. population, most persons with fatty liver disease are unaware of their condition. Although physical activity and dietary modifications might reduce the fatty liver disease burden, the COVID pandemic has been less favorable for lifestyle changes. There is an urgent need for fatty liver disease management in high-risk individuals using transient elastography or other noninvasive methods to intervene in disease progression. | gastroenterology |
10.1101/2022.04.06.22273080 | Serious underlying medical conditions and COVID-19 vaccine hesitancy. | ObjectiveTo examine vaccine uptake, hesitancy and explanatory factors amongst people with serious and/or chronic health conditions, including the impact of underlying disease on attitudes to vaccination.
DesignCross-sectional survey.
SettingTen Australian health services.
Participants4683 patients (3560 cancer, 842 diabetes and 281 multiple sclerosis) receiving care at the health services participated in the 42-item survey between June 30 and October 5, 2021.
Main outcome measuresSociodemographic and disease-related characteristics, COVID-19 vaccine uptake, and the scores of three validated scales which measured vaccine hesitancy and vaccine-related beliefs generally and specific to the participants' disease, including the Oxford COVID-19 Vaccine Hesitancy Scale, the Oxford COVID-19 Vaccine Confidence and Complacency Scale and the Disease Influenced Vaccine Acceptance Scale. Multivariable logistic regression was used to determine the associations between scale scores and vaccine uptake.
ResultsOf all participants, 81.5% reported having at least one COVID-19 vaccine. Unvaccinated status was associated with younger age, female sex, lower education and income, English as a second language, and residence in regional areas (all p<0.05). Unvaccinated participants were more likely to report greater vaccine hesitancy and more negative perceptions toward vaccines (all p<0.05). Disease-related vaccine concerns were associated with unvaccinated status and hesitancy, including greater complacency about COVID-19 infection, and concerns relating to vaccine efficacy and impact on their disease and/or treatment (all p<0.05).
ConclusionsDisease-specific concerns impact COVID-19 vaccine-related behaviours and beliefs in people with serious and/or chronic health conditions. This highlights the need to develop targeted strategies and education about COVID-19 vaccination to support medically vulnerable populations and health professionals.
Trial registrationACTRN12621001467820 | public and global health |
10.1101/2022.04.05.22273468 | Pre-Diagnostic Cognitive and Functional Impairment in Multiple Sporadic Neurodegenerative Diseases | INTRODUCTIONThe pathophysiological processes of neurodegenerative diseases begin years before diagnosis. However, pre-diagnostic changes in cognition and physical function are poorly understood, especially in sporadic neurodegenerative disease.
METHODSUK Biobank data were extracted. Cognitive and functional measures in individuals who subsequently developed Alzheimer's Disease, Parkinson's Disease, Frontotemporal Dementia, Progressive Supranuclear Palsy, Dementia with Lewy Bodies, or Multiple System Atrophy were compared against those without neurodegenerative diagnoses. The same measures were regressed against time to diagnosis, after adjusting for the effects of age.
RESULTSThere was evidence of pre-diagnostic cognitive impairment and decline over time, particularly in Alzheimer's Disease. Pre-diagnostic functional impairment and decline were observed in multiple diseases.
DISCUSSIONThe scale and longitudinal follow-up of UK Biobank participants provides evidence for cognitive and functional decline years before symptoms become obvious in multiple neurodegenerative diseases. Identifying pre-diagnostic functional and cognitive changes could improve selection for preventive and early disease-modifying treatment trials.
Research in ContextSystematic reviewStudies of genetic dementia cohorts provide evidence for pre-diagnostic changes in disease biomarkers and cognitive function in several genetic neurodegenerative diseases. The pre-diagnostic phase of sporadic neurodegenerative disease has been less well studied. It is unclear whether early functional or cognitive changes are detectable in sporadic neurodegenerative disease.
InterpretationWe have established an approach to identify cognitive and functional pre-diagnostic markers of neurodegenerative disease years before diagnosis. We found disease-relevant patterns of pre-diagnostic cognitive and functional impairment, and observed a pre-diagnostic linear decline in a number of cognitive and functional measures.
Future DirectionsOur approach can form the basis for pre-diagnostic cognitive and functional screening to recruit into trials of disease prevention and disease modifying therapies for neurodegenerative diseases. A screening panel based on cognition and function could be followed by disease-specific biomarkers to further improve risk stratification. | neurology |
10.1101/2022.04.06.22273512 | Pandemic-EBT and grab-and-go school Meals: Costs, reach, and benefits of two approaches to keep children fed during school closures due to COVID-19 | ImportanceSchool meals improve nutrition and health for millions of U.S. children. School closures due to the COVID-19 pandemic disrupted children's access to school meals. Two policy approaches were activated to replace missed meals for children from low-income families. The Pandemic Electronic Benefit Transfer (P-EBT) program provided the cash value of missed meals directly to families on debit-like cards to use for making food purchases. The grab-and-go meals program offered prepared meals from school kitchens at community distribution points. The effectiveness of these programs at reaching those who needed them and their costs were unknown.
ObjectiveTo determine how many eligible children were reached by P-EBT and grab-and-go meals, how many meals or benefits were received, and how much each program cost to implement.
DesignCross-sectional study, Spring 2020.
SettingNational.
ParticipantsAll children <19 years old and children age 6-18 eligible to receive free or reduced price meals (FRPM).
Exposure(s)Receipt of P-EBT or grab-and-go school meals.
Main Outcome(s) and Measure(s)Percentage of children reached by P-EBT and grab-and-go school meals; average benefit received per recipient; and average cost, including implementation costs and time costs to families, per meal distributed.
ResultsGrab-and-go school meals reached about 10.5 million children (17% of all US children), most of whom were FRPM-eligible students. Among FRPM-eligible students only, grab-and-go meals reached 27%, compared to 89% reached by P-EBT. Among those receiving benefits, the average monthly benefit was larger for grab-and-go school meals ($148) relative to P-EBT ($110). P-EBT had lower costs per meal delivered - $6.51 - compared to $8.20 for grab-and-go school meals. P-EBT had lower public sector implementation costs but higher uncompensated time costs to families (e.g., preparation time for meals) compared to grab-and-go school meals.
Conclusions and RelevanceBoth programs supported children's access to food when schools were closed and in complementary ways. P-EBT is an efficient and effective policy option to support food access for eligible children when school is out.
KEY POINTSQuestionWhat were the operating costs, costs and benefits to families, and proportion of eligible children who received benefits of two programs aimed at replacing school meals missed when schools were closed due to COVID-19?
FindingsIn this cross sectional analysis, we found that the Pandemic-Electronic Benefit Transfer program, in which state agencies sent debit cards loaded with the cash value of missed school meals directly to families, reached nearly all low income students (89%) and cost relatively little per meal provided. In comparison, grab-and-go school meals, in which school food service departments provided prepared meals for offsite consumption, reached 27% of low income children and was associated with larger per meal costs.
MeaningDuring times when children cannot access school meals, state and federal agencies should support cost-efficient programs for schools to distribute prepared meals and activate programs like P-EBT to efficiently reach eligible children. | nutrition |
10.1101/2022.04.04.22273058 | Delta/Omicron and Omicron/BA.2 co-infections occurring in Immunocompromised hosts | Concomitant infection with multiple SARS-CoV-2 variants has become an increasing concern, as this scenario increases the likelihood of recombinant variants. Co-infection with SARS-CoV-2 variants is difficult to detect by whole genome sequencing approaches, but genotyping methods facilitate detection. We describe 2 cases of Delta/Omicron and 2 cases of Omicron sublineage BA.1/BA.2 co-infection as detected by a multiplex genotyping fragment analysis method. Findings were confirmed by whole genome sequencing. Review of the patient characteristics revealed co-morbidities and conditions which weaken the immune system and may make these patients more susceptible to harboring SARS-CoV-2 variant co-infections. | genetic and genomic medicine |
10.1101/2022.04.03.22273345 | Racial/Ethnic, Biomedical, and Sociodemographic Risk Factors for COVID-19 Positivity and Hospitalization in the San Francisco Bay Area | BACKGROUNDThe COVID-19 pandemic has uncovered clinically meaningful racial/ethnic disparities in COVID-19-related health outcomes. Current understanding of the basis for such an observation remains incomplete, with both biomedical and social/contextual variables proposed as potential factors.
PURPOSEUsing a logistic regression model, we examined the relative contributions of race/ethnicity, biomedical, and socioeconomic factors to COVID-19 test positivity and hospitalization rates in a large academic health care system in the San Francisco Bay Area prior to the advent of vaccination and other pharmaceutical interventions for COVID-19.
RESULTSWhereas socioeconomic factors, particularly those contributing to increased social vulnerability, were associated with test positivity for COVID-19, biomedical factors and disease co-morbidities were the major factors associated with increased risk of COVID-19 hospitalization. Hispanic individuals had a higher rate of COVID-19 positivity, while Asian persons had higher rates of COVID-19 hospitalization. Diabetes was an important risk factor for COVID-19 hospitalization, particularly among Asian patients, for whom diabetes tended to be more frequently undiagnosed and higher in severity.
CONCLUSIONSWe observed that biomedical, racial/ethnic, and socioeconomic factors all contributed in varying but distinct ways to COVID-19 test positivity and hospitalization rates in a large, multiracial, socioeconomically diverse metropolitan area of the United States. The impact of a number of these factors differed according to race/ethnicity. Improving overall COVID-19 health outcomes and addressing racial and ethnic disparities in COVID-19 outcomes will likely require a comprehensive approach that incorporates strategies that target both individual-specific and group contextual factors. | infectious diseases |
10.1101/2022.04.06.22272747 | Longitudinal lung function assessment of patients hospitalised with COVID-19 using 1H and 129Xe lung MRI | IntroductionMicrovascular abnormalities and impaired 129Xe gas transfer have been observed in patients with COVID-19. The progression of pathophysiological pulmonary changes during the post-acute period in these patients remains unclear.
MethodsPatients who were hospitalised due to COVID-19 pneumonia underwent a pulmonary 1H and 129Xe MRI protocol at 6, 12, 25 and 50 weeks after hospital admission. The imaging protocol included: ultra-short echo time, dynamic contrast enhanced lung perfusion, 129Xe lung ventilation, 129Xe diffusion weighted and 129Xe 3D spectroscopic imaging of gas exchange.
ResultsNine patients were recruited and underwent MRI at 6 (n=9), 12 (n=9), 25 (n=6) and 50 (n=3) weeks after hospital admission. At 6 weeks after hospital admission, patients demonstrated impaired 129Xe gas transfer (RBC:M) but normal lung microstructure (ADC, LmD). Minor ventilation abnormalities present in four patients were largely resolved in the 6-25 week period. At 12-week follow-up, all patients with lung perfusion data available (n=6) showed an increase in both pulmonary blood volume and flow compared to 6 weeks, though this was not statistically significant. At 12- and 25-week follow-up, significant improvements in 129Xe gas transfer were observed compared to the 6-week examinations; however, 129Xe gas transfer remained abnormally low.
ConclusionsThis study demonstrates that multinuclear MRI is sensitive to functional pulmonary changes in the follow up of patients who were hospitalised with COVID-19. Persistent impairment of xenon transfer may represent a physiological mechanism underlying ongoing symptoms in some patients and may indicate damage to the pulmonary microcirculation. | radiology and imaging |
10.1101/2022.04.05.22273393 | CHANGES IN LIFE EXPECTANCY BETWEEN 2019 AND 2021: UNITED STATES AND 19 PEER COUNTRIES | BACKGROUNDPrior studies reported large decreases in US life expectancy during 2020 as a result of the COVID-19 pandemic, disproportionately affecting Hispanic and non-Hispanic Black populations and vastly exceeding the average change in life expectancy in other high-income countries. Life expectancy estimates for 2021 have not been reported. This study estimated changes in life expectancy during 2019-2021 in the US population, in three US racial/ethnic groups, and in 19 peer countries.
METHODSUS and peer country death data for 2019-2021 were obtained from the National Center for Health Statistics, the Human Mortality Database, and overseas statistical agencies. The 19 peer countries included Austria, Belgium, Denmark, England and Wales, Finland, France, Germany, Israel, Italy, Netherlands, New Zealand, Northern Ireland, Norway, Portugal, Scotland, South Korea, Spain, Sweden, and Switzerland. Life expectancy was calculated for 2019 and 2020 and estimated for 2021 using a previously validated modeling method.
RESULTSUS life expectancy decreased from 78.86 years in 2019 to 76.99 years in 2020 and 76.47 years in 2021, a net loss of 2.39 years. In contrast, peer countries averaged a smaller decrease in life expectancy between 2019 and 2020 (0.57 years) and a 0.27-year increase between 2020 and 2021, widening the gap in life expectancy between the United States and peer countries to more than five years. The decrease in US life expectancy was highly racialized: whereas the largest decreases in 2020 occurred among Hispanic and non-Hispanic Black populations, in 2021 only the non-Hispanic White population experienced a decrease in life expectancy.
DISCUSSIONThe US mortality experience during 2020 and 2021 was more severe than in peer countries, deepening a US disadvantage in health and survival that has been building for decades. Over the two-year period between 2019 and 2021, US Hispanic and non-Hispanic Black populations experienced the largest losses in life expectancy, reflecting the legacy of systemic racism and inadequacies in the US handling of the pandemic. Reasons for the crossover in racialized outcomes between 2020 and 2021, in which life expectancy decreased only in the non-Hispanic White population, could have multiple explanations. | public and global health |
10.1101/2022.04.05.22273415 | Diabetes Management Beliefs among Adults Diagnosed with Type 2 Diabetes in Iran: A Theory Informed Approach from a Theory of Planned Behavior Framework | ObjectiveThe current study was informed by the belief basis of Ajzen's (1991) Theory of Planned Behavior (TPB) to identify the important behavioral (advantages and disadvantages), normative (important referents) and control (barriers and facilitators) beliefs associated with the key recommended prevention and management behaviors for adults in Iran diagnosed with Type 2 diabetes (T2D).
MethodsA cross-sectional study was conducted. A total of 115 adults diagnosed with T2D completed a questionnaire examining behavioral, normative and control beliefs and intention in relation to three diabetes management behaviors: low-fat food consumption, carbohydrate counting and physical activity. For each behavior, intention was the dependent variable and beliefs were the independent variables. Analyses involved three multivariate one-way analyses of variance (MANOVAs).
ResultsThe findings for carbohydrate counting and physical activity suggested behavioral and control beliefs as differentiating high from low intenders to perform the behavior. For carbohydrate counting, behavioral beliefs such as weight control, improving one's health, feeling good and controlling diabetic complications differed significantly between low and high intenders. For physical activity, feeling good, controlling blood sugar and tiredness were among behavioral beliefs differentiating low and high intenders. Medical advice from professionals and greater knowledge were identified as facilitators of carbohydrate counting. High costs were identified as a key barrier preventing individuals from engaging in physical activity. The spouse was the single significant referent influencing carbohydrate counting.
Conclusions & ImplicationsIdentifying the underlying beliefs of key diabetes management behaviors can assist in the design of tailored educational interventions for individuals with T2D. | primary care research |
10.1101/2022.04.05.22272397 | Have deaths of despair risen during the COVID-19 pandemic? A rapid systematic review | ObjectiveTo systematically review the literature on the impact of the COVID-19 pandemic on deaths of despair (suicide, overdoses and drug-related liver diseases).
MethodsFive electronic databases were searched using search terms on deaths of despair and COVID-19.
ResultsThe review of 70 publications included indicates that there is no change or a decline in the suicide rate during the pandemic compared to the pre-pandemic period. Drug-related deaths such as overdose deaths and liver diseases, however, have increased compared to pre-pandemic rates. Findings are mainly from middle-high- and high-income countries, and data from low-income countries are lacking. Synthesis of data from subgroup analysis indicates that some groups, such as Black people, women and younger age groups, may be more vulnerable to socioeconomic disruption during the pandemic.
ConclusionStudies included in this review were preliminary and suffered from methodological limitations such as lack of inferential analysis or using provisional data. Further high-quality studies are needed considering the contribution of factors such as disease prevalence, government intervention and environmental characteristics. | public and global health |
10.1101/2022.04.06.22273444 | Does pre-infection stress increase the risk of long COVID? Longitudinal associations between adversity worries and experiences in the month prior to COVID-19 infection and the development of long COVID and specific long COVID symptoms | BackgroundLong COVID is increasingly recognised as a public health burden. Demographic and infection-related characteristics have been identified as risk factors, but less research has focused on psychosocial predictors such as stress immediately preceding the index infection. Research on whether stressors predict the development of specific long COVID symptoms is also lacking.
MethodsData from 1,966 UK adults who had previously been infected with COVID-19 and who took part in the UCL COVID-19 Social Study were analysed. The number of adversity experiences (e.g., job loss) and the number of worries about adversity experiences within the month prior to COVID-19 infection were used to predict the development of self-reported long COVID and the presence of three specific long COVID symptoms (difficulty with mobility, cognition, and self-care). The interaction between a three-level index of socio-economic position (SEP; with higher values indicating lower SEP) and the exposure variables in relation to long COVID status was also examined. Analyses controlled for a range of COVID-19 infection characteristics, socio-demographics, and health-related factors.
FindingsOdds of self-reported long COVID increased by 1.25 (95% confidence interval [CI]: 1.04 to 1.51) for each additional worry about adversity in the month prior to COVID-19 infection. Although there was no evidence for an interaction between SEP and either exposure variable, individuals in the lowest SEP group were nearly twice as likely to have developed long COVID as those in the highest SEP group (OR: 1.95; 95% CI: 1.19 to 3.19) and worries about adversity experiences remained a predictor of long COVID (OR: 1.43; 95% CI: 1.04 to 1.98). The number of worries about adversity experiences also corresponded with increased odds of certain long COVID symptoms such as difficulty with cognition (e.g., difficulty remembering or concentrating) by 1.46 (95% CI: 1.02 to 2.09) but not with mobility (e.g., walking or climbing steps) or self-care (e.g., washing all over or dressing).
InterpretationResults suggest a key role of stress in the time preceding the acute COVID-19 infection for the development of long COVID and for difficulty with cognition specifically. These findings point to the importance of mitigating worries and experiences of adversities during pandemics both to reduce their psychological impact but also help reduce the societal burden of longer-term illness.
FundingThe Nuffield Foundation [WEL/FR-000022583], the MARCH Mental Health Network funded by the Cross-Disciplinary Mental Health Network Plus initiative supported by UK Research and Innovation [ES/S002588/1], and the Wellcome Trust [221400/Z/20/Z and 205407/Z/16/Z]. | public and global health |
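The per-worry odds ratio reported in the Findings above compounds multiplicatively: with OR = 1.25 per additional worry, k extra worries multiply the odds of long COVID by 1.25**k. The sketch below is an interpretation aid only, not a reanalysis; the value 1.25 comes from the abstract and everything else is illustrative.

```python
# Illustrative compounding of a per-unit odds ratio (OR = 1.25 per worry,
# taken from the abstract above). Not a reanalysis of the study data.

def odds_multiplier(or_per_unit, k):
    """Multiplicative change in odds after k unit increases in the exposure."""
    return or_per_unit ** k

one_worry = odds_multiplier(1.25, 1)      # 1.25, as reported
three_worries = odds_multiplier(1.25, 3)  # about 1.95: roughly doubled odds
```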
10.1101/2022.03.30.22273163 | Clinical Impact of IDH1 Mutations and MGMT Methylation in Adult Glioblastoma Multiforme | BackgroundGenetic aberrations and epigenetic alterations have been reported in different types of cancer. The impact of isocitrate dehydrogenase 1 (IDH1) and O6-methylguanine-DNA methyltransferase (MGMT) in glioblastoma multiforme (GBM) has been of great interest due to their implications for predicting prognosis in several types of cancer. The authors aimed to investigate the clinical role of the IDH1 mutation and the MGMT methylation pattern among GBM patients versus patients with non-neurooncological diseases (NND) and their impact on survival criteria.
MethodsFormalin-fixed paraffin-embedded (FFPE) tissue sections from 58 GBM and 20 NND patients were collected; IDH1 mutations were detected using Cast-PCR technology and MGMT methylation was detected using the Methyl II quantitative PCR approach. The results were assessed against other clinicopathological criteria and correlated with survival patterns.
ResultsThe IDH1 mutation was detected in 15 of 58 GBM cases and was not found among NND patients (P=0.011). Receiver operating characteristic (ROC) curves were plotted to discriminate MGMT methylation between the studied groups. MGMT methylation [≥]66% was classified as high methylation, which was recorded in 51.7% of GBM cases and 100% of NND patients, a significant difference. Both markers differed significantly with performance status, while MGMT methylation was also significantly related to tumor size and tumor location. IDH1 mutation and MGMT methylation were significantly more frequent among GBM patients with a complete response to treatment. Survival was better for patients with the IDH1 mutation and with high MGMT methylation than for those with wild-type IDH1 or low-moderate MGMT methylation, respectively, and survival was more favorable when both markers were combined than with either alone.
ConclusionDetection of IDH1 mutation and MGMT methylation among GBM patients could aid in prediction of their response to treatment and their survival patterns, and their combination is better than using any of them alone. | oncology |
10.1101/2022.04.05.22273447 | Functional changes in neural mechanisms underlying post-traumatic stress disorder in World Trade Center responders | World Trade Center (WTC) responders exposed to traumatic and environmental stressors during rescue and recovery efforts have higher prevalence (23%) of persistent, clinically significant WTC-related post-traumatic stress disorder (WTC-PTSD). Here, we applied eigenvector centrality (EC) metrics and data driven methods on resting state functional magnetic resonance (fMRI) outcomes to investigate neural mechanisms underlying WTC-PTSD and to identify how EC shifts in brain areas relate to WTC-exposure and behavioral symptoms. Nine brain areas differed significantly and contributed the most to differentiate functional neuro-profiles between WTC-PTSD and non-PTSD responders. The association between WTC-exposure and EC values differed significantly between WTC-PTSD and non-PTSD in the right anterior parahippocampal gyrus and left amygdala (p= 0.010; p= 0.005, respectively, adjusted for multiple comparisons). Within WTC-PTSD, the index of PTSD symptoms was positively associated with EC values in the right anterior parahippocampal gyrus and brainstem. Our understanding of functional changes in neural mechanisms underlying WTC-related PTSD is key to advance intervention and treatment. | epidemiology |
10.1101/2022.04.03.22273225 | Summaries, Analysis and Simulations of Recent COVID-19 Epidemics in Mainland China | BackgroundGlobally, COVID-19 epidemics have caused tremendous disasters. China effectively prevented the spread of COVID-19 epidemics before 2022. Recently, the Omicron and Delta variants have caused a surge in reported COVID-19 infections.
MethodsUsing differential equations and real-world data, this study models and simulates the COVID-19 epidemic in mainland China, estimating transmission rates, recovery rates, and blocking rates for symptomatic and asymptomatic infections. The transmission and recovery rates of foreign-input COVID-19 infected individuals in mainland China were also studied.
ResultsThe simulation results were in good agreement with the real-world data. The recovery rates of foreign-input symptomatic and asymptomatic infected individuals are much higher than those of mainland COVID-19 infected individuals. The blocking rates for symptomatic and asymptomatic mainland infections are lower than those of the previous epidemics in mainland China. The blocking rate implemented between March 24-31, 2022 may not prevent the rapid spread of COVID-19 epidemics in mainland China. For the foreign-input COVID-19 epidemics, the numbers of current symptomatic individuals and asymptomatic individuals under medical observation have decreased significantly since March 17, 2022.
ConclusionsStricter prevention and control strategies need to be implemented to prevent the spread of the COVID-19 epidemics in mainland China. Maintaining the present prevention and therapy measures for foreign-input COVID-19 infections can rapidly reduce the number of foreign-input infected individuals to a very low level. | epidemiology |
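The differential-equation approach summarized above can be illustrated with a minimal compartmental sketch in which a "blocking rate" theta scales down the transmission term. All parameter values below (beta, gamma, theta, the initial infected fraction) are invented for illustration and are not the study's fitted estimates.

```python
# Minimal blocked-SIR sketch integrated with simple Euler steps.
# Parameter values are assumptions for illustration, not fitted estimates.

def simulate(beta=0.3, gamma=0.1, theta=0.0, days=300, dt=0.1, i0=1e-4):
    """Return final susceptible/infected/recovered fractions of one run."""
    s, i, r = 1.0 - i0, i0, 0.0
    for _ in range(int(days / dt)):
        new_inf = (1 - theta) * beta * s * i * dt  # blocking reduces transmission
        rec = gamma * i * dt
        s, i, r = s - new_inf, i + new_inf - rec, r + rec
    return s, i, r

s_open, _, _ = simulate(theta=0.0)     # no blocking
s_blocked, _, _ = simulate(theta=0.5)  # half of transmission blocked
# A higher blocking rate leaves a larger susceptible fraction untouched.
```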
10.1101/2022.04.05.22273464 | Genetic Risk for Attention-Deficit/Hyperactivity Disorder Predicts Cognitive Decline and Development of Alzheimer's Disease Pathophysiology in Cognitively Unimpaired Older Adults | BackgroundAttention-Deficit/Hyperactivity Disorder (ADHD) persists in older age and is postulated to be a risk factor for cognitive impairment and Alzheimers Disease (AD). However, this notion relies exclusively on epidemiological associations, and no previous study has linked ADHD with a decline in cognitive performance in older adults or with AD progression. Therefore, this study aimed to determine whether genetic liability for ADHD, as measured by a well-validated ADHD polygenic risk score (ADHD-PRS), is associated with longitudinal cognitive decline and the development of AD pathophysiology in cognitively unimpaired (CU) older adults.
MethodsWe calculated a weighted ADHD-PRS in 212 CU individuals without a clinical diagnosis of ADHD (55-90 years) using whole-genome information. These individuals had baseline amyloid-{beta} (A{beta}) positron emission tomography, as well as longitudinal cerebrospinal fluid (CSF) phosphorylated tau at threonine 181, structural magnetic resonance imaging, and cognitive assessments for up to 6 years. Linear mixed-effects models were used to test the association of ADHD-PRS with cognition and AD biomarkers.
OutcomesHigher ADHD-PRS was associated with greater cognitive decline over 6 years. The combined effect between high ADHD-PRS and brain A{beta} deposition on cognitive deterioration was more significant than each individually. Additionally, higher ADHD-PRS was associated with increased CSF p-tau181 levels and frontoparietal atrophy in CU A{beta}-positive individuals.
InterpretationOur results suggest that genetic liability for ADHD is associated with cognitive deterioration and the development of AD pathophysiology in the CU elderly. These findings indicate that ADHD-PRS might inform the risk of developing cognitive decline in this population.
FundingNational Institute of Health and Brain & Behavioral Research Foundation. | psychiatry and clinical psychology |
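A weighted polygenic risk score of the kind described above is, in its simplest form, a weighted sum of risk-allele dosages (0, 1, or 2 copies) using GWAS effect sizes as weights. The variant IDs and weights below are hypothetical, not the ADHD-PRS weights used in the study.

```python
# Hedged sketch of a weighted polygenic risk score: sum of per-variant
# effect sizes times allele dosages. Variant IDs and betas are invented.

def polygenic_risk_score(dosages, weights):
    """Weighted sum over variants present in both the genotype and weight table."""
    return sum(weights[v] * d for v, d in dosages.items() if v in weights)

weights = {"rs0001": 0.12, "rs0002": -0.05, "rs0003": 0.08}  # hypothetical betas
person = {"rs0001": 2, "rs0002": 1, "rs0003": 0}             # allele dosages
score = polygenic_risk_score(person, weights)  # 2*0.12 + 1*(-0.05) + 0 = 0.19
```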
10.1101/2022.04.06.22273484 | INTERCEPT pathogen reduction of platelet concentrates induces trans-arachidonic acids and affects eicosanoid formation | Gamma-irradiation of blood products is mandatory to avoid graft versus host disease in patients with immunosuppressed clinical conditions. Pathogen inactivation techniques were implemented to optimize safe blood component supply. The INTERCEPT treatment uses amotosalen together with UVA irradiation. The functional and molecular implications of these essential treatments have not yet been systematically assessed. The irradiation-induced inactivation of nucleic acids may actually be accompanied with modifications of chemically reactive polyunsaturated fatty acids, known to be important mediators of platelet functions. Thus, here we investigated eicosanoids and related fatty acids released upon treatment and during platelet storage for 7 days, complemented by the analysis of functional and metabolic consequences of these treatments. In contrast to gamma-irradiation, here we demonstrate that UVA treatment attenuated the formation of ALOX12-products such as 12-HETE and 12-HEPE but induced the formation of trans-arachidonic acids in addition to 11-HETE and HpODEs. Metabolic and functional issues like glucose consumption, lactate formation, platelet aggregation and clot firmness hardly differed between the two treatment groups. In vitro synthesis of trans-arachidonic acids (trans-AA) out of arachidonic acid in the presence of {beta}-mercaptoethanol suggested that thiol radicals formed by UVA treatment are responsible for the INTERCEPT-specific effects observed in platelet concentrates. It is plausible to assume that trans-AA and other UVA-induced molecules may have specific biological effects on the recipients, which need to be addressed in future studies.
Key points: A previously unrecognized radical mechanism for the generation of trans-fatty acids by UVA was identified.
Irradiation with UVA was found to immediately affect the generation of polyunsaturated fatty acid oxidation products. | intensive care and critical care medicine |
10.1101/2022.04.06.22273514 | STIMULATE-ICP-Delphi (Symptoms, Trajectory, Inequalities and Management: Understanding Long-COVID to Address and Transform Existing Integrated Care Pathways Delphi): Study Protocol. | IntroductionAs mortality rates from COVID-19 disease fall, the high prevalence of long-term sequelae (Long COVID) is becoming increasingly widespread, challenging healthcare systems globally. Traditional pathways of care for Long Term Conditions (LTCs) have tended to be managed by disease-specific specialties, an approach that has been ineffective in delivering care for patients with multi-morbidity. The multi-system nature of Long COVID and its impact on physical and psychological health demands a more effective model of holistic, integrated care. The evolution of integrated care systems (ICSs) in the UK presents an important opportunity to explore areas of mutual benefit to LTC, multi-morbidity and Long COVID care. There may be benefits in comparing and contrasting ICPs for Long COVID with ICPs for other LTCs.
Methods and analysisThis study aims to evaluate health services requirements for ICPs for Long COVID and their applicability to other LTCs including multi-morbidity and the overlap with medically not yet explained symptoms (MNYES). The study will follow a Delphi design and involve an expert panel of stakeholders including people with lived experience, as well as clinicians with expertise in Long COVID and other LTCs. Study processes will include expert panel and moderator panel meetings, surveys, and interviews. The Delphi process is part of the overall STIMULATE-ICP programme, aimed at improving integrated care for people with Long COVID.
Ethics and disseminationEthical approval for this Delphi study has been obtained (Research Governance Board of the University of York) as have approvals for the other STIMULATE-ICP studies. Study outcomes are likely to inform policy for ICPs across LTCs. Results will be disseminated through scientific publication, conference presentation and communications with patients and stakeholders involved in care of other LTCs and Long COVID.
RegistrationResearchregistry: https://www.researchregistry.com/browse-the-registry#home/registrationdetails/6246bfeeeaaed6001f08dadc/.
Strengths and limitations of this studyGapThere is no general model for providing ICPs to many LTCs, especially in case of multi-morbidity or MNYES.
SolutionWe will develop policy recommendations for such ICPs based upon a Delphi process exploring the value and interchangeability of elements of ICPs for Long COVID. This will help patients and clinicians navigate access to, or provide ICPs for, LTCs.
StrengthsA key tenet is that people with lived experience of Long COVID or an LTC will be involved from inception in the design and conduct of the study.
WeaknessesPossible under-representation of digitally hard to reach groups, although efforts will be made to ensure that data collection is widely inclusive, following the NIHR INCLUDE framework. | health systems and quality improvement |
10.1101/2022.04.04.22272833 | Use of face masks did not impact COVID-19 incidence among 10-12-year-olds in Finland | In fall 2021 in Finland, the recommendation to use face masks in schools for pupils ages 12 years and above was in place nationwide. Some cities recommended face masks for younger pupils as well. Our aim was to compare COVID-19 incidence among 10-12-year-olds between cities with different recommendations on the use of face masks in schools. COVID-19 case numbers were obtained from the National Infectious Disease Registry (NIDR) of the Finnish Institute for Health and Welfare, where clinical microbiology laboratories report all positive SARS-CoV-2 tests with unique identifiers in a timely manner, including information such as date of birth, gender, and place of residence. The NIDR is linked to the population data registry, enabling calculation of incidences. We compared the differences in trends of 14-day incidences between Helsinki and Turku among 10-12-year-olds, and for comparison, also among ages 7-9 and 30-49, by using joinpoint regression. According to our analysis, no additional effect seemed to be gained from recommending face masks for 10-12-year-olds, based on comparisons between the cities and between the age groups of unvaccinated children (10-12 years versus 7-9 years). | infectious diseases |
10.1101/2022.04.06.22273202 | Prediction of deterioration from COVID-19 in patients in skilled nursing facilities using wearable and contact-free devices: a feasibility study | Background and RationaleApproximately 35% of all COVID-19 deaths occurred in Skilled Nursing Facilities (SNFs). In a healthy general population, wearables have shown promise in providing early alerts for actionable interventions during the pandemic. We tested this promise in a cohort of SNFs patients diagnosed with COVID-19 and admitted for post-acute care under quarantine. We tested if 1) deployment of wearables and contact-free biosensors is feasible in the setting of SNFs and 2) they can provide early and actionable insights into deterioration.
MethodsThis prospective clinical trial was IRB-approved (NCT04548895). We deployed two commercially available devices that continuously (every 2-3 minutes) detected heart rate (HR) and respiratory rate (RR), each uniquely providing the following biometrics: 1) the wrist-worn bracelet by Biostrap yielded continuous oxygen saturation (O2Sat); 2) the under-mattress ballistocardiography sensor by Emfit tracked in-bed activity, tossing, and sleep disturbances. Patients also underwent routine monitoring by staff every 2-4 h. For death outcomes, cases are reported individually due to the small sample size. For palliative care versus at-home discharges, we report mean{+/-}SD at p<0.05.
ResultsFrom 12/2020 - 03/2021, we approached 26 PCR-confirmed SARS-CoV-2-positive patients at two SNFs: 5 declined, 21 were enrolled into monitoring by both sensors (female=13, male=8; age 77.2{+/-}9.1). We recorded outcomes as discharged to home (8, 38%), palliative care (9, 43%) or death (4, 19%). The O2Sat threshold of 91% alerted for intervention. Biostrap captured hypoxic events below 91% nine times as often as the routine intermittent pulse oximetry. In the patient who died, we observed, two weeks prior, a wide range of O2Sat values (65-95%) captured by the Biostrap device that was not noticeable with the routine vital sign spot checks. In this patient, the Emfit sensor yielded a markedly reduced RR (7/min), in contrast to 18/min from two routine spot checks performed in the same period of observation, and compared to the seven patients discharged home over a total of 86 days of monitoring (RR 19{+/-}4). Among the patients discharged to palliative care, a total of 76 days were monitored; HR did not differ compared to the patients discharged home (68{+/-}8 vs 70{+/-}7 bpm). However, we observed a statistically significant reduction of RR at 16{+/-}4/min, as well as in the variances in RR (10{+/-}6 vs 19{+/-}4/min vs 16{+/-}13) and activity, in palliative care patients vs. patients discharged home.
Conclusion/DiscussionWe demonstrate that wearables and under-mattress sensors can be integrated successfully into the SNF workflows and are well tolerated by the patients. Moreover, specific early changes of oxygen saturation fluctuations and other biometrics herald deterioration from COVID-19 two weeks in advance and evaded detection without the devices. Wearable devices and under-mattress sensors in SNFs hold significant potential for early disease detection. | infectious diseases |
10.1101/2022.04.06.22273516 | Impact of the COVID-19 epidemic on mortality in rural coastal Kenya | BackgroundThe impact of COVID-19 on all-cause mortality in sub-Saharan Africa remains unknown.
MethodsWe monitored mortality among 306,000 residents of Kilifi Health and Demographic Surveillance System, Kenya, through four COVID-19 waves from April 2020-September 2021. We calculated expected deaths using negative binomial regression fitted to baseline mortality data (2010-2019) and calculated excess mortality as observed-minus-expected deaths. We excluded deaths in infancy because of under-ascertainment of births during lockdown. In February 2021, after two waves of wild-type COVID-19, adult seroprevalence of anti-SARS-CoV-2 was 25.1%. We predicted COVID-19-attributable deaths as the product of age-specific seroprevalence, population size and global infection fatality ratios (IFR). We examined changes in cause of death by Verbal Autopsy (VA).
ResultsBetween April 2020 and February 2021, we observed 1,000 deaths against 1,012 expected deaths (excess mortality -1.2%, 95% PI -6.6%, 5.8%). Based on SARS-CoV-2 seroprevalence, we predicted 306 COVID-19-attributable deaths (a predicted excess mortality of 30.6%) within this period. Monthly mortality analyses showed a significant excess among adults aged [≥]45 years in only two months, July-August 2021, coinciding with the fourth (Delta) wave of COVID-19. By September 2021, overall excess mortality was 3.2% (95% PI -0.6%, 8.1%) and cumulative excess mortality risk was 18.7/100,000. By VA, there was a transient reduction in deaths attributable to acute respiratory infections in 2020.
ConclusionsNormal mortality rates during extensive transmission of wild-type SARS-CoV-2 through February 2021 suggest that the IFR for this variant is lower in Kenya than elsewhere. We found excess mortality associated with the Delta variant, but the cumulative excess mortality risk remains low in coastal Kenya compared to global estimates. | epidemiology |
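The two headline calculations above can be sketched directly: excess mortality is observed minus expected deaths (as a percentage of expected), and predicted COVID-19-attributable deaths are the product of age-specific seroprevalence, population size, and IFR. The observed (1,000) and expected (1,012) death counts are taken from the abstract; the age strata, populations, seroprevalences, and IFRs below are illustrative assumptions only.

```python
# Sketch of the abstract's two calculations. Observed/expected deaths come
# from the text; the strata values are hypothetical, not the study's data.

def attributable_deaths(strata):
    """strata: iterable of (population, seroprevalence, infection fatality ratio)."""
    return sum(pop * sero * ifr for pop, sero, ifr in strata)

def excess_mortality_pct(observed, expected):
    return (observed - expected) / expected * 100

strata = [(150_000, 0.20, 0.0001),  # younger ages: low IFR (hypothetical)
          (120_000, 0.25, 0.001),   # middle ages (hypothetical)
          (36_000, 0.25, 0.01)]     # older ages: high IFR (hypothetical)
predicted = attributable_deaths(strata)    # 123 deaths in this toy example
excess = excess_mortality_pct(1000, 1012)  # about -1.2%, as reported
```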
10.1101/2022.04.05.22273388 | Effect of common pregnancy and perinatal complications on offspring metabolic traits across the life course: a multi-cohort study | BackgroundCommon pregnancy and perinatal complications are associated with offspring cardiometabolic risk factors. These complications may influence multiple metabolic traits in offspring, and these associations might differ with offspring age.
MethodsWe used data from eight population-based cohort studies to examine and compare associations of pre-eclampsia (PE), gestational hypertension (GH), gestational diabetes (GD), preterm birth (PTB), small (SGA) and large (LGA) for gestational age (vs. appropriate size for gestational age (AGA)) with up to 167 plasma/serum-based nuclear magnetic resonance-derived metabolic traits encompassing lipids, lipoproteins, fatty acids, amino acids, ketones, glycerides/phospholipids, glycolysis, fluid balance, and inflammation. Confounder-adjusted regression models were used to examine associations (adjusted for maternal education, parity, age at pregnancy, ethnicity, pre/early pregnancy body mass index and smoking, and offspring sex and age at metabolic trait assessment), and results were combined using meta-analysis by five age categories representing different periods of the offspring life course: neonates (cord blood), infancy (mean ages: 1.1-1.6y), childhood (4.2-7.5y), adolescence (12.0-16.0y), and adulthood (22.0-67.8y).
ResultsOffspring numbers for each age category/analysis varied from 8,925 adults (441 PTB) to 1,181 infants (135 GD); 48.4% to 60.0% were females. Pregnancy complications (PE, GH, GD) were each associated with up to three metabolic traits in neonates (P[≤]0.001), with some limited evidence of persistence to older ages. PTB and SGA were associated with 32 and 12 metabolic traits in neonates, respectively, which included an adjusted standardised mean difference of -0.89 standard deviation (SD) units for albumin with PTB (95%CI: -1.10 to -0.69, P=1.3x10-17) and -0.41SD for total lipids in medium HDL with SGA (95%CI: -0.56 to -0.25, P=2.6x10-7), with limited evidence of persistence to older ages. LGA was inversely associated with 19 metabolic traits including lower levels of cholesterol, lipoproteins, fatty acids, and amino acids, with associations emerging in adolescence (e.g., -0.11SD total fatty acids, 95%CI: -0.18 to -0.05, P=0.0009), and attenuating with older age across adulthood.
ConclusionsThese reassuring findings suggest little evidence of widespread and long-term impact of common pregnancy and perinatal complications on offspring metabolic traits, with most associations observed only in newborns rather than at older ages, and for perinatal rather than pregnancy complications. | epidemiology |
10.1101/2022.04.07.22273319 | Vaccine effectiveness of BNT162b2 against Omicron and Delta outcomes in adolescents | IntroductionData on vaccine effectiveness (VE) against Omicron in adolescents are limited. We estimated 2-dose and 3-dose VE against Omicron and Delta in adolescents aged 12-17 years in Ontario, Canada.
MethodsWe conducted a test-negative design study among SARS-CoV-2-tested adolescents aged 12-17 years between November 22, 2021 (date of first Omicron detection) and March 6, 2022; we assessed Delta outcomes prior to January 2, 2022. We used multivariable logistic regression to compare the odds of vaccination in cases to symptomatic test-negative controls and calculated VE as 1-adjusted odds ratio.
ResultsVE was lower against symptomatic Omicron infection than against Delta and decreased more rapidly over time, from 51% (95%CI, 38-61%) in the 7-59 days following a second dose to 29% (95%CI, 17-38%) after 180 days, compared to 97% (95%CI, 94-99%) and 90% (95%CI, 79-95%) for the same intervals against symptomatic Delta infection. Overall, 2-dose VE against severe outcomes caused by Omicron was 85% (95%CI, 74-91%) [≥]7 days following a second dose and estimates were similar over time. VE against symptomatic Omicron infection was 62% (95%CI, 49-72%) [≥]7 days following a third dose.
DiscussionTwo-dose VE against symptomatic Omicron infection wanes over time in adolescents. While lower than observed against Delta, protection against severe outcomes appears to be maintained over time. A third dose substantially improves protection against Omicron infection, but 3-dose VE is only moderate at approximately 60% in the early period following vaccination and the duration of this protection is unknown. | public and global health |
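In a test-negative design like the one above, VE is computed as 1 minus the adjusted odds ratio. The study used multivariable logistic regression for adjustment; the sketch below shows the crude version of the same calculation on an invented 2x2 table, not the study's data.

```python
# Sketch of the test-negative VE calculation (VE = 1 - odds ratio),
# using invented 2x2 counts; the study adjusted via logistic regression.

def odds_ratio(vacc_cases, vacc_controls, unvacc_cases, unvacc_controls):
    """Crude odds ratio comparing odds of vaccination in cases vs controls."""
    return (vacc_cases / vacc_controls) / (unvacc_cases / unvacc_controls)

def vaccine_effectiveness(odds_ratio_value):
    return 1.0 - odds_ratio_value

crude_or = odds_ratio(vacc_cases=100, vacc_controls=400,
                      unvacc_cases=200, unvacc_controls=160)
ve = vaccine_effectiveness(crude_or)  # 1 - 0.2 = 0.80, i.e. 80% VE
```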
10.1101/2022.04.04.22273431 | Prevalence and Factors Associated with Chronic Kidney Disease among Patients Admitted to Medical Ward in a Tertiary Hospital, Northern Ethiopia, a Cross Sectional Study | IntroductionChronic Kidney Disease (CKD) is being recognized as a global public health problem. CKD is a major non-communicable disease with the global prevalence varying between 10.5% and 13.1%. Diabetes and hypertension appear to be the leading causes of CKD and End Stage Renal Disease worldwide. The aim of this study is to determine the prevalence of CKD and its associated factors among patients admitted to medical ward in a tertiary hospital, Northern Ethiopia.
MethodologyAn institution-based cross-sectional study was undertaken using a systematic random sampling technique to select study participants. A sample of 450 patients was included in the study. Data were collected using a pre-tested semi-structured questionnaire designed to meet the study objective. The data collection period was from October 20, 2017 to March 20, 2018 G.C. Data were analyzed using SPSS version 21. Odds ratios with their 95% confidence intervals and P values were calculated. Statistical significance was declared if the P value was < 0.05.
ResultOf the 450 patients, 260 (57.8%) were males. More than half (54.2%) were between the ages of 25 and 40 years. The overall prevalence of CKD among patients admitted to the medical ward was 17.3% (95% CI 13-29.9) and 14.4% (95% CI 6.2-12.3) by the Cockcroft-Gault and MDRD equations, respectively. The prevalence of stage 5 CKD was 61.5% by the Cockcroft-Gault equation. Hypertension (AOR 3, 95% CI 1.28-4.1), history of recurrent urinary tract infection (AOR 3.5, 95% CI 1.1-7.3) and history of using nephrotoxic drugs (AOR 3.4, 95% CI 2-9.3) were significantly associated with CKD.
ConclusionThe prevalence of CKD among adult patients admitted to medical ward in tertiary hospital, Northern Ethiopia was high and majority of patients with CKD were stage 5. Hypertension, use of nephrotoxic agents and recurrent urinary tract infections were significantly associated with CKD. | nephrology |
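The Cockcroft-Gault estimate used above is a standard published formula: creatinine clearance = (140 − age) × weight(kg) / (72 × serum creatinine in mg/dL), multiplied by 0.85 for women. The sketch below applies it to hypothetical patient values, not data from the study.

```python
# The standard Cockcroft-Gault creatinine clearance estimate, applied to
# hypothetical patient values (not study data).

def cockcroft_gault(age_years, weight_kg, serum_creatinine_mg_dl, female=False):
    """Estimated creatinine clearance in mL/min."""
    crcl = (140 - age_years) * weight_kg / (72 * serum_creatinine_mg_dl)
    return crcl * 0.85 if female else crcl

male_crcl = cockcroft_gault(50, 72, 1.0)                 # (140-50)*72/72 = 90 mL/min
female_crcl = cockcroft_gault(50, 72, 1.0, female=True)  # 90 * 0.85 = 76.5 mL/min
```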
10.1101/2022.04.06.22273125 | Combined Infection Control Interventions Protect the Essential Workforce from Occupationally-Acquired SARS-CoV-2 during Produce Production, Harvesting and Processing Activities | Essential food workers experience an elevated risk of SARS-CoV-2 infection due to prolonged occupational exposures (e.g., frequent close contact, enclosed spaces) in food production and processing areas, shared transportation (car or bus), and employer-provided shared housing. The purpose of this study was to evaluate the impact of combined food industry interventions and vaccination on reducing the daily cumulative risk of SARS-CoV-2 infection for produce workers. Six linked quantitative microbial risk assessment models were developed in R to simulate daily scenarios experienced by a worker. Standard industry interventions (2 m physical distancing, handwashing, surface disinfection, universal masking, increased ventilation) and two-dose mRNA vaccinations (86-99% efficacy) were modeled individually and jointly to assess risk reductions. The infection risk for an indoor (0.802, 95% Uncertainty Interval [UI]: 0.472-0.984) and outdoor (0.483, 95% UI: 0.255-0.821) worker was reduced to 0.018 (93% reduction) and 0.060 (87.5% reduction) after implementation of combined industry interventions. Upon integration of these interventions with vaccination, the infection risk for indoor (0.001, 95% UI: 0.0001-0.005) and outdoor (0.004, 95% UI: 0.001-0.016) workers was reduced by [≥]99.1%. Food workers face considerable risk of occupationally-acquired SARS-CoV-2 infection without interventions; however, consistent implementation of key infection control measures paired with vaccination effectively mitigates these risks.
SynopsisBundled interventions, particularly if they include vaccination, produce significant reductions (>99%) in SARS-CoV-2 infection risk for essential food workers.
Graphical Abstract [figure not reproduced] | occupational and environmental health |
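The relative risk reductions quoted above follow from a simple ratio: reduction(%) = (baseline − mitigated) / baseline × 100. The sketch below checks this against the outdoor-worker figures given in the abstract; no other study values are used.

```python
# Relative risk reduction as used in the abstract above:
# reduction(%) = (baseline - mitigated) / baseline * 100.
# The two risk values are the outdoor-worker figures quoted in the text.

def percent_reduction(baseline_risk, mitigated_risk):
    return (baseline_risk - mitigated_risk) / baseline_risk * 100

outdoor = percent_reduction(0.483, 0.060)  # about 87.6%, close to the reported 87.5%
```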
10.1101/2022.04.04.22273422 | Distinct cellular dynamics associated with response to CAR-T therapy for refractory B-cell lymphoma | Chimeric Antigen Receptor (CAR)-T cell therapy has revolutionized the treatment of hematologic malignancies. Approximately half of patients with refractory large B-cell lymphomas achieve durable responses from CD19-targeting CAR-T treatment; however, failure mechanisms are identified in only a fraction of cases. To gain novel insights into the basis of clinical response, we performed single-cell transcriptome sequencing of 105 pre- and post-treatment peripheral blood mononuclear cell samples, and infusion products collected from 32 individuals with high-grade B cell lymphoma treated with either of two CD19 CAR-T products: axicabtagene ciloleucel (axi-cel) or tisagenlecleucel (tisa-cel). Expansion of proliferative memory-like CD8 clones was a hallmark of tisa-cel response, whereas axi-cel responders displayed more heterogeneous populations. Elevations in CAR-T regulatory cells (CAR-Tregs) among non-responders to axi-cel were detected, and these populations were capable of suppressing conventional CAR-T cell expansion and driving late relapses in an in vivo model. Our analyses reveal the temporal dynamics of effective responses to CAR-T therapy, the distinct molecular phenotypes of CAR-T cells with differing designs, and the capacity for even small increases in CAR-Tregs to drive relapse. | oncology |
10.1101/2022.04.04.22273172 | Early SARS-CoV-2 reinfections within 60 days highlight the need to consider antigenic variations together with duration of immunity in defining retesting policies | The emergence of the SARS-CoV-2 Omicron variant, characterized by significant antigenic diversity compared to the previous Delta variant, has led to a decrease in antibody efficacy in both convalescent and vaccinee sera, resulting in a high number of reinfections and breakthrough cases worldwide. However, to date, reinfections are defined by the ECDC as two positive tests [≥]60 days apart, influencing retesting policies after an initial positive test in several European countries. We illustrate with a clinical case, supplemented by epidemiological data, that early reinfections do occur within 60 days, especially in young, unvaccinated individuals. In older patient groups, unvaccinated patients and patients with a basic vaccination scheme are more vulnerable to reinfections than patients who received a first booster vaccine. For this reason, we consider that the duration of protection offered by a previous infection should be reconsidered, in particular when a shift between consecutive SARS-CoV-2 variants occurs. | infectious diseases |
10.1101/2022.04.06.22273409 | Risk factors for severe COVID-19 in hospitalized children in Canada: A national prospective study from March 2020--May 2021 | BackgroundChildren living with chronic comorbid conditions are at increased risk for severe COVID-19, though there is limited evidence regarding the risks associated with specific conditions and which children may benefit from targeted COVID-19 therapies. The objective of this study was to identify factors associated with severe disease among hospitalized children with COVID-19 in Canada.
MethodsWe conducted a national prospective study on hospitalized children with microbiologically confirmed SARS-CoV-2 infection via the Canadian Paediatric Surveillance Program from April 2020-May 2021. Cases were reported voluntarily by a network of >2800 paediatricians. Hospitalizations were classified as COVID-19-related, incidental infection, or infection control/social admissions. Severe disease (among COVID-19-related hospitalizations only) was defined as disease requiring intensive care, ventilatory or hemodynamic support, select organ system complications, or death. Risk factors for severe disease were identified using multivariable Poisson regression, adjusting for age, sex, concomitant infections, and timing of hospitalization.
FindingsWe identified 544 children hospitalized with SARS-CoV-2 infection, including 60·7% with COVID-19-related disease and 39·3% with incidental infection or infection control/social admissions. Among COVID-19-related hospitalizations (n=330), the median age was 1·9 years (IQR 0·1-13·3) and 43·0% had chronic comorbid conditions. Severe disease occurred in 29·7% of COVID-19-related hospitalizations (n=98/330), most frequently among children aged 2-4 years (48·7%) and 12-17 years (41·3%). Comorbid conditions associated with severe disease included technology dependence (adjusted risk ratio [aRR] 2·01, 95% confidence interval [CI] 1·37-2·95), neurologic conditions (e.g. epilepsy and select chromosomal/genetic conditions) (aRR 1·84, 95% CI 1·32-2·57), and pulmonary conditions (e.g. bronchopulmonary dysplasia and uncontrolled asthma) (aRR 1·63, 95% CI 1·12-2·39).
InterpretationWhile severe outcomes were detected at all ages and among patients with and without comorbidities, neurologic and pulmonary conditions as well as technology dependence were associated with increased risk of severe COVID-19. These findings may help guide vaccination programs and prioritize targeted COVID-19 therapies for children.
FundingFinancial support for the CPSP was received from the Public Health Agency of Canada. | infectious diseases |
10.1101/2022.04.05.22273358 | Built environment's impact on COVID-19 transmission and mental health revealed by COVID-19 Participant Experience data from the All of Us Research Program | ObjectivesThe coronavirus disease 2019 (COVID-19) pandemic has led to millions of deaths. Effectively cutting the transmission of COVID-19 is essential to reduce its impact. Previous studies have observed a potential relationship between the built environment and COVID-19 transmission; however, to date, rigorous studies investigating these relationships at the individual level remain scarce. Here, we aim to examine the relationship between household types and COVID-19 infection (or mental health) during the early stages of the pandemic by using the All of Us Research Program COVID-19 Participant Experience (COPE) survey data.
DesignBased on 62,664 participants' responses to COPE from May to July 2020, we matched cases of self-reported COVID-19 status, anxiety, or stress with controls of the same race, sex, age group, and survey version. We conducted multiple logistic regressions between one of the outcomes and household type, adjusting for other related covariates, such as ethnicity, age, social distancing behavior, and house occupancy.
ResultsHousehold type with a shared component was significantly associated with COVID-19 infection (OR=1.19, 95% CI 1.1 to 1.3; p=2×10⁻⁴), anxiety (OR=1.26, 95% CI 1.1 to 1.4; p=1.1×10⁻⁶), and stress (OR=1.29, 95% CI 1.2 to 1.4; p=4.3×10⁻¹⁰) as compared to free-standing houses after adjusting for the abovementioned confounding factors. Further, frequent nonessential shopping or outings, another indicator of the built environment, was also associated with COVID-19 infection (OR=1.36, 95% CI 1.1 to 1.8; p=0.02), but not associated with elevated mental health conditions.
ConclusionOur study demonstrated that the built environment of houses with a shared component tends to increase the risk of COVID-19 transmission, which consequently led to more anxiety and stress for their dwellers. It also suggested the necessity to improve the quality of the built environment through planning, design, and management toward a more resilient society in coping with future pandemics. | infectious diseases |
10.1101/2022.04.04.22273386 | Drinking water chlorination impact on fecal carriage of extended-spectrum beta-lactamase-producing Enterobacteriaceae in Bangladeshi children in a double-blind, cluster-randomized controlled trial | BackgroundWater, sanitation, and hygiene (WASH) services have the potential to interrupt transmission of antimicrobial-resistant bacteria and reduce the need for antibiotics, thereby reducing selection for resistance. However, evidence of WASH impacts on antimicrobial resistance (AMR) is lacking.
MethodsWe evaluated extended-spectrum beta-lactamase (ESBL)-producing Escherichia coli and ESBL-KESC (Klebsiella spp., Enterobacter spp., Serratia spp., and Citrobacter spp.) carriage in the feces of 479 Bangladeshi children under 5 years of age enrolled in a double-blind, cluster-randomized controlled trial of in-line drinking water chlorination in two low-income urban communities in Bangladesh. We additionally assessed the intervention's impact on circulating beta-lactamase genes in fecal metagenomes and in genomes of fecal ESBL-E. coli isolates.
FindingsWe detected ESBL-E. coli in 65% (n = 309) and ESBL-KESC in 12% (n = 56) of enrolled children. We observed no effect of the intervention on the prevalence of ESBL-E. coli (relative risk [95% confidence interval] = 0.98 [0.78, 1.23]) when controlling for study site and age. Although ESBL-KESC prevalence (0.76 [0.44, 1.29]) was lower among children in the intervention group, the relative risk was not significant. Concentrations of ESBL-E. coli (log10 CFU/g-wet) were on average [95% confidence interval] 0.13 [-0.16, 0.42] higher in the intervention group, and concentrations of ESBL-KESC (log10 CFU/g-wet) were 0.10 [-0.22, 0.02] lower in the intervention group, when controlling for study site and age. Furthermore, the distribution of ESBL-E. coli sequence types, the types of beta-lactamase-encoding genes in ESBL-E. coli isolates, and the presence and relative abundance of beta-lactamase-encoding genes in children's fecal metagenomes did not differ significantly between the intervention and control children.
InterpretationOne year of in-line drinking water chlorination in communities did not meaningfully impact the carriage of ESBL-E. coli among children in an area of high ESBL-E. coli carriage. While ESBL-KESC prevalence was lower than that of ESBL-E. coli in the intervention group, limited study power prevented a clear interpretation of the treatment effect. Development and evaluation of effective interventions to reduce AMR carriage are needed to support calls for WASH embedded in current National and Global AMR Action Plans. | epidemiology |
10.1101/2022.04.04.22273417 | Evaluating the effects of an intervention to improve the health environment for mothers and children in health centres (BECEYA) in Mali: a qualitative study | BackgroundAn intervention aiming to improve maternal and children environment in healthcare facilities (BECEYA) was launched in three regions of Mali. This study aimed to explore the perceptions and experiences of patients and their companions, community actors and healthcare facilities staff on the effects of the BECEYA intervention in two regions of Mali.
MethodsWe conducted a qualitative study using an empirical phenomenological approach. Through purposive sampling, women who attended antenatal care in the selected healthcare centres, their companions, and health facility staff members were recruited. Data were collected during January and February 2020 through semi-structured individual interviews and focus groups. Following Braun & Clarke's approach, audio recordings were transcribed verbatim, and a thematic analysis was conducted in five main steps.
ResultsWe recruited 26 participants in individual interviews and 20 participants in focus groups. Donabedian's conceptual framework of quality of care was used to present the perceived changes following the implementation of the BECEYA project. The themes that emerged from data analysis are perceived changes in terms of infrastructure (perceived changes in the characteristics of the healthcare facilities' setting, including the infrastructure introduced by the BECEYA project), process (changes in the delivery and use of care introduced or resulting from BECEYA activities), and outcome (the direct and indirect effects of these changes on the health status of patients and the population).
ConclusionThe study identified positive effects perceived by women users of the services, their companions, and health centre staff following the implementation of the project. This study therefore contributes to knowledge by showing the link between improving the health environment of health centres in developing countries, a major aspect of the quality of care, and maternal and child health care. | health systems and quality improvement |
10.1101/2022.04.09.22273420 | Characteristics and outcomes of COVID-19 patients during B.1.1.529 (Omicron) dominance compared to B.1.617.2 (Delta) in 89 German hospitals | BackgroundThe SARS-CoV-2 variant of concern B.1.1.529 (Omicron) was first described in November 2021 and soon became the dominant variant worldwide. Existing data suggests a reduced disease severity in comparison to B.1.617.2 (Delta). Differences in characteristics and in-hospital outcomes of patients with COVID-19 in Germany during the Omicron period compared to Delta are not thoroughly studied. Surveillance for severe acute respiratory infections (SARI) represents an integral part of infectious disease control in Germany.
MethodsAdministrative data from 89 German Helios hospitals was retrospectively analysed. Laboratory-confirmed SARS-CoV-2 infections were identified by ICD-10-code U07.1 and SARI cases by ICD-10-codes J09-J22. COVID-19 cases were stratified by concomitant SARI. A nine-week observational period between December 6, 2021 and February 6, 2022 was defined and divided into three phases with respect to the dominating virus variant (Delta, Delta to Omicron transition, Omicron). Regression analyses adjusted for age, gender and Elixhauser comorbidities were applied to assess in-hospital patient outcomes.
ResultsA total cohort of 4,494 inpatients was analysed. Patients in the Omicron dominance period were younger (mean age 61.6 vs. 47.8; p<0.01), more likely to be female (54.7% vs. 47.5%; p<0.01) and characterized by a lower comorbidity burden (mean Elixhauser comorbidity index 8.2 vs. 5.4; p<0.01). Comparing Delta and Omicron periods, patients were at significantly lower risk for intensive care treatment (adjusted odds ratio 0.64 [0.51-0.8]; p<0.001), mechanical ventilation (adjusted odds ratio 0.38 [0.28-0.51]; p<0.001), and in-hospital mortality (adjusted odds ratio 0.42 [0.32-0.56]; p<0.001). This also applied to the separate COVID-SARI group. During the Delta to Omicron transition, case numbers of COVID-19 without SARI exceeded COVID-SARI.
ConclusionPatient characteristics and outcomes differ during the Omicron dominance period as compared to Delta suggesting a reduced disease severity with Omicron infections. SARI surveillance might play a crucial role in assessing disease severity of future SARS-CoV-2 variants. | infectious diseases |
10.1101/2022.04.07.22273557 | Mortality in Switzerland in 2021 | ObjectiveTo analyze mortality trends in Switzerland in 2021, the second year of the COVID-19 pandemic.
MethodsUsing data from the Swiss Federal Statistical Office, we compared mortality in Switzerland in 2021 with that of previous years in terms of standardized weekly deaths, standardized (annual) mortality rates (overall and stratified by age and sex) and life expectancy.
ResultsAfter a favorable first half of the year and a fairly standard second half in terms of mortality in Switzerland, the year 2021 ended with a wave of deaths of moderate intensity related to the 5th wave of COVID-19. Overall, and after a notable increase in mortality in 2020 (+9.2% compared to 2019), the pre-pandemic mortality level was approximately recovered in 2021 (+0.6% compared to 2019). Life expectancy, after declining by 10 months for men and 6 months for women in 2020, returned in 2021 to levels slightly above those of 2019 for women (85.7 years, +0.1 years from 2019) and regained 2018 levels for men (81.6 years, still -0.3 years from 2019). The age group responsible for the small remaining loss for men was the 50-70 age group, which had similar mortality in 2020 and 2021.
ConclusionsThe second year of the COVID-19 pandemic in Switzerland was characterized by an approximate return to pre-pandemic mortality levels, with a faster recovery for women than for men compared to 2020. | public and global health |
10.1101/2022.04.07.22273545 | Both COVID-19 infection and vaccination induce high-affinity cross-clade responses to SARS-CoV-2 variants | The B.1.1.529 (omicron) variant has rapidly supplanted most other SARS-CoV-2 variants. Using microfluidics-based antibody affinity profiling (MAAP), we have recently shown that current therapeutic monoclonal antibodies exhibit a drastic loss of affinity against omicron. Here, we have characterized affinity and IgG concentration in the plasma of 39 individuals with multiple trajectories of SARS-CoV-2 infection and/or vaccination as well as in 2 subjects without vaccination or infection. Antibody affinity in patient plasma samples was similar against the wild-type, delta, and omicron variants (KA ranges: 122±155, 159±148, 211±307 M⁻¹, respectively), indicating a surprisingly broad and mature cross-clade immune response. We then determined the antibody iso- and subtypes against multiple SARS-CoV-2 spike domains and nucleoprotein. Postinfectious and vaccinated subjects showed different profiles, with IgG3 (p = 0.002) but not IgG1, IgG2 or IgG4 subtypes against the spike ectodomain being more prominent in the former group. Lastly, we found that the ELISA titers against the wildtype, delta, and omicron RBD variants correlated linearly with measured IgG concentrations (R=0.72) but not with affinity (R=0.29). These findings suggest that the wild-type and delta spike proteins induce a polyclonal immune response capable of binding the omicron spike with similar affinity. Changes in titers were primarily driven by antibody concentration, suggesting that B-cell expansion, rather than affinity maturation, dominated the response after infection or vaccination. | public and global health |
10.1101/2022.04.08.22273629 | Prevalence and correlates of urogenital schistosomiasis in school going children in Maramba compound of Livingstone District, Zambia | BackgroundSchistosomiasis is an acute and chronic parasitic disease that is caused by trematode worms (blood flukes) of the genus Schistosoma. Schistosoma haematobium (S. haematobium) is known to cause urogenital schistosomiasis. The disease is the second most common socio-economically devastating tropical parasitic disease after malaria in Africa. In Zambia, it affects over a million school going children, mostly in rural communities due to unsafe water and inadequate sanitation facilities. This study aimed to determine the presence of S. haematobium in urine specimens of school going children in Maramba compound of Livingstone and establish factors associated with the acquisition and spread of the parasite.
MethodsA structured questionnaire was administered to all children with signed consent from their guardians/parents, and afterwards spot urine specimens were collected in sterile containers for macroscopic/microscopic examination by an independent laboratory technologist.
ResultsA total of 173 school going children participated in the study. Parasitic eggs were detected in 6 specimens, giving a prevalence of 3.47% (p<0.01), which had a strong association with the presence of microscopic red blood cells (p<0.01), dysuria (p=0.026), washing in a stream (p=0.01), and perceptions of bilharzia acquisition (p<0.01).
ConclusionThe prevalence of urogenital schistosomiasis among school going children in Maramba compound was 3.47%, and the correlates of the infection included washing in a stream, older age and poor knowledge on schistosomiasis. Participants that had schistosomiasis often presented with hematuria and lacked knowledge on disease acquisition, health effects and preventive measures. This calls for more robust sensitization of school going children and periodic screening to curb the disease. | public and global health |
10.1101/2022.04.04.22273392 | Potential gains in health-adjusted life expectancy by reducing burden of non-communicable diseases: a population-based study | BackgroundThe United Nations Sustainable Development Goals (SDGs) target 3.4 aims to reduce premature mortality attributable to non-communicable diseases (NCDs) by one-third of their 2015 levels by 2030. Although meeting this target leads to longevity, survivors may suffer from long-term disability caused by NCDs. This paper quantifies the potential gains in health-adjusted life expectancy for people aged 30-70 years (HALE[30-70)) by examining the reductions in disability in addition to premature mortality. Additionally, we also assessed the feasibility of meeting the SDGs target 3.4.
MethodsWe extracted data from the Global Burden of Disease Study 2019 for all NCDs and four major NCDs (cancers, cardiovascular diseases, chronic respiratory diseases, and diabetes mellitus) in 188 countries from 1990 to 2019. Bayesian age-period-cohort models were used to predict possible premature mortality in 2030. The life table was used to estimate the unconditional probability of death and HALE[30-70). Estimates of the potential gains in HALE[30-70) were based on three alternative future scenarios: a) eliminating all premature deaths and disability from a specific cause, b) meeting SDGs target 3.4, with survivors' disability eliminated, and c) meeting SDGs target 3.4, but survivors remain disabled for the rest of their lives.
ResultsIn 2030, the unconditional probability of premature mortality for four major NCDs in most countries remained at more than two-thirds of the 2015 baseline. In all scenarios, the high-income group has the greatest potential gains in HALE[30-70), above the global average of HALE[30-70). In scenario A, the potential gains in HALE[30-70) of reducing premature mortality for four major NCDs are significantly lower than those for all NCDs (range of difference for all income groups: 2.88 - 3.27 years). In scenarios B and C, the potential gains of HALE[30-70) in reducing premature mortality for all NCDs and the four major NCDs are similar (scenario B: 0.14 - 0.22, scenario C: 0.05 - 0.19). In scenarios A and B, countries from the high-income group have the greatest potential gains in HALE[30-70) from cancer intervention, whilst countries from the other income groups achieve greater possible HALE[30-70) gains from cardiovascular diseases control. In scenario C, countries from each income group have the largest potential gains in HALE[30-70) from diabetes reduction and chronic respiratory diseases prevention.
ConclusionsAchieving SDGs target 3.4 remains challenging for most countries. The elimination of disability among the population who benefit from the target could lead to a sizable improvement in HALE[30-70). Reducing premature death and disability at once, attaching equal importance to each, is in line with the WHO goal of "leaving no one behind". | public and global health |
10.1101/2022.04.06.22273530 | Household food insecurity risk indices for English neighbourhoods: measures to support local policy decisions | BackgroundIn England, the responsibility to address food insecurity lies with local government, yet the prevalence of this social inequality is unknown in small subnational areas. In 2018 an index of small-area household food insecurity risk was developed and utilised by public and third sector organisations to target interventions; this measure needed updating to better support decisions in different contexts.
MethodsWe held interviews with stakeholders (n=11) and completed a scoping review to identify appropriate variables to create an updated risk measure. We then sourced a range of open access secondary data to develop indices of food insecurity risk in English neighbourhoods. Following a process of data transformation and normalisation, we tested combinations of variables and identified the most appropriate data to reflect household food insecurity risk in urban and rural areas.
ResultsEight variables, reflecting both household circumstances and local service availability, were separated into two domains with equal weighting for a new index, the Complex Index, and a subset of these make up the Simple Index. Within the Complex Index the Compositional Domain includes population characteristics while the Structural Domain reflects access to resources. The Compositional Domain is correlated well with free school meal eligibility (rs=0.705) and prevalence of childhood obesity (rs=0.641). This domain was the preferred measure for use in most areas when shared with stakeholders, and when assessed alongside other configurations of the variables. Areas of highest risk were most often located in the North of England.
ConclusionWe recommend the use of the Compositional Domain for all areas, with inclusion of the Structural Domain in rural areas where locational disadvantage makes it more difficult to access services. These measures can aid local policy makers and planners when allocating resources and interventions to support households who may experience food insecurity. | public and global health |
10.1101/2022.04.08.22273608 | Outpatient and Home Pulmonary Rehabilitation Program Post COVID-19: A study protocol for clinical trial | BackgroundThe coronavirus disease 2019 (COVID-19) is a widespread, highly contagious inflammatory process that causes respiratory, physical and psychological dysfunction. COVID-19 mainly affects the respiratory system and evolves in the acute phase from mild cases with common symptoms, such as fever, cough, and fatigue, to the moderate-to-severe form, causing massive alveolar damage resulting in dyspnea and hypoxemia that can rapidly progress to pneumonia, and acute respiratory distress syndrome. The acute form usually causes severe pulmonary sequelae such as pulmonary fibrosis or progression to organ failure, leading to worsening metabolic dysfunction and/or death.
PurposeTo verify the effects of an outpatient and home pulmonary rehabilitation program (PRP) on clinical symptoms, pulmonary function, physical activity level, functional status, autonomic activity, peripheral muscle strength, static and functional balance, functional mobility, anxiety and depression, post-traumatic stress, health-related quality of life, and survival of patients with sequelae from COVID-19.
MethodsThis study will be a cohort, parallel, two-arm multicentric study, to be carried out in three clinical centers, with blinded evaluation, six weeks of training, and follow-up. This study was designed according to the recommendations of the CONSORT statement. According to the inclusion criteria, women and men aged between 16 and 75 years affected by COVID-19 will be involved in this clinical study. The proposed PRP is based on the guidelines recommended by the Global Initiative for Chronic Obstructive Lung Disease and consists of a combination of aerobic and muscle strengthening exercises, lasting six weeks, with a frequency of three times a week.
DiscussionIn patients infected with COVID-19 with persistent symptoms and sequelae, PRP mainly seeks to improve dyspnea, relieve anxiety and depression, prevent, and reduce complications and/or dysfunctions, reduce morbidity and mortality, and improve health-related quality of life.
Trial registrationThis study was registered at clinicaltrials.gov (ID: COVID-19 PULMONARY REHAB NCT04982042). | rehabilitation medicine and physical therapy |
10.1101/2022.04.06.22273526 | Psychosocial predictors of COVID-19 vaccine uptake among pregnant women: a cross-sectional study in Greece | BackgroundUnvaccinated pregnant women with symptomatic COVID-19 have been found to have a higher risk of iatrogenic preterm births, intensive care unit admission, and invasive ventilation.
ObjectiveTo estimate the vaccination rate of pregnant women against COVID-19 and to evaluate psychosocial factors associated with vaccine uptake among them.
MethodsWe conducted an anonymous cross-sectional study with a convenience sample in Greece from December 2021 to March 2022. We measured socio-demographic data of pregnant women, COVID-19-related vaccination status, worry about the side effects of COVID-19 vaccines, trust in COVID-19 vaccines, and COVID-19-related stress.
ResultsThe study population included 812 pregnant women with a mean age of 31.6 years. Among the pregnant women, 58.6% had received a COVID-19 vaccine. The most important reasons that pregnant women were not vaccinated were doubts about the safety and effectiveness of COVID-19 vaccines (31.4%), fear that COVID-19 vaccines could be harmful to the fetus (29.4%), and fear of adverse side effects of COVID-19 vaccines (29.4%). Increased danger and contamination fears, increased fears about economic consequences, and higher levels of trust in COVID-19 vaccines were associated with COVID-19 vaccine uptake. On the other hand, increased compulsive checking and reassurance seeking and increased worry about the adverse side effects of COVID-19 vaccines reduced the likelihood of pregnant women being vaccinated against COVID-19.
ConclusionsAn understanding of the psychosocial factors associated with COVID-19 vaccine uptake in pregnant women is paramount to persuading women to get vaccinated against COVID-19. There is a need for targeted educational campaigns to increase knowledge about COVID-19 vaccines and reduce COVID-19 vaccine hesitancy in pregnancy. | obstetrics and gynecology |
10.1101/2022.04.05.22273477 | Molecular identification of species of family Chaetomiaceae (Sordariomycetes, Ascomycota) from soil, dung and water in Sudan | Species of the family Chaetomiaceae are ubiquitous filamentous fungi that are responsible for a wide range of opportunistic human infections. To date, more than 300 species have been described in the genus Chaetomium. They have been reported globally as being capable of colonizing various substrates. Due to the lack of genetic studies on the species belonging to this genus in Sudan, this work aimed to investigate the environmental fungal occurrence within the family Chaetomiaceae by using morphological characters and molecular sequencing.
A total of 260 environmental samples from soil, animal dung and water were collected from six different states in Sudan in two ecozones: desert or semi-desert ecozones (Dongola in northern Sudan, El-Obeid in western Sudan) and a low-rainfall woodland savanna ecozone (Gazira, El Geteina and Khartoum in central Sudan, and AlQadarif in eastern Sudan).
During this study of environmental fungi in Sudan, 119 isolates were identified as members of Chaetomiaceae after ITS sequencing combined with an examination of the macro- and micromorphology. Of the 63 Chaetomium strains obtained, 25 came from soil, 22 from animal dung and 16 from water. Fifty-six additional strains belonging to other genera within the family Chaetomiaceae (Amesia, Collariella, Ovatospora, Subramaniula and Thielavia) were recorded for the first time in Sudan.
In conclusion, sequence-based identification of fungal isolates is often considered to be the most reliable and accurate identification method. | genetic and genomic medicine |
10.1101/2022.04.08.22273605 | Impact of SARS-CoV-2 vaccination on systemic immune responses in people living with HIV | The severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) causes coronavirus disease 2019 (COVID-19) and an ongoing global pandemic. Despite the development of vaccines, which protect healthy people from severe and life-threatening COVID-19, the immunological responses of people with secondary immunodeficiencies to SARS-CoV-2 mRNA vaccines are currently not well understood. Human Immunodeficiency Virus (HIV), causing acquired immunodeficiency syndrome (AIDS), targets CD4+ T helper (Th) cells that orchestrate the immune response. Anti-retroviral therapy suppresses HIV burden and restores Th cell numbers. Here, we investigated the humoral and cellular immune responses elicited by the BTN162b2 vaccine in a cohort of people living with HIV (PLWH), who receive anti-retroviral therapy. While antibody responses in PLWH increased progressively after the first and second vaccination compared to baseline, they were reduced compared to HIV negative study participants (controls). CD8+ T cells exhibited a general activated phenotype and increased effector and effector memory compartments. In contrast, CD4+ Th cell responses exhibited a vaccination-dependent increase and were comparable between PLWH and controls. In line with their reduced humoral response, the correlation between neutralizing antibodies and the CD4+ T cell response was decreased in PLWH compared to healthy controls. Interestingly, CD4+ T cell activation negatively correlated with the CD4 to CD8 ratio, indicating that low CD4 T cell numbers do not necessarily interfere with cellular immune responses. Taken together, our data demonstrate that COVID-19 mRNA vaccination in PLWH results in potent cellular immune responses, but the reduced antibody responses suggest that booster vaccination might be required for preventing disease. | allergy and immunology |
10.1101/2022.04.07.22273504 | Histopathological growth patterns of liver metastasis: updated consensus guidelines for pattern scoring, perspectives, and recent mechanistic insights. | The first consensus guidelines for scoring the histopathological growth patterns (HGPs) of liver metastases were established in 2017. Since then, numerous studies have applied these guidelines, have further substantiated the potential clinical value of the HGPs in patients with liver metastases from various tumour types and are starting to shed light on the biology of the distinct HGPs. In the present guidelines, we give an overview of these studies, discuss novel strategies for predicting the HGPs of liver metastases, such as deep learning algorithms for whole slide histopathology images and medical imaging, and highlight liver metastasis animal models that exhibit features of the different HGPs. Based on a pooled analysis of large cohorts of patients with liver-metastatic colorectal cancer, we propose a new cut-off to categorize patients according to the HGPs. An up-to-date standard method for HGP assessment within liver metastases is also presented with the aim of incorporating HGPs into the decision-making processes surrounding the treatment of patients with liver metastatic cancer. Finally, we propose hypotheses on the cellular and molecular mechanisms that drive the biology of the different HGPs, opening some exciting pre-clinical and clinical research perspectives. | pathology |
10.1101/2022.04.04.22273397 | Pretend play predicts receptive and expressive language trajectories in young children with autism | The effect of pretend play in 2- to 5-year-old children with ASD was investigated in the largest and longest observational study to date. Parents assessed the development of 7,069 children quarterly for three years on five subscales: combinatorial receptive language, expressive language, sociability, sensory awareness, and health. Pretend play was associated with superior developmental trajectories: 1.9-fold faster improvement of combinatorial receptive language (p<0.0001), 1.4-fold faster improvement of expressive language (p<0.0001), and 1.3-fold faster improvement of sensory awareness (p=0.0009). Pretend play had little effect on sociability and health. The strong association of pretend play with combinatorial receptive language remained significant even when controlling for expressive language. Similarly, the effect of pretend play on expressive language remained significant even when controlling for combinatorial receptive language. The effect of pretend play on combinatorial receptive language (but not on expressive language) was stronger than the effects of seizures, sleep problems, or high TV exposure. The strong effect of pretend play supports earlier studies indicating that it is an important stepping stone for language acquisition, particularly the acquisition of combinatorial language. | pediatrics |
10.1101/2022.04.05.22273473 | Therapeutic efficacy of four antiviral drugs in the treatment of COVID-19: A protocol for systematic review and network meta-analysis | Introduction: The COVID-19 pandemic has caused a global public health crisis, yet to date no drugs have proven safe and effective against COVID-19. We will use network meta-analysis to analyse the available evidence from RCTs comparing the safety and efficacy of four antiviral drugs (Ribavirin, Arbidol, Chloroquine Phosphate and Interferon), alone or in combination, in patients with COVID-19 receiving standard treatment, in order to assess the robustness and strength of the evidence for their relative efficacy against COVID-19 and thereby provide better evidence for future clinical decision-making.
Methods: English and Chinese search strategies will be used to search 8 databases: PubMed, Web of Science, Embase, the Cochrane Library, CNKI, CBM, the WANFANG Database and VIP. The electronic search will be supplemented by a manual search of the references of retrieved publications. To enhance the validity of this study, only randomized controlled trials of the four antiviral drugs (Ribavirin, Arbidol, Chloroquine Phosphate, Interferon), used alone or in combination alongside standard therapy, will be included.
Analysis: Nucleic acid conversion to negative, complete absorption of lung inflammation, adverse reactions, disease aggravation, and death will be the primary outcome measures, whereas return of temperature to normal, length of hospitalization, and the rate of retesting positive after discharge will be the secondary outcomes. To ensure the quality of this systematic review, study screening, data extraction, and quality evaluation will be carried out independently by two reviewers, and any disagreements will be resolved through discussion between them or by a third reviewer.
Ethics and dissemination: This systematic review will evaluate the efficacy of four antiviral drugs (Ribavirin, Arbidol, Chloroquine Phosphate and Interferon) for COVID-19 in adults. Since all included data will be obtained from published articles, ethical approval is not required; the results will be published in a peer-reviewed journal.
PROSPERO registration number: CRD42022300104. | public and global health |