id | title | abstract | category |
---|---|---|---|
10.1101/2022.03.09.22272098 | Duration of vaccine effectiveness against SARS-CoV2 infection, hospitalisation, and death in residents and staff of Long-Term Care Facilities (VIVALDI): a prospective cohort study, England, Dec 2020-Dec 2021 | Background: Long-term care facilities (LTCFs) have been prioritised for vaccination, but data on potential waning of vaccine effectiveness (VE) and the impact of booster doses in this vulnerable population remain scarce.
Methods: We included residents and staff from 331 LTCFs enrolled in VIVALDI (ISRCTN 14447421), who underwent routine PCR testing between Dec 8, 2020, and Dec 11, 2021, in a Cox proportional hazards regression estimating VE against SARS-CoV2 infection, COVID-19-related hospitalisation, and COVID-19-related death after 1-3 vaccine doses, stratified by previous SARS-CoV2 exposure.
Results: For 15,518 older residents, VE declined from 50·7% (15·5, 71·3) to 17·2% (−23·9, 44·6) against infection; from 85·4% (60·7, 94·6) to 54·3% (26·2, 71·7) against hospitalisation; and from 94·4% (76·4, 98·7) to 62·8% (32·9, 79·4) against death, when comparing 2-12 weeks and ≥12 weeks after two doses. For 19,515 staff, VE against infection declined slightly from 50·3% (32·7, 63·3) to 42·1% (29·5, 52·4). High VE was restored following a third dose, with VE of 71·6% (53·5, 82·7) and 78·3% (70·1, 84·3) against infection and 89·9% (80·0, 94·6) and 95·8% (50·4, 99·6) against hospitalisation, for residents and staff respectively; and 97·5% (88·1, 99·5) against death for residents.
Interpretation: Substantial waning of VE is observed against all outcomes in residents from 12 weeks after a primary course of AstraZeneca or mRNA vaccines. Boosters restore protection and maximise immunity across all outcomes. These findings demonstrate the importance of boosting and the need for ongoing surveillance of VE in this vulnerable cohort.
Funding: UK Government Department of Health and Social Care.
Research in Context. Evidence before this study: We searched MEDLINE and medRxiv for studies reporting vaccine effectiveness (VE) over time after two or three doses against SARS-CoV2 infection, COVID-19-related hospitalisation, or COVID-19-related death amongst staff or residents of long-term care facilities (LTCFs), published between Jan 1, 2020, and December 21, 2021. We used variations of the search terms "COVID-19" OR "SARS-CoV-2" AND "vaccine effectiveness" OR "vaccine efficacy" AND "care homes" OR "long term care facilities".
We identified 8 articles reporting two-dose data from LTCFs: 1 peer-reviewed paper from Israel, 1 preprint from Denmark, 1 preprint from Norway, 1 peer-reviewed paper from France, 2 peer-reviewed papers from Spain, 1 peer-reviewed paper from the USA, and 1 preprint from England; however, none of these studies examined waning of protection over time after two doses. Five studies (mRNA vaccines, 3-4 week interval) reported short-term two-dose VE of 49-71% in residents and 82-90% in staff. Two-dose VE was reported to be 75-88% against hospitalisation, 87-97% against death, and 86% against either outcome. An English study of residents (Pfizer or AstraZeneca, 8-12 week interval) reported 73% VE against infection and noted VE waning from 7 weeks after the first dose, but did not examine waning after the second dose. All of these studies were set prior to emergence of the Delta variant and did not examine waning of immunity due to short lengths of follow-up after Dose 2. Only one study (USA) compared Pfizer/Moderna two-dose VE against infection in LTCF residents before (67·5% [60·1-73·5%]) and during (53·1% [49·1-56·7%]) Delta variant predominance; however, the authors could not access vaccination dates and therefore did not account for waning of immunity over time; they also did not examine severe clinical outcomes.
We identified only one correspondence piece from Israel (Pfizer, 3-4 week interval) describing the benefit of a third booster dose in LTCFs; it reported relative rate reductions of 71% for infection and 80% for hospitalisation in the period after booster roll-out. However, individual-level VE estimates by time since vaccination were not reported, and adjustment for prior infection was not undertaken.
Overall, there was a paucity of data on non-mRNA vaccines, waning of immunity over time after two doses, and VE following a third (booster) dose in LTCF populations, which we address in this study.
Added value of this study: We report findings from a prospective cohort study that includes 15,518 residents and 19,515 staff from 331 LTCFs across England, who underwent routine PCR testing 2-3 times per month, examining SARS-CoV2 vaccine effectiveness over 12 months (Dec 8, 2020-Dec 11, 2021), the longest duration of follow-up of any study in this vulnerable cohort. We evaluated the effectiveness of first, second, and booster vaccine doses of AstraZeneca, Pfizer, and Moderna against infection, hospitalisation, and death over the 12 months when the Alpha and Delta variants were dominant. Our findings affirm that complete vaccination with two doses of AstraZeneca or mRNA vaccines offers moderate protection against infection and high protection against severe clinical outcomes; however, this protection declines over time, particularly for residents. A third booster dose of an mRNA vaccine restores, and indeed maximises, VE to 71·6% (53·5, 82·7) and 78·3% (70·1, 84·3) against infection, and 89·9% (80·0, 94·6) and 95·8% (50·4, 99·6) against hospitalisation, for residents and staff respectively, and to 97·5% (88·1, 99·5) against death for residents, with similar protection offered after the third dose irrespective of primary course type.
This is the first study to examine and describe waning of immunity over a one-year period, as well as vaccine effectiveness of a booster dose, in a large cohort of LTCF staff and residents.
Implications of all the available evidence: Taken together, our findings indicate high short-term immunity against SARS-CoV2 infection and very high immunity against severe clinical outcomes of COVID-19 for LTCF residents and staff following vaccination. However, substantial waning in vaccine-derived immunity is seen beyond 3 months, irrespective of vaccine type, suggesting the need for regular boosting to maintain protection in this vulnerable cohort. Although this analysis took place in the pre-Omicron period, these trends of waning immunity over time are likely to be generalisable across variants, carrying important implications for long-term vaccination policy in LTCFs. Ongoing surveillance in this vulnerable cohort remains crucial to describe further changes in vaccine-induced immunity, particularly in the context of new variants. | infectious diseases |
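The VE percentages in this record are conventionally derived from a Cox model's hazard ratio as VE = (1 − HR) × 100%, with the CI bounds swapping places. A minimal sketch of that conversion follows; the HR values are hypothetical back-calculations chosen to reproduce the reported resident estimate, not figures taken from the study:

```python
def vaccine_effectiveness(hr: float, hr_lo: float, hr_hi: float):
    """Convert a hazard ratio and its 95% CI into vaccine effectiveness.

    VE = (1 - HR) * 100. The upper HR bound maps to the lower VE bound
    and vice versa, so the interval is flipped.
    """
    ve = (1 - hr) * 100
    ve_lo = (1 - hr_hi) * 100
    ve_hi = (1 - hr_lo) * 100
    return round(ve, 1), round(ve_lo, 1), round(ve_hi, 1)

# Hypothetical HR of 0.493 (0.287, 0.845) maps to VE 50.7% (15.5, 71.3)
print(vaccine_effectiveness(0.493, 0.287, 0.845))  # → (50.7, 15.5, 71.3)
```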
10.1101/2022.03.09.22272155 | Identification and Quantification of Bioactive Compounds Suppressing SARS-CoV-2 Signals in Wastewater-based Epidemiology Surveillance | Recent SARS-CoV-2 wastewater-based epidemiology (WBE) surveillance efforts have documented a positive correlation between the number of COVID-19 patients in a sewershed and the level of viral genetic material in the wastewater. Efforts have been made to use the wastewater SARS-CoV-2 viral load to predict the infected population within each sewershed using a multivariable regression approach. However, clear and sustained variability in SARS-CoV-2 viral load among treatment facilities receiving industrial wastewater has made clinical prediction challenging. Several classes of molecules released by regional industries and manufacturing facilities, particularly the food processing industry, can significantly suppress SARS-CoV-2 signals in wastewater by breaking down the lipid bilayer of viral membranes. Therefore, a systematic ranking process in conjunction with metabolomic analysis was developed to identify the wastewater treatment facilities exhibiting SARS-CoV-2 suppression and to identify and quantify the chemicals suppressing the SARS-CoV-2 signals. By ranking the viral load per diagnosed case among the sewersheds, we identified the wastewater treatment facilities in Missouri, USA that exhibit SARS-CoV-2 suppression (significantly lower than 5 × 10^11 gene copies/reported case) and determined their suppression rates. Through both untargeted global chemical profiling and targeted analysis of wastewater samples, 40 compounds were identified as candidates for SARS-CoV-2 signal suppression. Among these compounds, 14 had higher concentrations in wastewater treatment facilities that exhibited SARS-CoV-2 signal suppression compared to the unsuppressed control facilities.
Stepwise regression analyses indicated that 4-nonylphenol, palmitelaidic acid, sodium oleate, and polyethylene glycol dioleate are positively correlated with SARS-CoV-2 signal suppression rates. Suppression activities were further confirmed by incubation studies, and the suppression kinetics for each bioactive compound were determined. According to the results of these experiments, bioactive molecules in wastewater can significantly reduce the stability of SARS-CoV-2 genetic marker signals. Based on the concentrations of these chemical suppressors, a correction factor could be developed to achieve more reliable and unbiased surveillance results for wastewater treatment facilities that receive wastewater from similar industries. | epidemiology |
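The facility-ranking step described above, flagging sewersheds whose viral load per diagnosed case falls below the 5 × 10^11 gene copies/reported case cutoff, can be sketched as follows; the facility names and load figures are hypothetical illustrations, not study data:

```python
# Hypothetical gene-copy loads and reported case counts per facility
facilities = {
    "plant_A": {"viral_load": 8.0e12, "cases": 10},  # 8.0e11 gc/case
    "plant_B": {"viral_load": 2.0e12, "cases": 20},  # 1.0e11 gc/case
    "plant_C": {"viral_load": 6.0e12, "cases": 12},  # 5.0e11 gc/case
}

THRESHOLD = 5e11  # gene copies per reported case, per the study's cutoff


def rank_by_load_per_case(data):
    """Rank facilities by viral load per reported case (ascending) and
    flag those below the suppression threshold."""
    per_case = {name: d["viral_load"] / d["cases"] for name, d in data.items()}
    ranked = sorted(per_case.items(), key=lambda kv: kv[1])
    flagged = [name for name, load in ranked if load < THRESHOLD]
    return ranked, flagged


ranked, flagged = rank_by_load_per_case(facilities)
print(flagged)  # only plant_B falls strictly below the cutoff
```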
10.1101/2022.03.09.22272147 | Associations between maternal infections during pregnancy and childhood IQ scores using the ALSPAC birth cohort | Maternal prenatal infections have been linked to children's neurodevelopment and cognitive outcomes. It remains unclear, however, whether infections occurring during specific vulnerable gestational periods can affect children's cognitive outcomes. The study aimed to examine maternal infections in each trimester of pregnancy and associations with children's verbal, performance, and total IQ scores. The ALSPAC birth cohort was used to investigate associations between maternal infections in pregnancy and childhood IQ outcomes. Infection data from mothers and cognition data from children were included, with the final study sample comprising 7,410 mother-child participants. Regression analysis was used to examine links between maternal infections occurring in each trimester of pregnancy and children's cognitive IQ scores at 18 months, 4 years, and 8 years. Infections in the third trimester were significantly associated with decreased verbal IQ at age 4 (p<.05, adjusted R2 = .004), and with decreased verbal IQ (p<.01, adjusted R2 = .001), performance IQ (p<.01, adjusted R2 = .0008), and total IQ (p<.01, adjusted R2 = .001) at age 8. Results suggest that later maternal infections could have a latent effect on cognitive development, only emerging when cognitive load increases over time, though the magnitude of effect appears to be small. Performance IQ may be more vulnerable to trimester-specific exposure to maternal infection than verbal IQ. Future research could examine potential mediating mechanisms on childhood cognition, such as possible moderating effects of early childhood environmental factors, and whether effects persist in later cognitive outcomes. | epidemiology |
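The very small adjusted R² values reported above (e.g. .004) correspond to effects explaining well under 1% of variance. Adjusted R² penalises the raw R² for the number of predictors; a minimal sketch follows, where the raw R² and single-predictor count are hypothetical values chosen only to be consistent with the reported sample size of 7,410:

```python
def adjusted_r2(r2: float, n: int, p: int) -> float:
    """Adjusted R^2 for a regression with n observations and p predictors:
    1 - (1 - R^2) * (n - 1) / (n - p - 1)."""
    return 1 - (1 - r2) * (n - 1) / (n - p - 1)

# With n = 7,410 and one predictor, a raw R^2 of ~0.0041
# adjusts to roughly the reported 0.004
print(round(adjusted_r2(0.0041, 7410, 1), 4))  # prints 0.004
```

At this sample size the penalty is tiny; with few observations or many predictors, adjusted R² can fall well below the raw value or even go negative.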
10.1101/2022.03.10.22271048 | Accumulation of dihydrosphingolipids and neutral lipids is related to steatosis and fibrosis damage in human and animal models of non-alcoholic fatty liver disease | Background: Dihydrosphingolipids are lipid molecules biosynthetically related to ceramides. An increase in ceramides is associated with enhanced fat storage in the liver, and inhibition of their synthesis is reported to prevent the appearance of steatosis in animal models. However, the precise association of dihydrosphingolipids with non-alcoholic fatty liver disease (NAFLD) is yet to be established. We employed a diet-induced NAFLD mouse model to study the association between this class of compounds and disease progression.
Methods: Mice were fed a high-fat diet enriched in cholesterol and supplemented with glucose and fructose for up to 40 weeks. A mouse subgroup was treated with carbon tetrachloride to accelerate fibrosis development. Animals were sacrificed at different time-points to reproduce the full spectrum of histological damage found in human disease, including steatosis (NAFL) and steatohepatitis (NASH) with and without significant fibrosis. Blood and liver tissue samples were obtained from patients (n=195) whose NAFLD severity was assessed histologically. Lipidomic analysis was performed using liquid chromatography-tandem mass spectrometry.
Results: Triglyceride, cholesterol ester, and dihydrosphingolipid levels were increased in the liver of model mice in association with the degree of steatosis. Dihydroceramide concentrations increased with the histological severity of the disease in liver samples of mice (0.024 ± 0.003 vs 0.049 ± 0.005, non-NAFLD vs NASH-fibrosis, p<0.0001) and patients (0.105 ± 0.011 vs 0.165 ± 0.021, p=0.0221). Several dihydroceramide and dihydrosphingomyelin species were increased in plasma of NAFLD patients and correlated with accumulation of liver triglycerides.
Conclusions: Dihydrosphingolipids accumulate in the liver in response to increased free fatty acid overload and are correlated with progressive histological damage in NAFLD. The increase in dihydrosphingolipids is related to upregulation of hepatic expression of enzymes involved in de novo synthesis of ceramides.
Highlights:
- Neutral lipids and dihydrosphingolipids accumulate in the liver in correlation with the histological severity of NAFLD in both mice and humans.
- The ceramide pathway is stimulated to alleviate the free fatty acid excess in the liver of NAFLD models.
- Appearance of significant fibrosis is associated with reduced concentrations of neutral lipids, but not dihydrosphingolipids, in a mouse model of NAFLD. | gastroenterology |
10.1101/2022.03.09.22272162 | Preliminary Rasch analysis of the Multidimensional Assessment of Interoceptive Awareness in adults with stroke | Purpose: The Multidimensional Assessment of Interoceptive Awareness (MAIA) measures interoceptive body awareness, which includes aspects such as attention regulation, self-regulation, and body listening. Our purpose was to validate the MAIA in adults with stroke using Rasch Measurement Theory.
Methods: The original MAIA has 32 items grouped into eight separately scored subscales that measure aspects of body awareness. Using Rasch Measurement Theory, we evaluated the unidimensionality of the entire scale and investigated person and item fit, person separation reliability, targeting, local item dependence, and principal components analysis of residuals.
Results: Forty-one adults with chronic stroke (average 3.8 years post-stroke; 13 women; average age 57±13 years) participated in the study. Overall fit (χ2=62.26, p=0.26) and item fit were obtained after deleting 3 items and rescoring 26 items. One participant did not fit the model (2.44%). There were no floor (0.00%) or ceiling (0.00%) effects. Local item dependence was found in 42 pairs. The person separation reliability was 0.91, and the person mean location was 0.06±1.12 logits.
Conclusions: The MAIA demonstrated good targeting and reliability, as well as good item and person fit, in adults with chronic stroke. A study with a larger sample size is needed to validate our findings. | rehabilitation medicine and physical therapy |
10.1101/2022.03.09.22272158 | Distinct immune-effector and metabolic profile of CD8+ T Cells in patients with autoimmune polyarthritis induced by therapy with immune-checkpoint inhibitors | Objectives: Rheumatic immune-related adverse events (irAE) such as (poly)arthritis in patients undergoing immune checkpoint inhibitor (ICI) treatment pose a major clinical challenge. ICI therapy improves CD8+ T cell (CD8) function, but CD8 contribute to chronic inflammation in autoimmune arthritis (AA). Thus, we studied whether immune-functional and metabolic changes in CD8 explain the development of musculoskeletal irAE in ICI-treated patients.
Methods: Peripheral CD8 obtained from ICI-treated patients with and without musculoskeletal irAEs, and from AA patients with and without a history of malignancy, were stimulated in media containing 13C-labeled glucose, with and without Tofacitinib. Changes in metabolism, immune-mediator release, expression of effector cell-surface molecules, and inhibition of tumor cell growth were quantified.
Results: CD8 from irAE patients showed significantly lower frequency and expression of cell-surface molecules characteristic of activation, effector functions, homing, exhaustion, and apoptosis, and reduced release of cytotoxic and pro-inflammatory immune mediators, compared to CD8 from ICI-treated patients who did not develop irAE. This was accompanied by a lower glycolytic rate. Gene-expression analysis of pre-ICI-treatment CD8 revealed over 30 differentially expressed transcripts in patients who later developed musculoskeletal irAEs. In vitro Tofacitinib treatment did not significantly change the immune-metabolic profile nor the capacity to inhibit the growth of the human lung-cancer cell line H838.
Conclusions: Our study shows that CD8 from ICI-treated patients who develop a musculoskeletal irAE have a distinct immune-effector and metabolic profile from those who remain irAE-free. This specific irAE profile overlaps with that observed in CD8 from AA patients and may prove useful for novel therapeutic strategies to manage ICI-induced irAEs.
Key messages
What is already known about this subject?
- Immune-checkpoint inhibition (ICI) therapies have a high success rate regarding progression-free and overall survival of cancer patients. However, up to 20% of ICI-treated patients develop musculoskeletal immune-related adverse events (irAE) that are often associated with severely reduced quality of life.
- To avoid premature ICI-treatment termination, strategies to treat rheumatic irAE have to be simultaneously efficient in curbing musculoskeletal symptoms without interfering with the antitumoral therapy.
- CD8+ T cells play a pivotal role both in arthritis pathogenesis and antitumoral responses.
What does this study add?
- Immuno-functional and metabolic analysis of peripheral CD8+ T cells from patients with musculoskeletal irAEs revealed that they share a common profile with those from patients with chronic autoimmune polyarthritis (AA) but are distinct from those of ICI-treated patients who remained irAE-free.
- CD8+ T cells from irAE patients treated in vitro with the JAK-pathway inhibitor Tofacitinib still maintained the capacity to release cytokines and cytolytic molecules, express immune-effector cell surface molecules, and prevent the growth of a human lung-cancer cell line.
How might this impact on clinical practice or future developments?
- The specific immuno-functional and metabolic profile in rheumatic irAEs, and its overlap with the AA profile, is a potential starting point for a better understanding of the pathogenesis and for identification of ICI-treated patients at risk of developing an irAE.
- JAK inhibitors may expand the thus far limited therapeutic armamentarium to cope with severe, refractory, and/or chronic rheumatic irAEs. | rheumatology |
10.1101/2022.03.09.22272149 | The Social Factors, Epigenomics, and Lupus in African American Women (SELA) study: protocol for an observational mechanistic study examining the interplay of multiple individual and social factors on lupus outcomes in a health disparity population | Introduction: Despite the disproportionate impact of systemic lupus erythematosus (SLE) on historically marginalized racial and ethnic communities, the individual and sociocultural factors underlying these health disparities remain elusive. We report the design and methods for a study aimed at identifying the epigenetic mechanisms by which risk and resiliency social factors affect gene function and thereby influence SLE in a health disparity population.
Methods and analysis: The Social Factors, Epigenomics, and Lupus in African American Women (SELA) study is a cross-sectional, case-control study involving the Medical University of South Carolina, Emory University, and Wake Forest School of Medicine. A total of 600 self-reported African American females will be invited to participate. All participants will respond to questionnaires that capture detailed sociodemographic and medical history; validated measures of racial discrimination, vicarious racism stress, and social support; healthcare utilization and lost productivity; as well as disease activity and damage for cases. Physician-reported disease activity will also be incorporated. Participants will choose whether they wish to receive their genetic ancestry estimates and be involved in research. Blood samples will be collected to provide serum, plasma, PBMC counts, DNA, and RNA. The primary goals of SELA are to identify variation in DNA methylation (DNAm) associated with self-reported exposure to racial discrimination and to social support, to evaluate whether social DNAm sites affect gene expression, to identify the synergistic effects of social factors on DNAm changes in SLE, and to develop a social factors-DNAm predictive model for disease outcomes. This study was approved by and will be conducted in cooperation with the Sea Island Families Project Citizen Advisory Committee.
Discussion and dissemination: SELA will respond to the pressing need to identify the regulatory mechanisms through which social exposures influence SLE in a health disparity population, clarifying the interplay and underlying mechanisms by which various positive and negative social determinants of health influence epigenomic variation, and how the resulting biological changes may contribute to the lupus health disparity. Results will be published and shared with patients and the community. These findings may inform the development of psychosocial interventions that prevent or mitigate risk exposures, and of services or interventions that promote positive exposures. Development of these novel treatments and preventative interventions, as informed by the results of this study, is paramount to closing the health disparities gap. | rheumatology |
10.1101/2022.03.09.22272149 | The Social Factors, Epigenomics, and Lupus in African American Women (SELA) study: protocol for an observational mechanistic study examining the interplay of multiple individual and social factors on lupus outcomes in a health disparity population | IntroductionDespite the disproportional impact of systemic lupus erythematosus (SLE) on historically marginalized racial and ethnic communities, the individual and sociocultural factors underlying these health disparities remain elusive. We report the design and methods for a study aimed at identifying the epigenetic mechanisms by which risk and resiliency social factors affect gene function and thereby influence SLE in a health disparity population.
Methods and analysisThe Social Factors, Epigenomics, and Lupus in African American Women (SELA) study is a cross-sectional, case-control study involving the Medical University of South Carolina, Emory University, and Wake Forest School of Medicine. A total of 600 self-reported African American females will be invited to participate. All participants will respond to questionnaires that capture detailed sociodemographic and medical history, validated measures of racial discrimination, vicarious racism stress, social support, healthcare utilization and lost productivity, as well as disease activity and damage for cases. Physician-reported disease activity will also be incorporated. Participants will choose if they wish to receive their genetic ancestry estimates and be involved in research. Blood samples are required to provide serum, plasma, PBMC counts, DNA and RNA. The primary goals of SELA are to identify variation in DNA methylation (DNAm) associated with self-reported exposure to racial discrimination and exposure to social support, to evaluate whether social DNAm sites affect gene expression, to identify the synergistic effects of social factors on DNAm changes on SLE, and to develop a social factors-DNAm predictive model for disease outcomes. This study was approved by and will be conducted in cooperation with the Sea Island Families Project Citizen Advisory Committee.
Discussion and disseminationSELA will respond to the pressing need to identify the regulatory mechanisms through which social exposures influence SLE in a health disparity population, clarify the interplay and underlying mechanisms by which various positive and negative social determinants of health influence epigenomic variation, and elucidate how the resulting biological changes may contribute to the lupus health disparity. Results will be published and shared with patients and the community. These findings may inform the development of psychosocial interventions that prevent or mitigate risk exposures, and services or interventions that promote positive exposures. Development of these novel treatments and preventative interventions, as informed by the results of this study, is paramount to the closure of the health disparities gap. | rheumatology |
10.1101/2022.03.09.22272131 | Parkinson's disease causality and heterogeneity: a proteogenomic view | Pathogenesis and clinical heterogeneity in Parkinson's disease (PD) have been evaluated from genetic, pathological, and clinical perspectives. Technology allowing for high-throughput proteomic analyses in cerebrospinal fluid (CSF) has opened new opportunities to scrutinize this heterogeneity. This is to date the most comprehensive proteomics profiling in CSF of PD patients and controls, both in terms of subjects (n=1103) and proteins (n=4135).
Combining CSF aptamer-based proteomics with genetics across all samples, we determined the protein quantitative trait loci (pQTLs) linking genetic variants with protein abundance in CSF. Analyzing pQTLs together with summary statistics from the largest PD genome-wide association study (GWAS) led us to identify 68 potential causal proteins by Mendelian randomization. The top PD causal protein, GPNMB, has colocalization support and has been reported to be upregulated in the substantia nigra of PD patients.
We also examined three subcohorts of PD patients: LRRK2 variant carriers (LRRK2+), GBA variant carriers (GBA+) and idiopathic PD patients, each with their respective controls. The second goal was to identify proteomic differences between PD patients and controls within and between the three subcohorts. Statistical analyses revealed six proteins differentially expressed when comparing GBA+ PD patients with unaffected GBA+ controls, seven proteins when comparing LRRK2+ PD patients with unaffected LRRK2+ controls and 23 proteins when comparing idiopathic PD patients with healthy controls who did not carry any severe PD mutations.
Furthermore, a hypothesis-free stratification of idiopathic PD patients based on their proteomic profiles revealed two protein modules derived from the co-expression network structure. Cluster analysis based on these modules revealed two patient endotypes for idiopathic PD.
Differences in the CSF proteomic signature between subcohorts and between idiopathic endotypes, as well as causal targets identified using both proteomics and genetics may together influence the way we approach the identification of potential therapeutic targets in PD. | neurology |
10.1101/2022.03.10.22272119 | Digital gait biomarkers, but not clinical ataxia scores, allow to capture 1-year longitudinal change in Spinocerebellar ataxia type 3 (SCA3) | Measures of step variability and body sway during gait have been shown to correlate with clinical ataxia severity in several cross-sectional studies. However, to serve as a valid progression biomarker, these gait measures have to prove their sensitivity to robustly capture longitudinal change, ideally within short time-frames (e.g. one year). We present the first multi-center longitudinal gait analysis study in spinocerebellar ataxias (SCAs). We performed a combined cross-sectional (n=28) and longitudinal (1-year interval, n=17) analysis in SCA3 subjects (including 7 pre-ataxic mutation carriers). Longitudinal analysis revealed significant change in gait measures between baseline and 1-year follow-up, with high effect sizes (stride length variability: p=0.01, effect size rprb=0.66; lateral sway: p=0.007, rprb=0.73). Sample size estimation for lateral sway reveals a required cohort size of n=43 for detecting a 50% reduction of natural progression, compared to n=240 for the clinical ataxia score SARA. These measures thus present promising motor biomarkers for upcoming interventional studies. | neurology |
10.1101/2022.03.09.22272063 | Deciphering the heterogeneous niche in the tumor progression of hepatocellular carcinoma: a Spatial single-cell landscape and multi-omics atlas analysis | Hepatocellular carcinoma (HCC) is an invasive disease characterized by a highly heterogeneous molecular phenotype, a rich blood supply, and a unique immune niche; it is therefore of great significance to explore the heterogeneous tumor niche and the clonal evolution of these malignant cells. Drawing on advances in single-cell technology, spatial transcriptomics, and Oxford Nanopore technology, this study reconstructed and delineated the heterogeneity of the HCC tumor niche and its pattern of tumor progression. Our results showed that the copy number variation (CNV) profiles of cells in cancer lesions and liver cirrhosis lesions of the same patient are essentially the same and are mainly regulated by transcription factors such as TP53, HOXA7, FOXN3, and PPARG, suggesting that malignant cells of common origin gradually evolve into different lesions with only rare differences in CNVs; these differences are mainly regulated by expression patterns and mediate the heterogeneity between the tumor and cirrhosis lesions. Angiogenesis-related genes (SREBF1, ZNF585A, and HOXB5) may mediate communication between HCC subpopulations and endothelial cells via exosomes, thereby contributing to the angiogenic niche before HCC metastasis. In addition, numerous CNVs were found in patients with early recurrent HCC, and these mutated genes are potential niche genes for early tumor recurrence. In summary, this study provides a general transcriptional landscape of the ecological structure of HCC, systematically maps the molecular, cellular, and spatial composition of different HCC cell niches, and provides a scientific and theoretical basis at the molecular and cellular levels for personalized and accurate treatment strategies for HCC. | oncology |
10.1101/2022.03.07.22272003 | Higher polygenic risk for melanoma is associated with improved survival | BackgroundAlthough there are well-known prognostic factors for survival from cutaneous melanoma (CM) such as primary tumour thickness and stage of the tumour at diagnosis, the role of germline genetic factors in determining survival is not well understood.
ObjectiveTo perform a genome-wide association study (GWAS) meta-analysis of melanoma-specific survival (MSS), and test whether a CM-susceptibility polygenic risk score (PRS) is associated with MSS.
MethodsWe conducted two Cox proportional-hazard GWAS of MSS using data from the Melanoma Institute Australia (MIA; 5,762 patients with melanoma; 800 deaths from melanoma) and UK Biobank (UKB: 5,220 patients with melanoma; 241 deaths from melanoma). The GWAS were adjusted for age, sex and the first ten genetic principal components, and combined in a fixed-effects inverse-variance-weighted meta-analysis. Significant (P<5x10-8) results were investigated in the Leeds Melanoma Cohort (LMC; 1,947 patients with melanoma; 370 melanoma deaths). We also developed a CM-susceptibility PRS using a large independent GWAS meta-analysis (23,913 cases, 342,870 controls). The PRS was tested for an association with MSS in the MIA and UKB cohorts, with replication in the LMC.
ResultsTwo loci were significantly associated with MSS in the meta-analysis of MIA and UKB with lead SNPs rs41309643 (G allele frequency 1.6%, hazard ratio [HR] 2.09, 95% confidence interval [CI] 1.61-2.71, P=2.08x10-8) on chromosome 1, and rs75682113 (C allele frequency 1.8%, HR=2.38, 95% CI=1.77-3.21, P=1.07x10-8) on chromosome 7. While neither SNP replicated (P>0.05) in the LMC, rs75682113 was significantly associated in the combined discovery and replication sets and requires confirmation in additional cohorts.
After adjusting for age at diagnosis, sex and the first ten principal components, a one standard deviation increase in the CM-susceptibility PRS was associated with improved MSS in the discovery meta-analysis (HR=0.88, 95% CI=0.83-0.94, P=6.93x10-5; I2=88%). The association with the PRS was not replicated (P > 0.05) in LMC, but remained significantly associated with MSS in the meta-analysis of the discovery and replication results.
ConclusionWe found two loci potentially associated with MSS, and evidence that increased germline genetic susceptibility to develop CM may be associated with improved MSS. | oncology |
10.1101/2022.03.09.22272106 | The effect of focused muscle contraction therapy on chronic pain and Brodmann Area activity in former National Football League players | NFL players have a traumatic injury rate approaching 100%; chronic pain with decreased concentration occurs commonly. This study examined the role of a novel focused muscle contraction therapy for the treatment of chronic pain and identified its impact on brain activity. Chronic pain was assessed by numerical score, neuropathic component, and impact on daily activities in 8 retired players. Brain activity was characterized by QEEG with low-resolution electromagnetic tomography analysis and functional measures of visual and auditory attention. Focused muscle contraction therapy was administered twice weekly for 6 months and tapered to twice monthly by 12 months. Brodmann Areas (BA) 4 and 9, known to be associated with chronic pain, showed values outside the clinically normal range; mean pain duration was 16.5 ± 12.9 years. At 6 months, 5/8 subjects reported pain scores of 0. High beta wave activity was seen in BA 19, 21, 29, 30, and 39, affecting auditory, visual, and body perceptions. Clinically relevant improvements were observed in auditory attention and visual stamina. Pain relief was sustained through 18 months of follow-up. Focused muscle contraction therapy appears to redirect brain activity to new areas of activity which are associated with long-lasting relief of chronic pain and its detriments. This study was registered with clinicaltrials.gov, #NCT04822311. | pain medicine |
10.1101/2022.03.09.22272156 | The impact of the one child policy on China's infant mortality from 1970-1989: A quasi-experimental study | BackgroundChina's One Child Policy (OCP), enforced in 1980, was the largest fertility control policy in human history, and its effects are still widely debated. Previous analyses of the OCP have focused on population reduction and sex ratios, among other outcomes. This study evaluates the impact of the OCP on infant mortality in China using quasi-experimental methods.
MethodsCountry-level panel data for China and 106 other countries were extracted from the World Bank data repository for the years 1970-1989. The primary outcome was infant mortality rate (IMR), defined as deaths among children <1 year of age per 1000 live births. The impact of the OCP was estimated using uncontrolled and controlled interrupted time series (ITS) designs that control for within-group time-varying characteristics. Models were further adjusted for a comprehensive set of covariates and corrected for first-order autocorrelation. Countries included in the control groups were identified using the method of synthetic controls and propensity score matching (PSM). Valid control groups reduced threats from selection bias and unobserved time-invariant confounders.
ResultsAdjusting for GDP per capita, GDP growth, women's education, and population density, IMR decreased on average by 2.5 deaths per 1,000 annually before the OCP was enforced in China. The pre-OCP trends in IMR moved in tandem across China, synthetic and PSM controls, validating the parallel trends assumption. After the OCP, the trend in IMR in China increased by 2.24 (95% CI 1.39-3.09) per year, compared to the IMR trend in the pre-OCP period, comparing differences in China to synthetic controls. Results were similar and confirmed across PSM (1.96, 0.56-3.37) and uncontrolled ITS (2.14, 1.39-2.89) models. No significant immediate level change in IMR was observed in the controlled ITS models.
ConclusionsThe reduction in IMR slowed down significantly after the OCP was enforced in China. The slowdown is likely due to a relative increase in the ratio of the IMR of females to males, suggesting that practices such as female infanticide, abandonment, and neglect, stemming from a strong son preference, were primary contributors. Coercive policies to reduce fertility can have unintended consequences. | health policy |
10.1101/2022.03.09.22272171 | Vaccine-induced antibody level predicts the clinical course of breakthrough infection of COVID-19 caused by delta and omicron variants: a prospective observational cohort study | BackgroundOmicron variant viruses spread rapidly, even in populations with high vaccination rates. This study aimed to determine the utility of the antibody against the spike protein level as a predictor of the disease course of COVID-19 in vaccinated patients.
MethodsBetween 11 December 2021 and 10 February 2022, we performed a prospective observational cohort study in South Korea, which included patients infected with delta and omicron variants. Multivariable logistic regression analysis was conducted to determine the association between antibody levels and outcomes. The relationship between antibody levels and cycle threshold (Ct) values was confirmed using a generalised linear model.
ResultsFrom 106 vaccinated patients (39 delta and 67 omicron), the geometric mean titres of antibodies in patients with fever (≥37.5°C), hypoxia (≤94% SpO2), pneumonia, C-reactive protein (CRP) elevation (>8 mg/L), or lymphopenia (<1,100 cells/µL) were 1,201.5 U/mL, 98.8 U/mL, 774.1 U/mL, 1,335.1 U/mL, and 1,032.2 U/mL, respectively. Increased antibody levels were associated with a decreased occurrence of fever (adjusted odds ratio [aOR], 0.23; 95% confidence interval [CI], 0.12-0.51), hypoxia (aOR, 0.23; 95% CI, 0.08-0.7), CRP elevation (aOR, 0.52; 95% CI, 0.29-0.94), and lymphopenia (aOR, 0.57; 95% CI, 0.33-0.98). Ct values showed a positive correlation with antibody levels (P=0.02).
ConclusionAntibody levels are predictive of the clinical course of COVID-19 in vaccinated patients with delta and omicron variant infections. Our data highlight the need for concentrated efforts to monitor patients with SARS-CoV-2 infection who are at risk of low antibody levels.
SummaryIn this prospective observational cohort study, antibody level predicted the clinical course of breakthrough COVID-19 infection. Fever (aOR 0.23 [0.12-0.51]), hypoxia (aOR 0.23 [0.08-0.7]), CRP elevation (aOR 0.52 [0.29-0.94]), and lymphopenia (aOR 0.57 [0.33-0.98]) were inversely correlated with antibody levels. | infectious diseases |
10.1101/2022.03.09.22272145 | Bacterial culture use, etiology and antibiotic susceptibility of common bacterial infections in Indonesian hospitals in 2019 | ObjectivesTo describe the use of bacterial cultures, and the etiology and antibiotic susceptibility of common high-priority bacteria isolated from hospitalized patients in Jakarta, Indonesia.
MethodsWe conducted a hospital-wide cross-sectional study of all inpatients receiving systemic antibiotic treatment (WHO ATC J01) in six hospitals in 2019, capturing routine data on antibiotic treatment and cultures. We reported bug-drug combinations for Escherichia coli and the ESKAPE group of bacteria.
Results562 patients (52% women, median age 46 years) had 587 diagnoses, with pneumonia (258, 44%) most common. One or more culture specimens were taken in 38% (215/562) overall, a sputum culture in 25% (64/258) of pneumonia patients; and a blood culture in 52% (16/31) of sepsis patients. 50% of positive blood culture results were reported after 4 days. From 670 culture specimens, 279 bacteria were isolated, 214 (77%) were Gram-negative, including Klebsiella pneumoniae (70, 25%), Pseudomonas aeruginosa (36, 13%), and E. coli (21, 11%). Resistance included third-generation cephalosporin-resistant K. pneumoniae (77%), E. coli (65%) and Enterobacter spp (81%); carbapenem-resistant K. pneumoniae (26%), P. aeruginosa (24%), E. coli (33%), Acinetobacter spp (57%), and Enterobacter spp (60%); and meticillin-resistant S. aureus (71%). Vancomycin-resistant S. aureus (0%) and Enterococcus faecalis (12%) were uncommon. Multi-drug resistance was 30% for K. pneumoniae, 29% for P. aeruginosa, 49% for E. coli, 42% for Acinetobacter spp, and 71% for S. aureus.
ConclusionsIn Indonesian hospitals, bacterial cultures were underused and antibiotic resistance is at alarming levels. Enhanced context-specific infection prevention, diagnostic and antibiotic stewardship interventions are urgently needed. | infectious diseases |
10.1101/2022.03.09.22272154 | Methadone Distribution Increased from 2010 to 2019 for Opioid Use Disorder Treatment in the US | ObjectivesTo identify US prescription trends in methadone distribution for OUD from 2010 to 2020.
MethodsThe weight of methadone in grams distributed to OTPs per state was derived from the US Drug Enforcement Administration's Automated Reports and Consolidated Ordering System. Methadone was adjusted for state population and compared across all fifty states and Washington DC from 2010 to 2020.
ResultsThe overall distribution of methadone to OTPs significantly (P < 0.0001) increased from 2010 to 2019 (+61.0%) and from 2015 to 2020 (+26.22%). The states with the highest percent change from 2010 to 2020 were Montana (+897.02%), Alaska (+421.11%), and Vermont (+353.67%). In contrast to the prior increase in distribution, there was no significant change in the distribution of methadone to OTPs from 2019 (pre-COVID-19 pandemic) to 2020 (during the pandemic) (-0.09%). Ohio (+26.02%) significantly increased while Alabama (-21.96%), New Hampshire (-24.13%) and Florida (-28.97%) significantly decreased methadone distribution relative to the national mean.
ConclusionsThis investigation revealed two trends related to methadone distribution in the US: increased utilization over the past decade and a plateau in utilization from 2019 to 2020. Policies are needed to remove access barriers to methadone treatment during the COVID-19 pandemic in order to reduce the worsening crisis of opioid overdoses in the US. | addiction medicine |
10.1101/2022.03.09.22272168 | Long COVID and its associated factors among COVID survivors in the community from a middle-income country: an online cross-sectional study | IntroductionPatients with COVID-19 usually recover and return to normal health; however, some patients may have symptoms that last for weeks or even months after recovery. This persistent state of ill health is known as Long COVID if it continues for more than 12 weeks and is not explained by an alternative diagnosis. Long COVID has been overlooked in low- and middle-income countries. Therefore, we conducted an online survey among COVID-19 survivors in the community to explore their Long COVID symptoms, factors associated with Long COVID, and how Long COVID affected their work.
MethodsThis was a cross-sectional study conducted from July to September 2021, during the implementation of a nationwide movement control order (MCO). Data were collected using the REDCap electronic data capture tool. The questionnaire was distributed via social and news media and covered socio-demographic characteristics, existing comorbidities, self-perception of health, information on the acute COVID-19 condition and treatment received, symptoms and duration of the post-COVID condition, and effects on occupation. Results: A total of 732 COVID-19 survivors responded. Respondents were slightly more likely to be female (58.7%), younger, and more highly educated. More than half of them were overweight or obese, and about two-thirds were free of comorbidities. Among these respondents, about 56% had no or mild symptoms during their acute COVID-19 condition. A total of 21.1% of the respondents reported experiencing Long COVID. The most commonly reported Long COVID symptoms were fatigue, brain fog, depression, anxiety, insomnia, and arthralgia or myalgia. Females had 58% higher odds (95% CI: 1.02, 2.45) of experiencing Long COVID. Patients with moderate and severe acute COVID-19 symptoms had ORs of 3.01 (95% CI: 1.21, 7.47) and 3.62 (95% CI: 1.31, 10.03), respectively, for Long COVID.
ConclusionThis study provides additional insight on the symptoms and duration of post-COVID symptoms as well as the factors associated with Long COVID among COVID-19 survivors in Malaysia. Recognition of Long COVID and its associated factors is important in planning prevention, rehabilitation, and clinical management to improve recovery and long-term COVID-19 outcomes. | epidemiology |
10.1101/2022.03.09.22272165 | Disentangling the effect of measures, variants and vaccines on SARS-CoV-2 Infections in England: A dynamic intensity model | In this paper, we estimate the path of daily SARS-CoV-2 infections in England from the beginning of the pandemic until the end of 2021. We employ a dynamic intensity model, where the mean intensity conditional on the past depends both on past intensity of infections and past realised infections. The model parameters are time-varying and we employ a multiplicative specification along with logistic transition functions to disentangle the time-varying effects of non-pharmaceutical policy interventions, of different variants and of protection (waning) of vaccines/boosters. We show that earlier interventions and vaccinations are key to containing an infection wave. We consider several scenarios that account for more infectious variants and different protection levels of vaccines/boosters. These scenarios show that, as vaccine protection wanes, containing a new wave in infections and an associated increase in hospitalisations in the near future will require further booster campaigns and/or non-pharmaceutical interventions. | epidemiology |
10.1101/2022.03.09.22272172 | Amputation in Kingdom of Saudi Arabia: Etiology, Characteristics and Clinical status from a Large Tertiary Rehabilitation Center | IntroductionLimb amputation significantly impacts patients' physical, emotional, and social lives. Multiple etiological factors may lead to limb amputation. Unfortunately, there is limited data about amputation in Saudi Arabia. Since Sultan Bin Abdulaziz Humanitarian City (SBAHC) is a tertiary centre receiving such patients, we collected ten years of data to examine the etiology, characteristics, and clinical impact of amputations in the Kingdom.
MethodsA retrospective study of 1,409 amputee patients' data at SBAHC collected over ten years included demographic variables, etiology, site, level, and type of amputation. We also collected Functional Independence Measure (FIM) scores for 618 patients.
ResultsMales constituted the majority of amputees (75.7%), and the average age was higher in males than in females (45 vs 36 years, p<0.001). Vascular diseases (42.3%) and trauma (36.8%) were the leading causes of amputation in this cohort. Diabetes mellitus was the most frequent comorbidity (40.5%), followed by hypertension (26.2%). Transtibial amputations were the most common (22.58%), followed by transfemoral amputations (14.12%). Traumatic transfemoral amputation was more prevalent among young adults than traumatic transtibial amputation. Trauma-related amputation cases were highest in the age group of 21-30 years (69.2%), while vascular-related amputations were highest in the age group of 70 years and above (89.5%). FIM scores improved significantly in locomotion (33.6%), followed by transfer (30.6%) and self-care (16.4%), at six months post-discharge compared to admission (all p-values <0.01).
ConclusionVascular pathology arising from chronic diseases is the primary risk factor leading to amputation and warrants primary and secondary prevention programs. Seatbelt enforcement could significantly decrease amputations related to road traffic accidents, the second most common cause of amputation. | rehabilitation medicine and physical therapy |
10.1101/2022.03.10.22272226 | A Pilot Validation Study Comparing FIBI, a Slide-Free Imaging Method, with Standard FFPE H&E Tissue Section Histology for Primary Surgical Pathology Diagnosis | IntroductionDigital pathology whole slide images (WSI) have been recently approved by the FDA for primary diagnosis in clinical surgical pathology practices. These WSI are generated by digitally scanning standard formalin-fixed and paraffin-embedded (FFPE) H&E-stained tissue sections mounted on glass microscope slides. Novel imaging methods are being developed that can capture the surface of tissue without requiring prior fixation, paraffin embedding, or tissue sectioning. One of these methods, FIBI (Fluorescence Imitating Brightfield Imaging), an optically simple and low-cost technique, was developed by our team and used in this study.
Methods100 de-identified surgical pathology samples were obtained from the UC Davis Health Pathology Laboratory. Samples were first digitally imaged by FIBI, and then embedded in paraffin, sectioned at 4 µm, mounted on glass slides, H&E stained, and scanned using the Aperio/Leica AT2 scanner. The resulting digital images from both FIBI and H&E scan sets were uploaded to PathPresenter and viewed in random order and modality (FIBI or H&E) by each of 4 reading pathologists. After a 30-day washout, the same 100 cases, in random order, were presented in the alternate modality to what was first shown, to the same 4 reading pathologists. The data set consisted, therefore, of 100 reference diagnoses and 800 study pathologist reads (400 FIBI and 400 H&E). Each study read was compared to the reference diagnosis for that case, and also compared to that reader's diagnosis across both modalities for each case. Categories of concordance, minor and major discordance were adjudicated by the study team based on established criteria.
ResultsThe combined category, concordance or minor discordance, was scored as "no major discordance." The overall agreement rate (compared to the reference diagnosis), across 800 reads, was 97.9%. This consisted of 400 FIBI reads at 97.0% vs. reference and 400 H&E reads vs. reference at 98.8%. Minor discordances (defined as alternative diagnoses without clinical treatment or outcome implications) were 6.1% overall, 7.2% for FIBI and 5.0% for H&E.
ConclusionsPathologists without specific experience or training in FIBI imaging interpretation can provide accurate diagnosis from FIBI slide-free images. Concordance/discordance rates are similar to published rates for comparisons of WSI to standard light microscopy of glass slides for primary diagnosis that led to FDA approval. The present study was more limited in scope but suggests that a follow-on formal clinical trial is feasible. It may be possible, therefore, to develop a slide-free, non-destructive approach for primary pathology diagnosis. Such a method promises improved speed, reduced cost, and better conservation of tissue for advanced ancillary studies. | pathology |
10.1101/2022.03.10.22272222 | The changing impact of vaccines in the COVID-19 pandemic | The Omicron wave has left a global imprinting of immunity which changes the COVID landscape. In this study, we simulate six hypothetical variants emerging over the next year and evaluate the impact of existing and improved vaccines. We base our study on South Africa's infection- and vaccination-derived immunity. Our findings illustrate that variant-chasing vaccines will only add value above existing vaccines in the setting where a variant emerges if we can shorten the window between variant introduction and vaccine deployment to under three weeks, an impossible time-frame without significant NPI use. This strategy may have global utility, depending on the rate of spread from setting to setting. Broadly neutralizing and durable next-generation vaccines could avert over three times as many deaths from an immune-evading variant compared to existing vaccines. Our results suggest it is crucial to develop next-generation vaccines and redress inequities in vaccine distribution to tackle future emerging variants. | public and global health
10.1101/2022.03.10.22272135 | KNOWLEDGE OF AND PRACTICES AROUND ZOONOTIC DISEASES AMONGST ACTORS IN THE LIVESTOCK TRADE IN THE LAKE VICTORIA CRESCENT ECOSYSTEM IN EAST AFRICA | BackgroundZoonotic diseases pose a direct threat to health and undercut livelihoods in the communities in which they occur. A combination of anthropogenic, animal, and ecosystem activities drives the emergence and re-emergence of zoonotic diseases. Consequently, One Health approaches are necessary to alleviate disease impacts. To be effective, understanding people's knowledge, attitudes, and practices concerning these disease threats is essential for their prevention, control, and eventual elimination. Livestock traders interact closely with livestock, which puts these traders at high risk of infection and creates conditions by which they may spread zoonotic pathogens through the animals they trade. It is, thus, essential to examine practices among actors involved in livestock trade to understand how best to mitigate these risks in a non-pastoral production system.
MethodsBusia County was selected for this study because it is a predominantly crop-producing area, with cross-border (between Kenya and Uganda) trade in livestock. A qualitative study was conducted among the actors in the livestock trade on their knowledge, attitudes, and practices that may contribute to the spread, control, and prevention of zoonotic disease transmission. A thematic analysis framework was used to categorize and synthesize data from in-depth interviews (IDIs) and key informant interviews (KIIs).
ResultsWhereas participants could list the signs of zoonotic diseases, they could not identify these diseases by name, demonstrating insufficient knowledge of zoonoses. Brucellosis, foot and mouth disease (FMD), and anthrax were broadly mentioned as diseases of importance by many actors, indicating that these are the common livestock diseases in the area. The actors identified sick animals by checking for dropped ears, excess mucus production, diarrhea, bloody urinary discharge, and general activity levels; an animal that is not actively eating or walking is taken as a sign of sickness. To manage the spread of these diseases, as preventive practices they wash their animals, isolate sick animals from the rest of the stock, and vaccinate their animals against certain diseases; as curative practices they seek help from animal health professionals for sick animals. The practices of skinning dead animals before burying them and the consumption of dead carcasses risk increasing zoonotic disease transmission. These practices are drawn from cultural values and beliefs about curses and the potential loss of the entire stock if livestock is unceremoniously disposed of upon death, irrespective of the cause.
Conclusions and recommendationsLivestock actors have agency in the prevention and elimination of zoonotic diseases; hence, they need to be involved when developing intervention programs and policies for the extension services. Training these actors as a continuum of animal health workers blends lay and professional knowledge, which, alongside their intense contact with large numbers of animals, makes them a critical disease surveillance tool; this may also play a role in supporting state actors in disease surveillance and response. There is an urgent need to increase awareness of zoonoses among livestock keepers and traders. These campaigns should deploy multi-disciplinary teams with an understanding of human health, animal health, and the social sciences so that risky but deeply rooted traditional practices like skinning and eating dead animals can be minimized. | public and global health
10.1101/2022.03.10.22272230 | Tau, β-amyloid, and glucose metabolism following service-related Traumatic Brain Injury in Vietnam war veterans: The AIBL-VETS study | Traumatic Brain Injury (TBI) is common amongst military veterans and has been associated with an increased risk of dementia. It is unclear if this is due to increased risk for Alzheimer's disease (AD) or other mechanisms. This case control study sought evidence for AD, as defined by the 2018 NIA-AA research framework, by measuring tau, β-amyloid and glucose metabolism using positron emission tomography (PET) in veterans with service-related TBI.
Seventy male Vietnam war veterans -- 40 with TBI (aged 68.0±2.5 years) and 30 controls (aged 70.1±5.3 years) -- with no prior diagnosis of dementia or mild cognitive impairment underwent β-amyloid (18F-Florbetaben), tau (18F-Flortaucipir) and 18F-FDG PET. The TBI cohort included 15 participants with mild, 16 with moderate, and 9 with severe injury. β-amyloid level was calculated using the Centiloid (CL) method and tau was measured by Standardized Uptake Value Ratios (SUVR) using the cerebellar cortex as the reference region. Analyses were adjusted for age and APOE-ε4. The findings were validated in an independent cohort from the ADNI-DOD study.
There were no significant or trending differences in β-amyloid or tau levels or 18F-FDG uptake between the TBI and control groups before and after controlling for covariates. The β-amyloid and tau findings were replicated in the ADNI-DOD validation cohort and persisted when the AIBL-VETS and ADNI-DOD cohorts were combined (114 TBI vs 87 controls in total). These findings suggest that TBI is not associated with the later-life accumulation of the neuropathological markers of AD. | neurology
10.1101/2022.03.10.22272216 | Semi-automatic segmentation of the fetal brain from Magnetic Resonance Imaging | Template-based segmentation techniques have been used for targeting deep brain structures in fetal MR images. In this study, two registration algorithms were compared to determine the optimal strategy for segmenting subcortical structures in T2-weighted images acquired during the third trimester of pregnancy. Adult women with singleton pregnancies (n=9) ranging from 35-39 weeks gestational age were recruited. Fetal MRI was performed on 1.5 T and 3 T scanners. Automatic fetal brain segmentation and volumetric reconstruction algorithms were performed on all subjects using the NiftyMIC software. An atlas of cortical and subcortical structures (36 weeks gestation) was registered into native space using ANTs (Advanced Normalization Tools) and FLIRT (FMRIB's Linear Image Registration Tool). The cerebellum and thalamus were manually segmented. Dice coefficients were calculated to validate the reliability of the automatic methods and to compare the performance of the ANTs (nonlinear) and FLIRT (affine) registration algorithms against the gold-standard manual segmentations. The Dice-kappa values for the automated labels were compared using the Wilcoxon test. Comparing cerebellum and thalamus masks against the manually segmented masks, the median Dice-kappa coefficients for ANTs and FLIRT were 0.76 (interquartile range [IQR]=0.56-0.83) and 0.65 (IQR=0.5-0.73), respectively. The Wilcoxon test (Z=4.9, P<0.01) indicated that the ANTs registration method performed better than FLIRT for the fetal cerebellum and thalamus. We found that a nonlinear registration method provided improved results compared to an affine transformation. Nonlinear registration methods may be preferable for subcortical segmentations in MR images acquired in third-trimester fetuses. | obstetrics and gynecology
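The Dice coefficient used in the segmentation study above has a simple closed form; a minimal illustrative sketch over flattened binary masks (plain Python, not the study's actual imaging pipeline):

```python
def dice_coefficient(mask_a, mask_b):
    """Dice similarity 2|A∩B| / (|A| + |B|) between two binary masks,
    given as equal-length sequences of 0/1 voxel labels."""
    a = {i for i, v in enumerate(mask_a) if v}
    b = {i for i, v in enumerate(mask_b) if v}
    if not a and not b:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2 * len(a & b) / (len(a) + len(b))
```

For example, masks [1, 1, 0, 0] and [1, 0, 1, 0] overlap in one of two labeled voxels each, giving a Dice of 0.5; identical masks score 1.0.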
10.1101/2022.03.10.22272174 | Effect of working from home on the association between job demands and psychological distress | PurposeLimited information is available about the association between workplace psychosocial factors and general mental health status among workers during the COVID-19 pandemic. This study examined how working from home affected the association between job demands and psychological distress (PD).
MethodA cross-sectional online survey was conducted in December 2020 (N=27,036). The dependent variable (PD) was assessed using the Kessler Psychological Distress Scale. Job demands were assessed using the Job Content Questionnaire. Working from home was determined by participants' responses to the question: "Do you currently work from home?" We used a two-level regression analysis adjusted for prefecture; each individual-level variable at level 1 was nested into each prefecture at level 2, stratified by working from home or not.
ResultsOverall, 21.3% of participants worked from home. The interaction between working from home and job demands was significant. Job demands were positively associated with PD. The stratified analysis showed the associations were weaker among employees who worked from home compared with those who did not.
ConclusionThe association between job demands and PD may be weakened by working from home. | occupational and environmental health |
10.1101/2022.03.10.22272232 | The Association of Scoliosis Properties with Spinal Cord Tethering: A Statistical Model for Prognostication | ObjectiveTo evaluate the relationship between the structural measures of scoliosis and underlying spinal cord tethering (SCT) and to propose a statistical prognostication model.
Study designCross-sectional.
SettingAcademic healthcare center
Methods128 definite scoliosis cases that were candidates for corrective surgery were enrolled. Anterior-posterior whole-column digital radiographs and whole-spine MRI (supine for all samples, with adjuvant prone MRI for suspected cases with tight filum terminale) were performed. Univariate and multiple logistic regression were used for the analysis of association and interaction. The association of SCT with structural features of scoliosis (Cobb angle, convexity, and type [idiopathic or congenital]), age, and sex was assessed.
ResultsNone of the study variables showed a statistical association with SCT in univariable and multiple logistic regressions. After inclusion of the Cobb angle-convexity-type interaction, higher Cobb angle, idiopathic scoliosis, dextroscoliosis, and male gender had significant effects. Stratification by convexity revealed a positive association between Cobb angle and SCT in idiopathic patients with dextroscoliosis (1.02 [1.01-1.03], p=0.049). In contrast, in congenital cases, the rate of SCT decreased with higher left-sided Cobb angles, but this was not statistically significant (0.94 [0.88-1.01], p=0.104).
ConclusionThe risk of spinal cord tethering was not zero in any of the subgroups, and no SCT-free group could be detected. Conventional MRI should be performed preoperatively for every case of scoliosis and thoroughly examined for signs of tethering. Unremarkable imaging in patients at higher risk of SCT should not be considered decisive, and further workup should be performed before proceeding with reconstructive surgery. | orthopedics
10.1101/2022.03.10.22272204 | Cellular and humoral immune response to a third dose of BNT162b2 COVID-19 vaccine - a prospective observational study | BackgroundSince the introduction of various vaccines against SARS-CoV-2 at the end of 2020, rates of infection have continued to climb worldwide. This led to the establishment of a third dose vaccination in several countries, known as a booster. To date, there has been little real-world data about the immunological effect of this strategy.
MethodsWe compared the humoral and cellular immune responses before and after the third dose of the BioNTech/Pfizer vaccine BNT162b2, following different prime-boost regimens. Humoral immunity was assessed by determining anti-SARS-CoV-2 binding antibodies using a standardized quantitative assay. In addition, neutralizing antibodies were measured using a commercial surrogate ELISA assay. Interferon-gamma release was measured after stimulating blood cells with SARS-CoV-2-specific peptides using a commercial assay to evaluate the cellular immune response.
ResultsThe median antibody level increased significantly after the third dose, to 2663.1 BAU/ml vs. 101.4 BAU/ml before administration of the boosting dose (p < 0.001). This was also detected for neutralizing antibodies, with a binding inhibition of 99.68% ± 0.36% vs. 69.06% ± 19.88% after the second dose (p < 0.001).
96.3% of the participants showed a detectable T-cell response after the third dose, with a mean interferon-gamma level of 2207.07 mIU/ml ± 1905 mIU/ml.
ConclusionThis study detected a BMI-dependent increase after the third dose of BNT162b2 following different vaccination protocols, while all participants showed a significant increase in their immune response. This, in combination with the limited post-vaccination symptoms, underlines the potential beneficial effect of a BNT162b2 booster dose. | infectious diseases
10.1101/2022.03.10.22272097 | Time-Varying Death Risk After SARS-CoV-2-Infection in Swedish Long-Term Care Facilities | BackgroundThe case fatality rate of SARS-CoV-2 has been high among residents of long-term care (LTC) facilities. It is unknown whether there is also higher mortality after the first month from documented infection.
MethodsWe extended the follow-up period to 8 months of a previous, retrospective cohort study based on the Swedish Senior Alert register. 3731 LTC residents infected with SARS-CoV-2 were matched to 3731 uninfected controls using time-dependent propensity scores on age, sex, body mass index, health status, comorbidities, and prescription medication use. In a sensitivity analysis, residents were also matched on geographical region and time of Senior Alert registration.
ResultsMedian age was 87 years (65% women). Excess mortality was highest 5 days after documented infection (hazard ratio 19.1; 95% confidence interval [CI], 14.6-24.8); subsequently excess mortality decreased rapidly. After the second month, mortality rate became lower in infected residents than in controls. Median survival of uninfected controls was 577 days (1.6 years), much lower than national life expectancy in Sweden at age 87 (5.05 years in men, 6.07 years in women). During days 61-210 of follow-up, hazard ratio for death was 0.41 (95% CI, 0.34-0.50) (0.76 (95% CI, 0.62-0.93) in the sensitivity analysis).
ConclusionsNo excess mortality was observed in LTC residents who survived acute SARS-CoV-2 infection (the first month). Life expectancy of uninfected residents was much lower than that of the general population of same age and sex. This difference should be taken into account in calculations of years of life lost among LTC residents.
Key messages
- SARS-CoV-2 infection sharply increased mortality risk among residents of long-term care (LTC) facilities in the first month.
- The mortality risk in infected residents of LTC facilities rapidly returned to baseline and dropped below the mortality risk of uninfected controls after the first month.
- The mortality rate remained lower in infected residents than in uninfected controls for 8 months of follow-up.
- No excess mortality was observed in LTC residents who survived the acute SARS-CoV-2 infection.
- Loss of life-years in infected residents of LTC facilities is much lower than suggested by age- and sex-specific general population life tables. | infectious diseases
10.1101/2022.03.10.22272203 | Arterial Oxygen and Carbon Dioxide Tension and Acute Brain Injury in Extracorporeal Cardiopulmonary Resuscitation Patients: Analysis of the Extracorporeal Life Support Organization Registry | ObjectiveAcute brain injury (ABI) remains common after extracorporeal cardiopulmonary resuscitation. Using a large international multicenter cohort, we investigated the impact of peri-cannulation arterial oxygen tension (PaO2) and carbon dioxide tension (PaCO2) on ABI occurrence.
DesignRetrospective cohort study.
SettingData in the Extracorporeal Life Support Organization Registry from 2009 to 2020.
PatientsAdult patients ([≥]18 years old) who underwent extracorporeal cardiopulmonary resuscitation.
InterventionsNone.
Measurements and Main ResultsOf 3,125 patients with extracorporeal cardiopulmonary resuscitation (median age=58, 69% male), 488 (16%) experienced at least one form of acute brain injury, which included ischemic stroke, intracranial hemorrhage, seizures, and brain death. 217 (7%) experienced ischemic stroke and 88 (3%) experienced intracranial hemorrhage. The registry collects blood gas data at two time points: pre- (6 hours before) and on- (24 hours after) extracorporeal membrane oxygenation (ECMO) cannulation. Blood gas parameters were classified as: hypoxia (<60 mmHg), normoxia (60-119 mmHg), and mild (120-199 mmHg), moderate (200-299 mmHg), and severe hyperoxia (≥300 mmHg); hypocarbia (<35 mmHg), normocarbia (35-44 mmHg), and mild (45-54 mmHg) and severe hypercarbia (≥55 mmHg). In multivariable logistic regression analysis, pre-ECMO hypoxia (aOR=1.46, 95%CI: 1.03-2.08, p=0.04) and on-ECMO severe hyperoxia (aOR=1.55, 95%CI: 1.02-2.36, p=0.04) were associated with composite ABI. Also, on-ECMO severe hyperoxia was associated with intracranial hemorrhage (aOR=1.88, 95%CI: 1.02-3.47, p=0.04) and in-hospital mortality (aOR=3.51, 95%CI: 1.98-6.22, p<0.001). Pre- and on-ECMO PaCO2 levels were not significantly associated with composite ABI or mortality, though mild hypercarbia pre- and on-ECMO was protective against ischemic stroke and intracranial hemorrhage, respectively.
ConclusionsEarly severe hyperoxia (≥300 mmHg) on ECMO was a significant risk factor for acute brain injury and mortality in patients undergoing extracorporeal cardiopulmonary resuscitation. Careful consideration should be given to early oxygen delivery in ECPR patients who are at risk of reperfusion injury. | cardiovascular medicine
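The blood gas cutoffs reported in the registry analysis above map directly onto categorization functions; a sketch using only the thresholds stated in the abstract (function names are illustrative, not from the study):

```python
def classify_pao2(pao2_mmhg):
    """Classify arterial oxygen tension (mmHg) using the study's cutoffs."""
    if pao2_mmhg < 60:
        return "hypoxia"
    if pao2_mmhg < 120:
        return "normoxia"
    if pao2_mmhg < 200:
        return "mild hyperoxia"
    if pao2_mmhg < 300:
        return "moderate hyperoxia"
    return "severe hyperoxia"  # >=300 mmHg

def classify_paco2(paco2_mmhg):
    """Classify arterial carbon dioxide tension (mmHg) using the study's cutoffs."""
    if paco2_mmhg < 35:
        return "hypocarbia"
    if paco2_mmhg < 45:
        return "normocarbia"
    if paco2_mmhg < 55:
        return "mild hypercarbia"
    return "severe hypercarbia"  # >=55 mmHg
```

Encoding the bins this way makes the boundary conventions explicit, e.g. a PaO2 of exactly 300 mmHg falls in the severe hyperoxia category associated with ABI and mortality.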
10.1101/2022.03.10.22272207 | Correlates of circulating extracellular vesicle cargo with key clinical features of type 1 diabetes | Type 1 diabetes (T1D) is a heterogeneous disease with a slower evolution in individuals diagnosed at older ages. There are no validated clinical or laboratory biomarkers to predict the rate of insulin secretion decline either before or after the clinical onset of the disease, or the rate of progression to chronic complications of the disease. This pilot study aimed to characterize the proteomic and phosphoproteomic landscape of circulating extracellular vesicles (EVs) across a range of obesity in carefully matched established T1D and control subjects. We used archived serum samples from 17 human subjects (N=10 with T1D and N=7 normal healthy volunteers) from the ACME study (NCT03379792). EVs were isolated using EVtrap® technology (Tymora). Mass spectrometry-based methods were used to detect the global circulating EV proteome and phosphoproteome. Differential expression, coexpression network (WGCNA), and pathway enrichment analyses were implemented. The detected proteins and phosphoproteins were highly enriched (75%) in exosomal proteins cataloged in the ExoCarta database. A total of 181 differentially expressed EV proteins and 15 differentially expressed EV phosphoproteins were identified, with 8 upregulated EV proteins (i.e., CD63, RAB14, VCP, BSG, FLNA, GNAI2, LAMP2, and EZR) and 1 downregulated EV phosphoprotein (i.e., TUBA1B) listed among the top 100 ExoCarta proteins. This suggests that T1D could indeed modulate EV biogenesis and secretion. Enrichment analyses of both differentially expressed EV proteins and EV phosphoproteins identified associations of upregulated features with neutrophil, platelet, and immune response functions, as well as prion disease and other neurodegenerative diseases, among others.
On the other hand, downregulated EV proteins were involved in MHC class II signaling and the regulation of monocyte differentiation. Potential novel key roles in T1D for C1q, plasminogen, IL6ST, CD40, HLA-DQB1, and phosphorylated S100A9, are highlighted. Remarkably, WGCNA uncovered two protein modules significantly associated with pancreas size, which may be implicated in the pathogenesis of T1D. Similarly, these modules showed significant enrichment for membrane compartments, processes associated with inflammation and the immune response, and regulation of viral processes, among others. This study demonstrates the potential of EV proteomic and phosphoproteomic signatures to provide insight into the pathobiology of type 1 diabetes and its complications. | endocrinology |
10.1101/2022.03.11.22272268 | Immortal time bias in older vs younger age groups: a simulation study with application to a population-based cohort of patients with colon cancer | BackgroundIn observational studies, the risk of immortal time bias (ITB) increases with the likelihood of early death, itself increasing with age. We investigated how age impacts the magnitude of ITB when estimating the effect of surgery on one-year overall survival (OS) in patients with stage IV colon cancer aged 50-74 and 75-99 in England.
MethodUsing simulations, we compared estimates from a time-fixed exposure model to three methods addressing ITB: time-varying exposure, delayed entry, and landmark methods. We then estimated the effect of surgery on one-year OS using a national population-based cohort of patients derived from the CORECT-R resource.
ResultsIn simulations, the magnitude of ITB was larger among older patients when their probability of early death increased or treatment was delayed. The bias was corrected using the methods addressing ITB. When applied to CORECT-R data, these methods yielded smaller effects than the time-fixed exposure approach but effects were similar in both age groups.
ConclusionITB must be addressed in all longitudinal studies, particularly when investigating the effect of an exposure on an outcome in different groups of people (e.g., age groups) with different distributions of exposure and outcomes. | epidemiology
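The mechanism of ITB can be illustrated with a toy simulation: give everyone the same exponential survival, assign "treatment" at a delayed time only to patients still alive then, and a time-fixed comparison makes the null treatment look strongly protective, while a landmark analysis removes the bias (a simplified mean-survival sketch, not the time-varying, delayed-entry, or landmark Cox models used in the study):

```python
import random
from statistics import mean

def simulate_itb(n=20000, rate=0.2, treat_time=2.0, seed=1):
    """Survival times are Exp(rate) for everyone; 'treatment' is assigned
    at treat_time, with probability 0.5, only to patients still alive then,
    so treatment has NO causal effect on survival."""
    rng = random.Random(seed)
    times, treated = [], []
    for _ in range(n):
        t = rng.expovariate(rate)
        times.append(t)
        treated.append(t > treat_time and rng.random() < 0.5)
    # Time-fixed contrast: treated patients were guaranteed to survive
    # past treat_time (immortal time), so this difference is biased upward.
    naive = (mean(t for t, tr in zip(times, treated) if tr)
             - mean(t for t, tr in zip(times, treated) if not tr))
    # Landmark analysis: compare only patients still alive at the landmark.
    alive = [(t, tr) for t, tr in zip(times, treated) if t > treat_time]
    landmark = (mean(t for t, tr in alive if tr)
                - mean(t for t, tr in alive if not tr))
    return naive, landmark
```

With these parameters the naive difference is around three years of spurious "benefit", while the landmark difference is close to zero, mirroring why the time-fixed exposure model above yields larger surgery effects than the corrected methods.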
10.1101/2022.03.11.22272248 | The Microbiome in Adult Acute Appendicitis | Acute appendicitis is a common acute surgical emergency; however, the pathogenesis of adult appendicitis remains poorly understood. The microbiome is increasingly thought to play a key role in inflammatory disease of the bowel and similarly, may play a role in appendicitis.
AimThis study aimed to characterize the microbiome of adult acute appendicitis in a prospective cohort.
MethodWe recruited 60 adults with acute appendicitis and 20 healthy controls. Rectal swabs were taken from each patient. After DNA extraction, 16S rRNA amplicon sequencing was carried out for analysis of diversity and taxonomic abundance.
ResultsPhylogenetic sequencing of the samples indicated that there is a difference between the microbial composition of those with acute appendicitis and healthy controls, with a statistically significant decrease in alpha diversity in rectal swabs of appendicitis patients compared to healthy controls. At the genus level, we saw an increased abundance of potential pathogens, e.g. Parvimonas and Acinetobacter, and a decrease in commensal taxa such as Faecalibacterium, Blautia and Lachnospiraceae in appendicitis patients compared to healthy controls.
ConclusionAn imbalance in the gut microbiome may contribute to the pathogenesis of adult acute appendicitis. | gastroenterology |
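The drop in alpha diversity reported in the appendicitis cohort above is typically quantified with an index such as Shannon's H; a minimal sketch from raw taxon counts (illustrative only; the study used standard 16S rRNA amplicon analysis tooling):

```python
from math import log

def shannon_index(counts):
    """Shannon diversity H = -sum(p_i * ln p_i) over taxa with nonzero counts."""
    total = sum(counts)
    props = [c / total for c in counts if c > 0]
    return -sum(p * log(p) for p in props)
```

Four equally abundant taxa give H = ln(4) ≈ 1.386, while a community dominated by a single taxon scores near zero, which is the direction of change observed in the appendicitis patients.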
10.1101/2022.03.10.22272213 | Cov2clusters: genomic clustering of SARS-CoV-2 sequences | BackgroundThe COVID-19 pandemic remains a global public health concern. Advances in sequencing technologies have allowed for large volumes of SARS-CoV-2 whole genome sequence (WGS) data and rapid sharing of sequences through global repositories to enable almost real-time genomic analysis of the pathogen. WGS data has been used previously to group genetically similar viral pathogens to reveal evidence of transmission, including methods that identify distinct clusters on a phylogenetic tree. Identifying clusters of linked cases can aid in the regional surveillance and management of the disease. In this study, we present a novel method for producing stable genomic clusters of SARS-CoV-2 cases, cov2clusters, and compare the sensitivity and stability of our approach to previous methods used for phylogenetic clustering using real-world SARS-CoV-2 sequence data obtained from British Columbia, Canada.
ResultsWe found that cov2clusters produced more stable clusters than previously used phylogenetic clustering methods when adding sequence data through time, mimicking an increase in sequence data through the pandemic. Our method also showed high sensitivity when compared to epidemiologically informed clusters.
ConclusionsOur new approach allows for the identification of stable clusters of SARS-CoV-2 from WGS data. Producing high-resolution SARS-CoV-2 clusters from sequence data alone can be a challenge and, where possible, both genomic and epidemiological data should be used in combination. | public and global health
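Genomic clustering of the general kind described here can be sketched as single-linkage grouping of sequences within a pairwise SNP-distance threshold (note: this union-find sketch is a generic baseline, not the cov2clusters algorithm itself):

```python
def single_linkage_clusters(distances, threshold):
    """Cluster sequence IDs so that any pair with SNP distance <= threshold
    ends up in the same cluster (single linkage via union-find).
    `distances` maps frozenset({id_a, id_b}) -> pairwise SNP distance."""
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    for pair, dist in distances.items():
        a, b = tuple(pair)
        ra, rb = find(a), find(b)
        if dist <= threshold and ra != rb:
            parent[ra] = rb  # merge the two components
    clusters = {}
    for node in list(parent):
        clusters.setdefault(find(node), set()).add(node)
    return list(clusters.values())
```

Because single-linkage clusters can merge or split as new sequences and thresholds change, threshold-based schemes like this can be unstable over time, which is the stability problem motivating approaches such as cov2clusters.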
10.1101/2022.03.11.22272255 | Comparison of radiation dose and cancer risk between conventional polychromatic imaging and gemstone spectral imaging in computed tomography (CT) examination: a phantom study | This study aims to measure the radiation doses and estimate the associated lifetime attributable risks (LARs) of cancer incidence resulting from gemstone spectral imaging (GSI) and conventional polychromatic imaging (CPI) in computed tomography (CT) examinations. Organ doses were measured using an adult phantom and thermoluminescent dosimeters. Four scans, including head CT, thorax CT, abdomen CT and pelvis CT, were performed on the phantom. LARs of cancer incidence were estimated for the Chinese population and the US population. The effective dose of thorax CT was the highest. With CPI, it was 7.94 mSv for females, and 7.74 mSv for males. With GSI, it was 6.81 mSv for females, and 7.69 mSv for males. With GSI, the corresponding LARs for males and females aged 20-70 years were less than 0.062% and 0.154%, respectively. With CPI, these values were less than 0.074% and less than 0.171%, respectively. The LARs decreased in both the US population and the Chinese population as age at exposure increased, with the thorax scan causing the highest risk. GSI does not increase radiation exposure or cancer risk compared to CPI. These findings may support the application of GSI in patients referred for CT. | radiology and imaging
10.1101/2022.03.10.22272123 | Polymorphism in IFNAR contributes to glucocorticoid response and outcome in ARDS and COVID-19 | The use of glucocorticoids has given contradictory results for treating acute respiratory distress syndrome (ARDS). Here we report a novel disease association of the SNP rs9984273, which is situated in the interferon alpha/beta receptor (IFNAR2) gene in an area corresponding to a binding motif of the glucocorticoid receptor (GR). The minor allele of SNP rs9984273 is associated with higher IFNAR expression, lower IFN-gamma and IL-6 levels, a less severe form of coronavirus disease (COVID-19) according to the COVID-19 Host Genetics Initiative database, and better outcomes in interferon (IFN) beta-treated patients with ARDS. Thus, the distribution of this SNP within clinical study arms may explain the contradictory results of multiple ARDS studies and outcomes in COVID-19 concerning type I IFN signalling and glucocorticoids.
One-Sentence SummaryA single nucleotide polymorphism in the interferon receptor contributes to corticosteroid response and outcome in ARDS and COVID-19 | respiratory medicine
10.1101/2022.03.10.22272186 | Smoking, e-cigarettes and the effect on respiratory symptoms among a population sample of youth: retrospective study. | IntroductionE-cigarettes have been steadily increasing in popularity, both as cessation methods for smoking and for recreational and social reasons. This increase in vaping may pose cardiovascular and respiratory risks. We aimed to assess respiratory symptoms in youth users of e-cigarettes and cigarettes.
MethodsA cross-sectional survey design was utilized to assess Canadian youth aged 16-25 years old. Participants were recruited from the Ontario Tobacco Research Unit Youth and Young Adult Research Registration Panel from November 2020 to March 2021. A total of 3,082 subjects completed the baseline survey. Of these, 2,660 individuals who did not have asthma were included in the analysis. The exposure of interest was vaping dose in pack-equivalent years, a measure analogous to cigarette pack-years that incorporates the number of puffs per day, number of days vaped per month, and number of years vaped. Respiratory symptoms were measured using the five-item Canadian Lung Health Test. Poisson regression analyses were performed while adjusting for demographic confounders, stratified by smoking status. A non-stratified model tested the interaction of smoking status and vaping dose, and the effect of the vaping device used was assessed among ever vapers. Analyses controlled for demographic characteristics, use of cannabis and alcohol, and survey date.
ResultsEach additional puff year increased the rate ratio of respiratory symptoms by a factor of 11.36 (95% CI: 4.61-28.00; p<0.001) for never smokers, but among current daily smokers higher pack-equivalent years were not associated with more respiratory symptoms (0.83; 95% CI: 0.23, 3.11). Among current vapers, those using pod-style devices were more likely to have more respiratory symptoms (1.25; 95% CI: 1.08, 1.45) after adjusting for dose.
ConclusionsVaping is associated with an increased risk of reporting respiratory symptoms among never smoking youth and non-daily ever cigarette smokers. Use of e-cigarettes among non-smokers should be discouraged.
FundingCanadian Institutes of Health Research | respiratory medicine |
10.1101/2022.03.10.22272242 | JAK signaling was involved in the pathogenesis of Polymyalgia Rheumatica | ObjectivesPolymyalgia Rheumatica (PMR) is a common inflammatory disease in elderly persons whose pathogenesis is unclear. We aimed to explore the pathogenetic features of PMR and find a new therapeutic strategy.
MethodsWe included 11 patients with PMR and 20 age-matched and sex-matched healthy controls (HC) in this study. The disease features were described. The gene expression profiles were analyzed in peripheral blood mononuclear cells (PBMCs) by RNA sequencing and were confirmed by RT-PCR. We also tested gene expression profiles in five patients with PMR after tofacitinib therapy.
ResultsPatients with PMR experienced pain with high disease activity scores. By RNA sequencing, the gene expression profile of PBMCs in patients with PMR differed from that in HC. GO and KEGG analyses demonstrated that inflammatory response and cytokine-cytokine receptor interaction were the most prominent pathways. Expression of IL6R, IL1B, IL1R1, JAK2, TLR2, TLR4, TLR8, CCR1, CR1, S100A8, S100A12, and IL17RA was markedly increased. These genes may trigger JAK signaling. Furthermore, tofacitinib, a pan-JAK inhibitor, effectively treated five patients with PMR, leading to clinical remission and a significant decrease in inflammatory genes.
ConclusionsMany inflammatory genes associated with JAK signaling were increased in patients with PMR, suggesting an important role of JAK signaling in PMR disease development. JAK inhibitors may effectively treat PMR.
Key messages: (1) Patients with PMR had significant inflammatory gene expression; JAK signaling may be highly activated. (2) Tofacitinib may treat PMR with clinical remission and a significant decrease in inflammatory genes. | rheumatology
10.1101/2022.03.10.22272227 | Stable Sparse Classifiers predict cognitive impairment from gait patterns | BackgroundAlthough gait pattern disturbances are known to be related to cognitive decline, there is no consensus on the possibility of predicting one from the other. It is necessary to find the optimal gait features, experimental protocols, and computational algorithms to achieve this purpose.
PurposesTo assess the efficacy of the Stable Sparse Classifiers procedure (SSC) for discriminating young and older adults, as well as healthy and cognitively impaired elderly groups from their gait patterns. To identify the walking tasks or combinations of tasks and specific spatio-temporal gait features (STGF) that allow the best prediction with SSC.
MethodsA sample of 125 participants (40 young and 85 older adults) was studied. They underwent assessment with five neuropsychological tests that explore different cognitive domains. A summarized cognitive index (MDCog), based on the Mahalanobis distance from normative data, was calculated. The sample was divided into three groups (young adults, healthy elderly adults, and cognitively impaired elderly adults) using k-means clustering of MDCog in addition to age. The participants executed four walking tasks (normal, fast, easy and hard dual tasks) and their gait patterns, measured with a body-fixed Inertial Measurement Unit, were used to calculate 16 STGF and dual-task costs. SSC was then employed to predict which group the participants belonged to. Classification performance was assessed using the area under the receiver operating characteristic curve (AUC). The set of STGF features and tasks producing the most accurate classifications was identified.
ResultsThe comparison between the three groups revealed significant differences for all STGF in all tasks, while the global AUC of the classification using SSC was 0.87. The classification between the groups of elderly people revealed that the combination of the easy dual-task and the fast walking task had the best prediction performance (AUC = 0.86). Gait variability in step and stride time and the RMS value of vertical acceleration were the features with the largest predictive power. SSC prediction accuracy was better than the accuracies obtained with linear discriminant analysis and support vector machine classifiers.
ConclusionsThe study corroborated that the changes in gait patterns can be used to discriminate between young and older adults and more importantly between healthy and cognitively impaired adults. A subset of gait tasks and STGF optimal for achieving this goal with SSC were identified, with the latter method superior to other classification techniques. | neurology |
10.1101/2022.03.10.22272228 | Establishing a baseline for human cortical folding morphological variables: a multicenter study | Differences in the way human cerebral cortices fold have been correlated to health, disease, development, and aging. But to obtain a deeper understanding of the mechanisms that generate such differences, it is useful to derive one's morphometric variables from first principles. This work explores one such set of variables that arise naturally from a model for universal self-similar cortical folding that was validated on comparative neuroanatomical data. We aim to establish a baseline for these variables across the human lifespan using a heterogeneous compilation of cross-sectional datasets, as the first step to extend the model to incorporate the time evolution of brain morphology. We extracted the morphological features from structural MRI of 3650 subjects: 3095 healthy controls (CTL) and 555 Alzheimer's Disease (AD) patients from 9 datasets, which were harmonized with a straightforward procedure to reduce the uncertainty due to heterogeneous acquisition and processing. The unprecedented possibility of analyzing such a large number of subjects in this framework allowed us to compare CTL and AD subjects' lifespan trajectories, testing whether AD is a form of accelerated aging at the brain structural level. After validating this baseline from development to aging, we estimate the variables' uncertainties and show that Alzheimer's Disease is similar to premature aging when measuring global and local degeneration. This new methodology may allow future studies to explore the structural transition between healthy and pathological aging and may be essential to generate data for cortical folding process simulations.
Significance StatementUnderstanding cortical folding is of increasing interest in the neurosciences, as it has been used to discriminate disease in humans while integrating knowledge from comparative neuroanatomy and neuroproliferation programs. Here we propose estimating the baseline of cortical folding variables from multi-site MRI human images, evaluating the changing rate of its independent variables through the human lifespan, and proposing a simple harmonization procedure to combine multicentric datasets. Finally, we present a practical application of these techniques comparing Alzheimer's Disease and Cognitively Unimpaired Controls based on the estimated changing rates.
Highlights: (1) Baseline of independent cortical folding variables from 3650 multi-site human MRI. (2) Propose a simple harmonization procedure to combine multicentric datasets. (3) Evaluate the changing rate of independent variables through the human lifespan. (4) Practical application comparing Alzheimer's Disease and Controls rates. | neurology
10.1101/2022.03.11.22272240 | Rates of regional tau accumulation in ageing and across the Alzheimer's disease continuum: An AIBL 18F-MK6240 PET study | BackgroundTau PET imaging enables prospective longitudinal observation of the rate and location of tau accumulation in Alzheimer's disease (AD). 18F-MK6240 is a newer, high-affinity tracer for the paired helical filaments of tau in AD. It is widely used in clinical trials, despite sparse longitudinal natural history data. We aimed to evaluate the impact of disease stage, and of two reference regions, on the magnitude and effect size of regional change.
MethodsOne hundred and fifty-eight participants: 83 cognitively unimpaired (CU) A{beta}-, 37 CU A{beta}+, 19 mild cognitively impaired (MCI) A{beta}+ and 19 AD A{beta}+ had annual 18F-MK6240 PET for one or two years (mean 1.6 years). Standardized uptake value ratios (SUVR) were generated for three in-house composite ROI: mesial temporal (Me), temporoparietal (Te), and rest of neocortex (R), and a Free-Surfer derived meta-temporal (MT) ROI. Two reference regions were examined: cerebellar cortex (SUVRCb) and eroded subcortical white matter (SUVRWM).
ResultsLow rates of accumulation were seen in CU A{beta}-, predominantly in the mesial temporal lobe (MTL). In CU A{beta}+, the increase was greatest in the MTL, particularly the amygdala. In MCI A{beta}+, a similar increase was seen in the MTL, but also globally in the cortex. In AD A{beta}+, the greatest increase was in temporoparietal and frontal regions, with a decrease in the MTL. In CU and MCI, increases were greater using SUVRWM. In AD, SUVRCb showed a marginally greater increase. Interpolation of the data suggests it takes approximately two decades to accumulate tau to the typical levels found in AD, similar to the rates of accumulation of A{beta} plaques.
ConclusionsThe rate of tau accumulation varies according to brain region and baseline disease stage, confirming previous reports. The PET measured change is greater, with fewer outliers, using an eroded white matter reference region, except in AD. While the eroded subcortical white matter reference may be preferred for trials in preclinical AD, the cerebellar cortex would be preferred for trials in symptomatic AD.
Trial registrationNot applicable. | neurology |
10.1101/2022.03.10.22272231 | How often do cancer researchers make their data and code available and what factors are associated with sharing? | BackgroundVarious stakeholders are calling for increased availability of data and code from cancer research. However, it is unclear how commonly these products are shared, and what factors are associated with sharing. Our objective was to evaluate how frequently oncology researchers make data and code available, and explore factors associated with sharing.
MethodsA cross-sectional analysis of a random sample of 306 articles indexed in PubMed in 2019 presenting original cancer research was performed. Outcomes of interest included the prevalence of affirmative sharing declarations and the rate with which declarations connected to useable data. We also investigated associations between sharing rates and several journal characteristics (e.g., sharing policies, publication models), study characteristics (e.g., cancer rarity, study design), open science practices (e.g., pre-registration, pre-printing) and citation rates between 2020-2021.
ResultsOne in five studies declared data were publicly available (95% CI: 15-24%). However, when actual data availability was investigated this percentage dropped to 16% (95% CI: 12-20%), and then to less than 1% (95% CI: 0-2%) when data were checked for compliance with key FAIR principles. While only 4% of articles that used inferential statistics reported code to be available (10/274, 95% CI: 2-6%), the odds of reporting code to be available were 5.6 times higher for researchers who shared data. Compliance with mandatory data and code sharing policies was observed in 48% and 0% of articles, respectively. However, 88% of articles included data availability statements when required. Policies that encouraged data sharing did not appear to be any more effective than not having a policy at all. The only factors associated with higher rates of data sharing were studying rare cancers and using publicly available data to complement original research.
ConclusionsData and code sharing in oncology occurs infrequently, and at a lower frequency than would be expected given journal policies, owing to non-compliance with those policies. There is also a large gap between those declaring data to be available and those archiving data in a way that facilitates its reuse. We encourage journals to actively check compliance with sharing policies, and researchers to consult community-accepted guidelines when archiving the products of their research. | oncology
10.1101/2022.03.10.22272217 | Malaria burden and associated risk factors in an area of pyrethroid-resistant vectors in southern Benin | Malaria remains the main cause of morbidity and mortality in Benin despite the scale-up of long-lasting insecticidal nets (LLINs), indoor residual spraying, and malaria case management. This study aimed to determine the malaria burden and its associated risk factors in a rural area of Benin characterized by high net coverage and pyrethroid-resistant mosquito vectors. A community-based cross-sectional survey was conducted in three districts in southern Benin. Approximately 4,320 randomly selected participants of all ages were tested for malaria using rapid diagnostic tests within 60 clusters. Risk factors for malaria infection were evaluated using mixed-effect logistic regression models. Despite high population net use (96%), malaria infection prevalence was 43.5% (cluster range: 15.1-72.7%). Children (58.7%) were more likely to be infected than adults (31.2%), with a higher malaria prevalence among older children (5-10 years: 69.1%; 10-15 years: 67.9%) compared to young children (<5 years: 42.1%); however, young children were more likely to be symptomatic. High household density, low socioeconomic status, young age (<15 years), poor net conditions, and low net usage during the previous week were significantly associated with malaria infection. Malaria prevalence remains high in this area of intense pyrethroid resistance despite high net use. New classes of LLINs effective against resistant vectors are therefore crucial to further reduce malaria in this area. | infectious diseases
10.1101/2022.03.10.22272237 | Long COVID in Children and Adolescents: A Systematic Review and Meta-analyses. | ObjectiveTo estimate the prevalence of long COVID in children and adolescents and identify the full spectrum of signs and symptoms present after acute SARS-CoV-2 infection.
MethodsTwo independent investigators searched PubMed and Embase in order to identify observational studies that met the following criteria: 1) a minimum of 30 patients, 2) ages ranging from 0 to 18 years, 3) published in English, 4) published before February 10th, 2022, and 5) meeting the National Institute for Health and Care Excellence (NICE) definition of long COVID, which consists of both ongoing (4 to 12 weeks) and post-COVID-19 ([≥]12 weeks) symptoms. For COVID symptoms reported in two or more studies, random-effects meta-analyses were performed using the MetaXL software to estimate the pooled prevalence, and Review Manager (RevMan) software 5.4 was utilized to estimate Odds Ratios (ORs) with 95% confidence intervals (CI). Heterogeneity was assessed using the I2 statistic. The Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) reporting guideline was followed (registration PROSPERO CRD42021275408).
ResultsThe literature search yielded 68 articles for long COVID in children and adolescents. After screening, 21 studies met the inclusion criteria and were included in the systematic review and meta-analyses. A total of 80,071 children and adolescents with COVID-19 were included. The prevalence of long COVID was 25.24% (95% CI 18.17-33.02), and the most prevalent clinical manifestations were mood symptoms (16.50%; 95% CI 7.37-28.15), fatigue (9.66%; 95% CI 4.45-16.46), and sleep disorders (8.42%; 95% CI 3.41-15.20). When compared to controls, children infected by SARS-CoV-2 had a higher risk of persistent dyspnea (OR 2.69 95%CI 2.30-3.14), anosmia/ageusia (OR 10.68, 95%CI 2.48, 46.03), and/or fever (OR 2.23, 95%CI 1.22-4.07). The main limitation of these meta-analyses is the probability of bias, which includes lack of standardized definitions, recall, selection, misclassification, nonresponse and/or loss of follow-up, and the high level of heterogeneity.
ConclusionThese meta-analyses provide an overview of the broad symptomatology of long COVID in minors, which may help improve management, rehabilitation programs, and future development of guidelines and therapeutic research for COVID-19. | infectious diseases |
10.1101/2022.03.10.22272182 | Positive association between country-level screening rates for chlamydia and gonorrhea and self-reported diagnoses in MSM: an ecological analysis | We assessed if there was a country-level association between intensity of screening MSM for Neisseria gonorrhoeae and Chlamydia trachomatis and self-reported incidence of these infections in 2010 and 2017. Unexpectedly, we found that screening intensity was associated with a higher incidence of these infections.
Short summaryWe found a positive association between intensity of screening for chlamydia and gonorrhoea and the self-reported incidence of these infections in MSM in European countries. | infectious diseases |
10.1101/2022.03.11.22272244 | Periodontopathogens in oral cancer: a meta-analysis of bacterial taxa of the oral microbiome associated with risk factors of oral squamous cell carcinoma | This study aims to investigate the distribution of microbial taxa that are present in abundance in the oral cavity of patients diagnosed with Oral Squamous Cell Carcinoma (OSCC). We began with a search for relevant literature on the OSCC microbiome in electronic databases (PubMed and Google Scholar). From the identified literature, studies were considered for data extraction based on inclusion criteria according to PRISMA guidelines. From an initial 1,217 published studies, a total of 15 relevant studies were identified that fulfilled the inclusion criteria. These studies detected microbial taxa in the oral cavities of patients with OSCC in comparison with healthy controls to assess differential microbial abundance. The data from the selected studies provided evidence on microbial taxa in different anatomical sites of the oral cavity, i.e., the gingival region, tongue, buccal site, and floor of the mouth. The most common method for the detection of microbial flora in the literature was 16S rRNA sequencing. Only those studies that showed an association of risk factors (i.e., smoked and smokeless tobacco, betel quid, alcohol, and periodontitis) with OSCC were considered for further analysis. Risk factors in the resulting 6 studies showed a strong odds ratio (OR) with statistical significance (p-value <0.05). The calculated risk ratios (RR) of these risk factors also demonstrated substantial heterogeneity. These studies showed an increase in the abundance of periodontopathogens belonging to the genera Fusobacterium, Capnocytophaga, Prevotella, Parvimonas and Porphyromonas.
The microbial taxa associated in abundance with risk factors of OSCC, such as smoked or smokeless tobacco, betel quid and alcohol, were quite similar to the microbial taxa that cause periodontitis. The detection of abundant periodontopathogens in OSCC may thus provide a class of putative biomarkers for early stages of tumor development in individuals exposed to these risk factors. | dentistry and oral medicine
10.1101/2022.03.10.22272236 | EMS prehospital response to the COVID-19 pandemic in the US: A brief literature review | This study aimed to analyze the prehospital Emergency Medical Services (EMS) response to the COVID-19 pandemic in the US through a brief systematic review of available literature, in context with international prehospital counterparts. An exploration of the NCBI repository was performed using a search string of relevant keywords, which returned n=5128 results; articles that met the inclusion criteria (n=77) were reviewed and analyzed in accordance with PRISMA and PROSPERO recommendations. Methodological quality was assessed using critical appraisal tools, and Egger's test was used for risk of bias reduction upon linear regression analysis of a funnel plot. Sources of heterogeneity, as defined by P < 0.10 or I^2 > 50%, were interrogated. Findings were considered within ten domains: structural/systemic; clinical outcomes; clinical assessment; treatment; special populations; dispatch/activation; education; mental health; perspectives/experiences; and transport. Findings suggest EMS clinicians have likely made significant and unmeasured contributions to care during the pandemic via nontraditional roles, i.e., COVID-19 testing and vaccine deployment. EMS plays a critical role in counteracting the COVID-19 pandemic in addition to the worsening opioid epidemic, both of which disproportionately impact patients of color. As such, being uniquely influential on clinical outcomes, these providers may benefit from standardized education on care and access disparities related to factors such as racial identity. Access to distance-learning continuing education opportunities may increase rates of provider recertification. Additionally, there is a high prevalence of vaccine hesitancy among surveyed nationally registered EMS providers.
Continued rigorous investigation on the impact of COVID-19 on EMS systems and personnel is warranted to ensure informed preparation for future pandemic and infectious disease response. | emergency medicine |
10.1101/2022.03.10.22272234 | Inferring Insulin Secretion Rate From Sparse Patient Glucose and Insulin Measures | The insulin secretion rate (ISR) contains information that can provide a personal, quantitative understanding of endocrine function. The ISR, if it can be reliably inferred from sparse measurements, could be used both for understanding the source of a problem with the glucose regulation system and for monitoring people with diabetes.
ObjectiveThis study aims to develop a model-based method for inferring ISR and related physiological information among people with different glycemic conditions in a robust manner.
MethodsAn algorithm was developed for estimating and validating ISR for different compartmental models, with an unknown but estimable ISR function and absorption/decay rates describing the dynamics of insulin and C-peptide accumulation. The algorithm was applied to data from 17 subjects with normal glucose regulatory systems and 9 subjects with cystic fibrosis-related diabetes (CFRD) in whom glucose, insulin and C-peptide were measured in the course of oral glucose tolerance tests (OGTT).
ResultsThe model-based algorithm was able to successfully estimate ISR for a diverse set of patients. For CFRD subjects, due to their high maximum observed glucose values, we obtained better estimates of the degradation rates than for healthy subjects. Even though ISR and C-peptide secretion rate (CSR) have different physiological characteristics and nonlinear relations with plasma glucose concentration, our model-based algorithm reproduced the expected linear relationship between ISR and CSR. The estimated ISR can also differentiate normal and CFRD patients: the ISR for individuals with CFRD is substantially lower compared with the ISR for individuals with normal glucose regulatory systems.
SignificanceA new estimation method is available to infer the ISR profile, plasma insulin, and C-peptide absorption rates from sparse measurements of insulin, C-peptide, and plasma glucose concentrations. | endocrinology |
10.1101/2022.03.10.22272219 | Relative assessment of cloth mask protection against ballistic droplets: a frugal approach | During the COVID-19 pandemic, the relevance of evaluating the effectiveness of face masks -especially those made at home using a variety of materials- has become obvious. However, quantifying mask protection often requires sophisticated equipment. Using a frugal stain technique, here we quantify the "ballistic" droplets reaching a receptor from a jet-emitting source which mimics a coughing, sneezing or talking human -in real life, such droplets may host active SARS-CoV-2 virus able to replicate in the nasopharynx. We demonstrate that materials often used in home-made face masks block most of the droplets. Mimicking situations eventually found in daily life, we also show quantitatively that less liquid carried by ballistic droplets reaches a receptor when a blocking material is deployed near the source than when located near the receptor, which supports the paradigm that your face mask does protect you, but protects others even better than you. Finally, the blocking behavior can be quantitatively explained by a simple mechanical model. | epidemiology |
10.1101/2022.03.10.22272221 | Spatial patterns of excess mortality in the first year of the COVID-19 pandemic in Germany | In order to quantify the impact of the SARS-CoV-2/COVID-19 pandemic, several studies have estimated excess mortality rather than infections or COVID-19-related deaths. The current study investigates excess mortality in Germany in 2020 at a small-scale spatial level (400 counties) and under consideration of demographic changes. Mortality is operationalized using standardized mortality ratios (SMRs), visualized on maps, and analyzed descriptively. Regional mortality and COVID-19-related morbidity are tested for spatial dependence using Moran's I index. It is, furthermore, tested whether all-cause mortality is associated with COVID-19-related morbidity using correlation coefficients. Excess mortality only occurs in a minority of counties. There are large regional disparities in all-cause mortality and COVID-19-related morbidity. In older age groups, both indicators show spatial dependence. (Excess) mortality in older age groups is impacted by COVID-19, but this association is not found for young and middle age groups. | epidemiology
10.1101/2022.03.10.22272196 | Milder disease trajectory among COVID-19 patients hospitalised with the SARS-CoV-2 Omicron variant compared with the Delta variant in Norway | Using individual-level national registry data, we conducted a cohort study to estimate differences in the length of hospital stay, and risk of admission to an intensive care unit and in-hospital death among patients infected with the SARS-CoV-2 Omicron variant, compared to patients infected with Delta variant in Norway. We included 409 (38%) patients infected with Omicron and 666 (62%) infected with Delta who were hospitalised with COVID-19 as the main cause of hospitalisation between 6 December 2021 and 6 February 2022. Omicron patients had a 48% lower risk of intensive care admission (aHR: 0.52, 95%CI: 0.34-0.80) and a 56% lower risk of in-hospital death (aHR: 0.44, 95%CI: 0.24-0.79) compared to Delta patients. Omicron patients had a shorter length of stay (with or without ICU stay) compared to Delta patients in the age groups from 18-79 years and those who had at least completed their primary vaccination. This supports growing evidence of reduced disease severity among hospitalised Omicron patients compared with Delta patients. | epidemiology |
10.1101/2022.03.10.22272202 | Comorbidities in autism spectrum disorder and their etiologies | ImportancePerinatal exposures have been associated with autism spectrum disorder (ASD). It is unknown whether perinatal exposures are also associated with the burden of comorbidities in ASD across different medical domains.
ObjectiveTo examine the burden and pattern of comorbidities in individuals with ASD, and evaluate the associations between perinatal exposures linked with ASD and distinct comorbidities.
DesignWe used data from a family-based study (SPARK), which recruited families with one or more children with a clinically confirmed ASD diagnosis across clinical sites throughout the US.
SettingAll data in SPARK are collected remotely to allow participants to complete the study protocol at their convenience.
ParticipantsTo minimize recall bias of early-life exposures, we restricted the sample to individuals born between 1999 and 2019 who were less than 18 years of age at the time of registration into the study. The final analytic sample included 40,582 children with ASD and 11,389 non-ASD siblings.
ExposuresPerinatal exposures including preterm birth, prenatal infection, fetal alcohol syndrome, hypoxia at birth, bleeding into the brain during delivery, lead poisoning, brain infections, and traumatic brain injury.
Main Outcomes and MeasuresOutcomes were based on parent report of professional diagnosis of neurological, cognitive, psychiatric, and physical disorders.
ResultsChildren with ASD had a substantially higher prevalence of all comorbidities compared to their siblings without an ASD diagnosis (all p-values <0.05). ADHD was the most common comorbidity, affecting 1 in every 3 children with ASD. In logistic regression models adjusted for covariates (annual household income, father's and mother's education, parental ages at childbirth, year of birth, age of the child at evaluation, race, and ethnicity), different exposures were associated with distinct patterns of comorbidities in ASD cases, including associations between preterm birth and difficulty gaining weight (OR=2.38; 95%CI=2.09-2.71) and traumatic brain injury and seizure or epilepsy (OR=4.75; 95%CI=3.25-6.95). We observed a similar pattern of associations in non-ASD siblings.
Conclusions and RelevanceIndividuals with ASD experience a greater burden of perinatal exposures and comorbidities. The higher burden of comorbidities in this population could be partly attributable to the higher rates of perinatal exposures. Study findings, if replicated in other samples, can inform the etiology of comorbidity in ASD.
KEY POINTS: QuestionDoes the pattern of comorbidities in individuals with autism spectrum disorder vary by pre- and postnatal exposures?
FindingsThe burden of pre- and postnatal exposures and comorbidity in children with autism spectrum disorder (ASD) is significantly higher than in their non-ASD siblings. Distinct pre- and postnatal exposures were associated with different patterns of comorbidities.
MeaningChildren with ASD experience a significantly higher burden of comorbidities which may arise due to distinct etiological factors. | epidemiology |
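The odds ratios in the ASD entry above come from covariate-adjusted logistic models; as a reference point, an unadjusted OR with a 95% Wald confidence interval can be computed straight from a 2x2 table. The counts below are made up for illustration only.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio and 95% Wald CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts (the study's ORs are adjusted, so they would differ):
or_, lo, hi = odds_ratio_ci(40, 160, 20, 180)
```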
10.1101/2022.03.11.22272157 | Alcohol use and poor sleep quality: a longitudinal twin study across 36 years | Poor sleep is one of the multiple health issues associated with heavy alcohol consumption. While acute effects of alcohol intake on sleep have been widely investigated, the longitudinal associations remain relatively underexplored. The objective of our research was to shed light on cross-sectional and longitudinal associations between alcohol use and insomnia symptoms over time, and to elucidate the role of familial confounding factors in such associations. Using self-report questionnaire data from the Older Finnish Twin Cohort (N=13,851), we examined how alcohol consumption and binge drinking are associated with sleep quality during a period of 36 years. Cross-sectional logistic regression analyses revealed significant associations between poor sleep and alcohol misuse, including heavy and binge drinking, at all four time points (OR range =1.61-3.37, P <0.05), suggesting that higher alcohol intake is associated with poor sleep quality over the years. Longitudinal cross-lagged analyses indicated that moderate, heavy and binge drinking predict poor sleep quality (OR range =1.25-1.76, P <0.05), but not the reverse. Within-pair analyses suggested that the associations between heavy drinking and poor sleep quality were not fully explained by genetic and environmental influences shared by the co-twins. In conclusion, our findings support previous literature in that alcohol use is associated with poor sleep quality, such that alcohol use predicts poor sleep quality later in life, but not vice versa, and that the association is not fully explained by familial factors. | epidemiology |
10.1101/2022.03.10.22272235 | HIV testing uptake according to opt-in, opt-out or risk-based testing approaches: a systematic review and meta-analysis | Purpose of reviewImproving HIV testing uptake is essential to ending the HIV pandemic. HIV testing approaches can be opt-in, opt-out or risk-based. This systematic review examines and compares the uptake of HIV testing in opt-in, opt-out and risk-based testing approaches.
Recent findingsThere remain missed opportunities for HIV testing in a variety of settings using different approaches: opt-in (a person actively accepts to be tested for HIV), opt-out (a person is informed that HIV testing is routine/standard of care, and they actively decline if they do not wish to be tested for HIV) or risk-based (using risk-based screening tools to focus testing on certain individuals or sub-populations at greater risk of HIV). It is not clear how the approach could impact HIV test uptake when adjusted for other factors (e.g. rapid testing, country-income level, test setting and population tested).
SummaryWe searched four databases for studies reporting on HIV test uptake. In total, 18,238 records were screened, and 150 studies were included in the review. Most studies described an opt-in approach (87 estimates), followed by opt-out (76) and risk-based (19). Opt-out testing was associated with 64.3% test uptake (I2=99.9%), opt-in testing with 59.8% (I2=99.9%), and risk-based testing with 54.4% (I2=99.9%). When adjusted for settings that offered rapid testing, country income level, setting and population tested, opt-out testing had a significantly higher uptake (+12% (95% confidence intervals: 3-21), p=0.007) than opt-in testing. We also found that emergency department patients and hospital outpatients had significantly lower HIV test uptake than other populations. | public and global health |
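The I² values reported in the HIV-testing meta-analysis above are derived from Cochran's Q; Higgins' I² is the excess of Q over its degrees of freedom, expressed as a percentage of Q and truncated at zero. A minimal sketch (the Q and k values below are illustrative):

```python
def i_squared(q, k):
    """Higgins' I^2 (%) from Cochran's Q across k studies (df = k - 1).
    Negative values are truncated to 0 by convention."""
    df = k - 1
    return max(0.0, (q - df) / q) * 100 if q > 0 else 0.0

# Illustrative values: Q far above df -> high heterogeneity
high_het = i_squared(100, 11)  # 90.0
no_het = i_squared(10, 11)     # 0.0
```

I² near 99.9%, as in the entry above, means nearly all observed variability in uptake reflects between-study differences rather than sampling error.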
10.1101/2022.03.11.22272274 | 'We don't routinely check vaccination background in adults': a national qualitative study of barriers and facilitators to vaccine delivery and uptake in adult migrants through UK primary care | BackgroundThe COVID-19 pandemic has highlighted shortfalls in the delivery of vaccine programmes to some adult migrant groups; however, little is known around care pathways and engagement of these older cohorts in routine vaccinations in primary care, including catch-up programmes. Guidelines exist, but the extent to which they are put into practice and prioritised is unclear.
ObjectivesTo explore the views of primary care professionals around barriers and facilitators to catch-up vaccination in adult migrants (defined as foreign born; over 18 years) with incomplete or uncertain vaccination status and for routine vaccines to inform development of future interventions to improve vaccine uptake in this group and improve coverage.
DesignQualitative interview study with purposive sampling and thematic analysis
SettingUK primary care, 50 included practices.
Participants64 primary care professionals (PCPs): 48 clinical including GPs, Practice Nurses and healthcare assistants (HCAs); 16 administrative staff including practice managers and receptionists (mean age 45 years; 84.4% female; a range of ethnicities).
ResultsParticipants highlighted direct and indirect barriers to catch-up vaccines in adult migrants who may have missed vaccines as children, missed boosters, and not be aligned with the UK's vaccine schedule, from both a personal and service-delivery level, with themes including: lack of training and knowledge of guidance around catch-up vaccination among staff; unclear or incomplete vaccine records; and lack of incentivization (including financial reimbursement) and dedicated time and care pathways. Adult migrants were reported as being excluded from many vaccination initiatives, most of which focus exclusively on children. Where delivery models existed, they were diverse and fragmented but included a combination of opportunistic and proactive programmes. PCPs noted that migrants expressed to them a range of views around vaccines, from positivity to uncertainty, to refusal, with specific nationality groups reported as more hesitant to get vaccinated with specific vaccines, including MMR.
ConclusionsWHO's new Immunization Agenda (IA2030) has called for greater focus to be placed on delivering vaccination across the life-course, targeting under-immunised groups for catch-up vaccination at any age, with UK primary care services therefore having a key role to play. Vaccine uptake in adult migrants could be improved through implementing new financial incentives or inclusion of adult migrant vaccination targets in QOF, strengthening care pathways and training, and working directly with local community groups to improve understanding around the benefits of vaccination at all ages.
10.1101/2022.03.11.22271849 | Seasonality in living kidney donation in the United States from 1995-2019 | For nearly two decades, the annual number of US living kidney donors has been characterized by worrying patterns of decline and no factors have been identified to explain and reverse these patterns. Evidence suggests that there is seasonality in living kidney donation; herein we investigate whether potentially modifiable social, economic, and structural issues might explain this seasonality. Using donor-registry data from the Scientific Registry of Transplant Recipients, we described this seasonality in living kidney donation and used Poisson regression stratified by both donor-recipient biological relationship and estimated household income tertile to quantify these trends. In every decade from 1980-2020, there was a summer-only surge in living kidney donations (13%-25% for biologically related donors and 10%-17% for unrelated donors). This summer-only surge was evident for the months of June, July, and August when compared with January for each given year and statistically significant in some groups (range of incidence rate ratio [IRR] for related donors: 1.05-1.34; IRR for unrelated donors: 1.08-1.19). We observed this summer-only surge across all three income tertiles ($73,544+, $52,635- $73,544, and <$52,635) and regardless of donor-recipient relationship. Seasonal variation in donation is associated with structural factors, which may serve as potential targets for interventions to increase donation. | transplantation |
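The incidence rate ratios (IRRs) in the kidney-donation entry above compare monthly donation rates against January; an unadjusted IRR with a log-scale Wald CI can be sketched as below. The counts are hypothetical, and the study's IRRs come from Poisson models stratified by donor-recipient relationship and income tertile.

```python
import math

def irr_ci(events_a, py_a, events_b, py_b, z=1.96):
    """Incidence rate ratio (group a vs group b) with 95% CI on the log scale.
    events_*: event counts; py_*: person-time (or other exposure).
    SE of log(IRR) = sqrt(1/events_a + 1/events_b)."""
    irr = (events_a / py_a) / (events_b / py_b)
    se = math.sqrt(1 / events_a + 1 / events_b)
    return (irr,
            math.exp(math.log(irr) - z * se),
            math.exp(math.log(irr) + z * se))

# Hypothetical monthly donation counts with equal exposure time:
irr, lo, hi = irr_ci(120, 1.0, 100, 1.0)  # summer month vs January
```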
10.1101/2022.03.10.22272187 | Dementia Risk and Dynamic Response to Exercise: A non-randomized clinical trial | BackgroundPhysical exercise may support brain health and cognition over the course of typical aging. The goal of this nonrandomized clinical trial was to examine the effect of an acute bout of aerobic exercise on brain blood flow and blood neurotrophic factors associated with exercise response and brain function in older adults with and without possession of the APOE4 allele, a genetic risk factor for developing Alzheimer's disease. We hypothesized that older adult APOE4 carriers would have lower cerebral blood flow regulation and would demonstrate blunted neurotrophic response to exercise compared to noncarriers.
MethodsSixty-two older adults (73±5 years old, 41 female) consented to this prospectively enrolling clinical trial, utilizing a single arm, single visit, experimental design, with post-hoc assessment of difference in outcomes based on APOE4 carriership. All participants completed a single 15-minute bout of moderate-intensity aerobic exercise. The primary outcome measure was change in cerebral blood flow in cortical gray matter, measured by magnetic resonance imaging (MRI) arterial spin labeling (ASL), defined as the total perfusion (area under the curve, AUC) following exercise. Secondary outcomes were changes in blood neurotrophin concentrations of insulin-like growth factor-1 (IGF-1), vascular endothelial growth factor (VEGF), and brain derived neurotrophic factor (BDNF).
ResultsGenotyping failed in one individual (n=23 APOE4 carriers and n=38 APOE4 non-carriers) and two participants could not complete primary outcome testing. Cerebral blood flow AUC increased immediately following exercise, regardless of APOE4 carrier status. In exploratory regional analyses, we found that cerebral blood flow increased in hippocampal brain regions, while showing no change in cerebellar brain regions across both groups. Amid high interindividual variability, there were no significant changes in any of the 3 neurotrophic factors for either group immediately following exercise.
ConclusionsOur findings show that both APOE4 carriers and non-carriers show similar effects of exercise-induced increases in cerebral blood flow and neurotrophic response to acute aerobic exercise. Our results provide further evidence that acute exercise-induced increases in cerebral blood flow may be region specific, and that exercise-induced neurotrophin release may show a differential effect in the aging cardiovascular system. Results from this study build upon previous research in younger adults by providing an initial characterization of the acute brain blood flow and neurotrophin responses to a bout of exercise in older adults with and without this known risk allele for cardiovascular disease and Alzheimer's disease.
Trials registrationDementia Risk and Dynamic Response to Exercise (DYNAMIC); Identifier: NCT04009629
FundingThis study was funded by grants from the National Institutes of Health R21 AG061548, P30 AG072973 and P30 AG035982, and the Leo and Anne Albert Charitable Trust. The Hoglund Biomedical Imaging Center is supported by a generous gift from Forrest and Sally Hoglund and funding from the National Institutes of Health including S10 RR29577, and UL1 TR002366. | neurology
10.1101/2022.03.10.22272177 | The Omicron SARS-CoV-2 epidemic in England during February 2022 | BackgroundThe third wave of COVID-19 in England peaked in January 2022, resulting from the rapid transmission of the Omicron variant. However, rates of hospitalisations and deaths were substantially lower than in the first and second waves.
MethodsIn the REal-time Assessment of Community Transmission-1 (REACT-1) study we obtained data from a random sample of 94,950 participants with valid throat and nose swab results by RT-PCR during round 18 (8 February to 1 March 2022).
FindingsWe estimated a weighted mean SARS-CoV-2 prevalence of 2.88% (95% credible interval [CrI] 2.76-3.00), with a within-round reproduction number (R) overall of 0.94 (0.91-0.96). While within-round weighted prevalence fell among children (aged 5 to 17 years) and adults aged 18 to 54 years, we observed a level or increasing weighted prevalence among those aged 55 years and older with an R of 1.04 (1.00-1.09). Among 1,195 positive samples with sublineages determined, only one (0.1% [0.0-0.5]) corresponded to the AY.39 Delta sublineage and the remainder were Omicron: N=390, 32.7% (30.0-35.4) were BA.1; N=473, 39.6% (36.8-42.5) were BA.1.1; and N=331, 27.7% (25.2-30.4) were BA.2. We estimated an R additive advantage for BA.2 (vs BA.1 or BA.1.1) of 0.40 (0.36-0.43). The highest proportion of BA.2 among positives was found in London.
InterpretationIn February 2022, infection prevalence in England remained high with level or increasing rates of infection in older people and an uptick in hospitalisations. Ongoing surveillance of both survey and hospitalisations data is required.
FundingDepartment of Health and Social Care, England.
Research in context: Evidence before this studyA search of PubMed using title or abstract terms ("Omicron" or "BA.1" or "BA.2") and "prevalence" without language or other restrictions, identified 51 results (with no duplicates). All 51 results were evaluated, with 18 deemed relevant. One study focused on Omicron case rates in South Africa during the early stage after the discovery of the new variant (November 2021), one described genomic surveillance of SARS-CoV-2 in the USA (June - December 2021), one analysed clinical outcomes based on health records (January - December 2021), one described the results of whole-genome sequencing of SARS-CoV-2 samples collected in North Africa (March - December 2021), and one was from a previous REACT survey round (November - December 2021). The others focused on the mutation distribution of Omicron, disease severity, immune response, vaccine effectiveness, and prevalence in animal hosts.
Added value of this studyWe analysed data from throat and nose swabs collected at home by a randomly selected sample of residents of England, aged 5 years and older, obtained during round 18 (8 February to 1 March 2022) of the REal-time Assessment of Community Transmission-1 (REACT-1) study. We estimated a weighted prevalence of SARS-CoV-2 of 2.88% (95% CrI 2.76-3.00) in England in February 2022, which was substantially lower than that estimated in January 2022 (4.41% [4.25-4.56]). The within-round dynamics differed by age group with weighted prevalence falling among children (aged 5 to 17 years) with an R of 0.79 (0.74-0.84) and adults aged 18 to 54 years with an R of 0.92 (0.89-0.96), in contrast to the level or increasing weighted prevalence among those aged 55 years and older with an R of 1.04 (1.00-1.09). Exponential models estimated a daily growth rate advantage of 0.12 (0.11-0.13) in the odds of BA.2 (vs BA.1 or BA.1.1) corresponding to an R additive advantage of 0.40 (0.36-0.43).
Implications of all the available evidenceRandom community surveys of SARS-CoV-2 provide robust insights into transmission dynamics and identify groups at heightened risk of infection based on estimates of population prevalence that are unbiased by test-seeking behaviour or availability of tests. In England, replacement by BA.2 of other Omicron sublineages, the level or increasing rates of infection in older people and the uptick in hospitalisations in England toward the end of February 2022 require ongoing surveillance, both to monitor the levels of current (and future) SARS-CoV-2 variants and the risks of severe disease. | epidemiology
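The REACT-1 entry above converts a daily growth-rate advantage of 0.12 into an R additive advantage of 0.40; to first order, this is the growth-rate difference multiplied by the mean generation time. The ~3.3-day generation time below is an assumed value chosen for illustration, not one stated in the abstract.

```python
def r_additive_advantage(delta_r_daily, gen_time_days):
    """First-order approximation: Delta R ~= delta_r * T_g,
    valid when growth-rate differences are small."""
    return delta_r_daily * gen_time_days

# 0.12/day advantage, assumed ~3.3-day generation time -> ~0.40,
# in line with the additive advantage reported for BA.2 vs BA.1/BA.1.1.
adv = r_additive_advantage(0.12, 3.3)
```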
10.1101/2022.03.10.22272191 | Victimization and witnessing of workplace bullying and physician-diagnosed physical and mental health and organizational outcomes: a cross-sectional study of a nationally representative sample in Japan | BackgroundCompared to the numerous reports on mental health outcomes of workplace bullying victims, research on organizational outcomes of witnesses and physical health outcomes of victims and witnesses is scarce. Therefore, the purpose of this study was to investigate the relationship between bullying victimization and witnessing and various physical and mental health outcomes and organizational outcomes such as sickness absence, work performance, and job satisfaction.
MethodsThis study used cross-sectional data from a nationally representative, community-based sample of 5,000 Japanese residents aged 20-60. We analyzed data from 1,496 respondents after excluding those not working at the time of the survey and those with missing values. Workplace bullying, psychological distress, physical complaints, and job satisfaction were assessed with the New Brief Job Stress Questionnaire and work performance with the World Health Organization's Health and Work Performance Questionnaire. In addition, subjective health status, physician-diagnosed mental or physical illness, and sickness absence were asked as one item. Hierarchical multiple regression analysis or Poisson regression analysis was conducted to assess the association between victimization/witnessing workplace bullying and health and organizational outcomes.
ResultsBoth victimization and witnessing workplace bullying were significantly associated with psychological distress, physical complaints, subjective poor health, physician-diagnosed mental disorders, and job dissatisfaction. Victimization of workplace bullying was further associated with physician-diagnosed respiratory diseases, sickness absence (≥7 days), and poor work performance. Victims were absent from work for 4.5 more sick days and had 11.2% lower work performance than non-victims.
ConclusionsThe results showed that both victimization and witnessing workplace bullying were significantly associated with physical and mental outcomes and various organizational outcomes. Organizations should implement further measures to prevent personal and organizational losses due to workplace bullying. | epidemiology |
10.1101/2022.03.10.22271914 | The Gastric Cancer Registry: A Genomic Translational Resource for Multidisciplinary Research in Stomach Malignancies | BackgroundGastric cancer (GC) is a leading cause of global cancer morbidity and mortality. Developing information systems which integrate clinical and genomic data may accelerate discoveries to improve cancer prevention, detection, and treatment. To support translational research in GC, we developed the GC Registry (GCR), a North American repository of clinical and cancer genomics data.
MethodsGCR is a national registry with online self-enrollment. Entry criteria into the GCR included the following: (1) diagnosis of GC, (2) history of GC in a first- or second-degree family member or (3) known pathogenic or likely pathogenic germline mutation in the gene CDH1. Participants provided demographic and clinical information through a detailed (412-item) online survey. A subset of participants provided specimens of saliva and tumor samples. These tumor samples underwent exome sequencing, whole genome sequencing and transcriptome sequencing.
ResultsFrom 2011-2021, 567 individuals registered for the GCR and returned the clinical questionnaire. For this cohort, 65% had a personal history of GC, 36% reported a family history of GC and 14% had a germline CDH1 mutation. Eighty-nine GC patients provided tumor tissue samples. For the initial pilot study, 41 tumors were sequenced using next generation sequencing. The data were analyzed for cancer mutations, copy number variations, gene expression, microbiome presence, neoantigens, immune infiltrates, and other features. We developed a searchable, web-based interface (the GCR Genome Explorer) to give researchers access to these datasets.
ConclusionsThe GCR is a unique, North American GC registry which integrates both clinical and genomic annotation.
ImpactAvailable for researchers through an open access, web-based explorer, we hope the GCR Genome Explorer accelerates collaborative GC research across the United States. | genetic and genomic medicine |
10.1101/2022.03.10.22272195 | Comparing survival in urinary tract cancer patients with various second primary malignancies: A population-based study | PurposeWe aimed to identify the patterns and combinations of second primary malignancies (SPMs) observed in patients with malignant neoplasms of the urinary tract (MNUT) and to explore the independent risk factors for survival outcomes in these patients.
Materials and MethodsWe analysed the data of MNUT patients with SPM in 25 hospitals in Shanghai between 2002 and 2015. A life table was used to calculate the survival rates, Kaplan-Meier analysis was used to determine the survival status of MNUT patients, and Cox regression analysis was used to perform multivariate analysis of survival risk factors in MNUT patients with SPM.
ResultsAmong the 154 patients included, the first primary malignancy (PM) most commonly occurred in the bladder (50.65%) and kidney (41.56%), and the SPM most commonly occurred in the lung (22.73%) and stomach (13.64%). The most common combinations included the bladder + lung and bladder + stomach. The Cox regression results showed that age older than 60 years (HR = 2.36 [95% CI 1.30-4.28] vs. age ≤60 years, p = 0.005), TNM 1 stage III+IV disease (HR = 2.19 [95% CI 1.37-4.57] vs. I+II, p = 0.037), TNM 2 stage III+IV disease (HR = 7.43 [95% CI 1.49-19.68] vs. I+II, p < 0.001), and SPM in the lung (HR = 4.36 [95% CI 1.74-18.69], p = 0.047) were associated with a significantly worse cancer-specific survival.
ConclusionThe survival of MNUT patients with SPM may be related to the SPM site, first and second PM staging and latency time. | urology |
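The Kaplan-Meier analysis named in the urology entry above uses the product-limit estimator; a minimal pure-Python sketch on toy data (not the study's data) follows.

```python
def kaplan_meier(times, events):
    """Product-limit survival estimates.
    times: follow-up times; events: 1 = death, 0 = censored.
    Returns a list of (event_time, S(t)) pairs."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    s = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for tt, e in data if tt == t and e == 1)
        removed = sum(1 for tt, e in data if tt == t)
        if deaths:
            s *= 1 - deaths / n_at_risk  # step down at each event time
            curve.append((t, s))
        n_at_risk -= removed
        i += removed
    return curve

# Toy cohort: 5 patients, one censored at t=3
curve = kaplan_meier([1, 2, 3, 4, 5], [1, 1, 0, 1, 1])
```

Censored observations do not step the curve down, but they do shrink the risk set for later event times.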
10.1101/2022.03.11.22271902 | How are declarations of interest working? A cross sectional study in declarations of interest in medical practice in Scotland and England in 2020/2021 | ObjectiveTo understand arrangements for doctors' declarations of interest in Scotland and England in the context of current recommendations.
DesignCross sectional study of a random selection of NHS hospital registers of interest by two independent observers in England, all NHS Boards in Scotland, and a random selection of Clinical Commissioning Groups (CCGs) in England.
SettingNHS Trusts in England (NHSE), NHS Boards in Scotland, CCGs in England, and private healthcare organisations.
ParticipantsRegisters of declarations of interest published in a random sample of 67 of 217 NHS Trusts, a random sample of 15 CCGs in England, registers held by all 14 NHS Scotland boards, and a purposeful selection of private hospitals/clinics in the UK.
Main Outcome MeasuresAdherence to NHSE guidelines on declarations of interests, and comparison in Scotland.
Results76% of registers published by Trusts did not routinely include all declaration of interest categories recommended by NHS England. In NHS Scotland only 14% of Boards published staff registers of interest. Of these employee registers (most obtained under Freedom of Information), 27% contained substantial retractions. In England, 96% of Clinical Commissioning Groups published a Gifts and Hospitality register, with 67% of CCG staff declaration templates and 53% of governor registers containing full standard NHS England declaration categories. Single organisations often held multiple registers lacking enough information to interpret them. Only 35% of NHS Trust registers were organised to enable searching. None of the private sector organisations studied published a comparable declarations of interest register.
ConclusionDespite efforts, the current system of declarations frequently lacks the ability to meaningfully obtain complete declarations of interest from health care professionals.
10.1101/2022.03.11.22272161 | Do recent family physician graduates practice differently? A longitudinal study of primary care visits and continuity in four Canadian provinces | BackgroundComplaints about lack of access to family physicians (FPs) have led to concerns about the role recent physician graduates have had in changes in the supply of primary care services in Canada. This study investigates the impact of career stage, time period, and graduation cohort on family physician practice volume and continuity over two decades.
MethodsRetrospective-cohort study of family physician practice from 1997/98 to 2017/18. Administrative health and physician claims data were collected in British Columbia, Manitoba, Ontario, and Nova Scotia. The study included all physicians registered with their respective provincial regulatory colleges with a medical specialty of family practice and/or billed the provincial health insurance system for patient care as family physicians. Median polish analysis of patient contacts and physician-level continuity was completed to isolate years-in-practice, period, and cohort effects.
ResultsMedian patient contacts per provider fell over time in the four provinces examined. In all four provinces, median contacts increased with years in practice until mid-to-late-career and declined into end-of-career. We found no relationship between graduation cohort and practice volume or FP-level continuity.
InterpretationRecent cohorts of family physicians practice similarly to predecessors in terms of practice volumes and continuity of care. Since FPs of all career stages show declining patient contacts, system-wide solutions to recent challenges in the accessibility of primary care in Canada are needed. | primary care research |
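Median polish, used in the family-physician study above to isolate years-in-practice, period, and cohort effects, alternately sweeps row and column medians out of a two-way table until an overall effect, row effects, column effects, and residuals remain. A compact sketch (toy table, not the study's data):

```python
import statistics

def median_polish(table, iters=10):
    """Tukey's median polish: decompose a 2-D table into
    overall + row effects + column effects + residuals."""
    rows, cols = len(table), len(table[0])
    res = [row[:] for row in table]
    overall, row_eff, col_eff = 0.0, [0.0] * rows, [0.0] * cols
    for _ in range(iters):
        for i in range(rows):  # sweep row medians
            m = statistics.median(res[i])
            row_eff[i] += m
            res[i] = [v - m for v in res[i]]
        m = statistics.median(row_eff)
        overall += m
        row_eff = [v - m for v in row_eff]
        for j in range(cols):  # sweep column medians
            m = statistics.median(res[i][j] for i in range(rows))
            col_eff[j] += m
            for i in range(rows):
                res[i][j] -= m
        m = statistics.median(col_eff)
        overall += m
        col_eff = [v - m for v in col_eff]
    return overall, row_eff, col_eff, res

# Purely additive toy table: cell = 10 + row effect + column effect
ov, re_, ce, res = median_polish([[10, 15], [11, 16], [12, 17]])
```

For a purely additive table like this one, the residuals vanish and each cell is exactly recovered as overall + row effect + column effect.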
10.1101/2022.03.11.22272153 | Analysis of immunization time, amplitude, and adverse events of seven different vaccines against SARS-CoV-2 across four different countries. | BackgroundScarce information exists in relation to the comparison of seroconversion and adverse events following immunization (AEFI) with different SARS-CoV-2 vaccines. Our aim was to correlate the magnitude of the antibody response to vaccination with previous clinical conditions and AEFI.
MethodsA multicentric comparative study where SARS-CoV-2 spike 1-2 IgG antibody titers were measured at baseline, 21-28 days after the first and second dose (when applicable) of the following vaccines: BNT162b2 mRNA, mRNA-1273, Gam-COVID-Vac, Coronavac, ChAdOx1-S, Ad5-nCoV and Ad26.COV2. Mixed model and Poisson generalized linear models were performed.
ResultsWe recruited 1867 subjects [52 (SD 16.8) years old, 52% men]. All vaccines enhanced anti-S1 and anti-S2 IgG antibodies over time (p<0.01). The highest increase after the first and second dose was observed in mRNA-1273 (p<0.001). There was an effect of previous SARS-CoV-2 infection; and an interaction of age with SARS-CoV-2, Gam-COVID-Vac and ChAdOx1-S (p<0.01). There was a negative correlation of Severe or Systemic AEFI (AEs) of naive SARS-CoV-2 subjects with age and sex (p<0.001); a positive interaction between the delta of antibodies with Gam-COVID-Vac (p=0.002). Coronavac, Gam-COVID-Vac and ChAdOx1-S had fewer AEs compared to BNT162b2 (p<0.01). mRNA-1273 had a higher number of AEFIs. The delta of the antibodies showed an association with AEFIs in previously infected individuals (p<0.001).
ConclusionsThe magnitude of seroconversion is predicted by age, vaccine type and SARS-CoV-2 exposure. AEs are correlated with age, sex, and vaccine type. The delta of the antibody response is positively correlated with AEs in patients previously exposed to SARS-CoV-2.
Registration numberNCT05228912 | infectious diseases |
10.1101/2022.03.11.22272140 | High vaccine effectiveness against severe Covid-19 in the elderly in Finland before and after the emergence of Omicron | BackgroundThe elderly are highly vulnerable to severe Covid-19. Waning immunity and emergence of Omicron have caused concerns about reduced effectiveness of Covid-19 vaccines. The objective was to estimate vaccine effectiveness (VE) against severe Covid-19 among the elderly.
MethodsThis nationwide, register-based cohort study included all residents aged 70 years and over in Finland. The follow-up started on December 27, 2020, and ended on February 19, 2022. The study outcomes were Covid-19-related hospitalization and intensive care unit (ICU) admission temporally associated with SARS-CoV-2 infection. VE was estimated as 1 minus the hazard ratio comparing the vaccinated and unvaccinated and taking into account time since vaccination. Omicron-specific VE was evaluated as the effectiveness observed since January 01, 2022.
ResultsThe cohort included 897,932 individuals. Comirnaty (BioNTech/Pfizer) VE against Covid-19-related hospitalization was 93% (95% confidence interval [CI], 90%-95%) and 87% (84%-89%) 14-90 and 91-180 days after the second dose; VE increased to 96% (95%-97%) 14-60 days after the third dose. VE of other homologous and heterologous 3-dose series was similar. Protection against severe Covid-19 requiring ICU treatment was even better. Since January 01, 2022, Comirnaty VE was 91% (95% CI, 79%-96%) and 76% (56%-86%) 14-90 and 91-180 days after the second and 95% (94%-97%) 14-60 days after the third dose.
ConclusionsVE against severe Covid-19 is high among the elderly. It waned slightly after 2 doses, but a third restored the protection. VE against severe Covid-19 remained high even after the emergence of Omicron. | infectious diseases |
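The VE definition used above (1 minus the hazard ratio from a Cox model) can be illustrated with a simplified stand-in: when event rates are roughly constant over follow-up, the hazard ratio is approximated by the incidence rate ratio per person-time. A minimal sketch with hypothetical numbers, not the study's data or its regression adjustment:

```python
# Illustrative only: VE computed as 1 - IRR, where the incidence rate ratio
# per person-time approximates the Cox hazard ratio under roughly constant
# rates. All counts and person-years below are invented.

def incidence_rate(events, person_years):
    """Events per person-year of follow-up."""
    return events / person_years

def vaccine_effectiveness(events_vax, py_vax, events_unvax, py_unvax):
    """VE = 1 - IRR (vaccinated:unvaccinated rate ratio)."""
    irr = incidence_rate(events_vax, py_vax) / incidence_rate(events_unvax, py_unvax)
    return 1.0 - irr

# Hypothetical cohort: 12 hospitalisations over 40,000 person-years among the
# vaccinated vs 90 over 30,000 person-years among the unvaccinated.
ve = vaccine_effectiveness(12, 40_000, 90, 30_000)
print(f"VE = {ve:.1%}")  # -> VE = 90.0%
```

A full analysis would additionally adjust for covariates and time since vaccination, as the study describes.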
10.1101/2022.03.11.22272269 | Association of frailty, age, and biological sex with SARS-CoV-2 mRNA vaccine-induced immunity in older adults | BackgroundMale sex and old age are risk factors for severe COVID-19, but the intersection of sex and aging on antibody responses to SARS-CoV-2 vaccines has not been characterized.
MethodsPlasma samples were collected from older adults (75-98 years) before and after three doses of SARS-CoV-2 mRNA vaccination, and from younger adults (18-74 years) post-dose two, for comparison. Antibody binding to SARS-CoV-2 antigens (spike protein [S], S-receptor binding domain [S-RBD], and nucleocapsid [N]) and functional activity against S were measured against the vaccine virus and variants of concern (VOC).
ResultsVaccination induced greater antibody titers in older females than males, with both age and frailty associated with reduced antibody responses to vaccine antigens in males, but not females. ACE2 binding inhibition declined more than anti-S or anti-S-RBD IgG in the six months following the second dose (28-fold vs. 12- and 11-fold decreases in titer). The third dose restored functional antibody responses and eliminated disparities caused by sex, age, and frailty in older adults. Responses to the VOC were significantly reduced relative to the vaccine virus, with older males having lower titers to the VOC than females. Older adults had lower responses to the vaccine and VOC viruses than younger adults, with disparities being greater in males than females.
ConclusionOlder and frail males may be more vulnerable to breakthrough infections due to low antibody responses before receipt of a third vaccine dose. Promoting third dose coverage in older adults, especially males, is crucial to protecting this vulnerable population.
Brief summarySARS-CoV-2 mRNA vaccination induces greater antibody response in older females than males, and age and frailty reduce responses in males only. These effects are eliminated by a third vaccine dose, highlighting the need for third dose coverage, especially in older males. | infectious diseases |
10.1101/2022.03.11.22272263 | Surveillance of COVID-19 cases associated with dental settings using routine health data from the East of Scotland with a description of efforts to break chains of transmission from October 2020 to December 2021. | IntroductionDental settings have been considered high-risk settings for COVID-19. A Dental Public Health Team in South East Scotland have worked to risk-assess the situation timeously to break chains of transmission.
AimTo present routine data produced from a contact tracing service for COVID-19 cases in the dental setting with a focus on transmission.
DesignObservational retrospective analysis of a routine data set of COVID-19 cases associated with a dental setting reported via the national contact tracing system for two health board areas in the east of Scotland.
MethodsCOVID-19 cases were confirmed by PCR testing. Descriptive statistics are used to summarise the data collected over a 13-month period (Oct 2020-Dec 2021). A narrative presents themes identified during contact tracing that led to transmission within a dental setting and includes a case study.
ResultsA total of 811 incidents are included. No evidence of staff to patient transmission or vice versa was found in this study. Staff to staff transmission occurred in non-clinical areas, contributing to 33% of total staff cases.
ConclusionTransmission of COVID-19 in a dental setting in the context of this study appears to be confined to non-clinical areas. Future pandemic plans should include tools to aid with implementation of guidance in non-clinical areas.
In brief points
- Outbreaks of COVID-19 in a dental setting appear to be confined to the non-clinical areas of dental practices.
- We have found no evidence of staff to patient transmission or vice versa using our contact tracing methods.
- Future pandemic preparedness would benefit from including current quality improvement tools to aid with implementation of new standard operating procedures and other regularly changing guidance. | dentistry and oral medicine |
10.1101/2022.03.11.22271912 | External validation of risk scores to predict in-hospital mortality in patients hospitalized due to coronavirus disease 2019. | BackgroundThe coronavirus disease 2019 (COVID-19) presents an urgent threat to global health. Prediction models that accurately estimate mortality risk in hospitalized patients could assist medical staff in treatment and allocating limited resources.
AimsTo externally validate two promising previously published risk scores that predict in-hospital mortality among hospitalized COVID-19 patients.
MethodsTwo cohorts were available: a cohort of 1028 patients admitted to one of nine hospitals in Lombardy, Italy (the Lombardy cohort) and a cohort of 432 patients admitted to a hospital in Leiden, the Netherlands (the Leiden cohort). The primary endpoint was in-hospital mortality. All patients were adults who tested PCR-positive for COVID-19. Model discrimination and calibration were assessed.
ResultsThe C-statistic of the 4C mortality score was good in the Lombardy cohort (0.85, 95% CI: 0.82-0.89) and in the Leiden cohort (0.87, 95% CI: 0.80-0.94). Model calibration was acceptable in the Lombardy cohort but poor in the Leiden cohort due to the model systematically overpredicting the mortality risk for all patients. The C-statistic of the CURB-65 score was good in the Lombardy cohort (0.80, 95% CI: 0.75-0.85) and in the Leiden cohort (0.82, 95% CI: 0.76-0.88). The mortality rate in the CURB-65 development cohort was much lower than the mortality rate in the Lombardy cohort. A similar but less pronounced trend was found for patients in the Leiden cohort.
ConclusionAlthough performances did not differ greatly, the 4C mortality score showed the best performance. However, because of quickly changing circumstances, model recalibration may be necessary before using the 4C mortality score. | emergency medicine |
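The C-statistic used above to assess discrimination has a simple interpretation: the probability that a randomly chosen patient who died was assigned a higher predicted risk than a randomly chosen patient who survived, with ties counting 0.5. A minimal pure-Python sketch with invented risk scores, not study data:

```python
# C-statistic (concordance index / AUC for a binary outcome) computed by
# pairwise comparison. Risks and outcomes below are hypothetical.

def c_statistic(risks, outcomes):
    """risks: predicted mortality risks; outcomes: 1 = died, 0 = survived."""
    pos = [r for r, y in zip(risks, outcomes) if y == 1]
    neg = [r for r, y in zip(risks, outcomes) if y == 0]
    concordant = sum(1.0 if p > n else 0.5 if p == n else 0.0
                     for p in pos for n in neg)
    return concordant / (len(pos) * len(neg))

risks    = [0.9, 0.8, 0.7, 0.4, 0.3, 0.2]
outcomes = [1,   1,   0,   1,   0,   0]
print(round(c_statistic(risks, outcomes), 3))  # -> 0.889
```

A value of 0.5 indicates no discrimination and 1.0 perfect discrimination, which is why the 0.80-0.87 range reported above is described as "good".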
10.1101/2022.03.11.22272264 | Daily Rapid Antigen Testing in a University Setting to Inform COVID-19 Isolation Duration Policy | ImportanceThe suitability of the currently recommended 5-day COVID-19 isolation period remains unclear in an Omicron-dominant landscape. Early data suggest high positivity via rapid antigen test beyond day 5, but evidence gaps remain regarding optimal isolation duration and the best use of limited RATs to exit isolation.
ObjectiveTo determine the percentage of SARS-CoV-2 infected persons who remain positive via RAT on isolation day 5+ and assess possible factors associated with isolation duration.
DesignWe evaluated daily rapid antigen test case series data from 324 persons in a managed isolation program who initially tested positive between January 1 and February 11, 2022, an Omicron-dominant period. Arrival tests and twice-weekly screening were mandated. Positive persons isolated and began mandatory daily self-testing on day 5 until testing negative. Trained staff proctored exit testing.
SettingA mid-sized university in the United States.
ParticipantsUniversity students in isolation.
Main Outcomes and MeasuresThe percentage of persons remaining positive on isolation day 5 and each subsequent day. The association between possible prognostic factors and isolation duration as measured by event-time-ratios (ETR).
ResultsWe found that 47% of twice-weekly screeners and 26-28% of less frequent screeners remained positive on day 5, with the percentage approximately halving each additional day. Having a negative test ≥10 days before diagnosis (ETR 0.85 (95% CI 0.75-0.96)) and prior infection > 90 days (ETR 0.50 (95% CI 0.33-0.76)) were significantly associated with shorter isolation. Symptoms before or at diagnosis (ETR 1.13 (95% CI 1.02-1.25)) and receipt of 3 vaccine doses (ETR 1.20 (95% CI 1.04-1.39)) were significantly associated with prolonged isolation. However, these factors were associated with duration of isolation, not infection, and could reflect how early infections were detected.
Conclusions and RelevanceA high percentage of university students during an Omicron-dominant period remained positive after the currently recommended 5-day isolation, highlighting possible onward transmission risk. Persons diagnosed early in their infections or using symptom onset as their isolation start may particularly require longer isolations. Significant factors associated with isolation duration should be further explored to determine relationships with infection duration.
Key PointsQuestionWhat percentage of SARS-CoV-2 infected persons remain positive via rapid antigen test on days 5+ of isolation?
FindingsIn this case series of 324 university students, 47% of twice-weekly screeners and 26-28% of less frequent screeners remained positive via rapid antigen on isolation day 5, with the percent still positive approximately halving with each subsequent day.
MeaningWhile isolation duration decisions are complex, our study adds to growing evidence that a 5-day isolation may be 1-2 days too short to sufficiently reduce the onward transmission risk, particularly for those in dense settings or among vulnerable populations. | epidemiology |
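The pattern reported above, with the still-positive fraction approximately halving each day after day 5, corresponds to a simple geometric decay. A sketch using the reported 47% day-5 figure for twice-weekly screeners; the constant halving ratio beyond the observed days is an assumption made only for illustration:

```python
# Geometric decay of the fraction still RAT-positive: starting from the
# reported 47% on day 5 and assuming the fraction halves each further day.

def fraction_positive(day, day5_fraction=0.47, daily_ratio=0.5):
    """Approximate fraction still RAT-positive on isolation day >= 5."""
    return day5_fraction * daily_ratio ** (day - 5)

for day in range(5, 9):
    print(day, f"{fraction_positive(day):.3f}")
```

Under this rough model, extending isolation by one to two days roughly halves or quarters the share of still-positive persons released, which is the arithmetic behind the "1-2 days too short" conclusion.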
10.1101/2022.03.11.22271527 | SARS-CoV-2 reinfections during the first three major COVID-19 waves in Bulgaria | BackgroundThe COVID-19 pandemic has had a devastating impact on the world over the past two years (2020-2021). One of the key questions about its future trajectory is the protection from subsequent infections and disease conferred by a previous infection, as the SARS-CoV-2 virus belongs to the coronaviruses, a group of viruses whose members are known for their ability to reinfect convalescent individuals. Bulgaria, with high rates of previous infections combined with low vaccination rates and an elderly population, presents a distinctive context to study this question.
MethodsWe use detailed governmental data on registered COVID-19 cases to evaluate the incidence and outcomes of COVID-19 reinfections in Bulgaria in the period between March 2020 and early December 2021.
ResultsFor the period analyzed, a total of 4,106 cases of individuals infected more than once were observed, including 31 cases of three infections and one of four infections. The number of reinfections increased dramatically during the Delta variant-driven wave of the pandemic towards the end of 2021. We observe a moderate reduction of severe outcomes (hospitalization and death) in reinfections relative to primary infections, and a more substantial reduction of severe outcomes in breakthrough infections in vaccinated individuals.
ConclusionsIn the available datasets from Bulgaria, prior infection appears to provide some protection from severe outcomes, but to a lower degree than the reduction in severity of breakthrough infections in the vaccinated compared to primary infections in the unvaccinated. | epidemiology |
10.1101/2022.03.12.22271872 | The impact of COVID-19 non-pharmaceutical interventions on future respiratory syncytial virus transmission in South Africa | In response to the COVID-19 pandemic, the South African government employed various nonpharmaceutical interventions (NPIs) in order to reduce the spread of SARS-CoV-2. In addition to mitigating transmission of SARS-CoV-2, these public health measures have also functioned in slowing the spread of other endemic respiratory pathogens. Surveillance data from South Africa indicates low circulation of respiratory syncytial virus (RSV) throughout the 2020-2021 Southern Hemisphere winter seasons. Here we fit age-structured epidemiological models to national surveillance data to predict the 2022 RSV outbreak following two suppressed seasons. We project a 32% increase in the peak number of monthly hospitalizations among infants [≤] 2 years, with older infants (6-23 month olds) experiencing a larger portion of severe disease burden than typical. Our results suggest that hospital system readiness should be prepared for an intense RSV season in early 2022. | public and global health |
10.1101/2022.03.11.22272272 | Portable, Low-Field Magnetic Resonance Imaging Sensitively Detects and Accurately Quantifies Multiple Sclerosis Lesions | Magnetic resonance imaging is a fundamental tool in the diagnosis and management of neurological diseases such as multiple sclerosis (MS). New portable, low-field MRI scanners could potentially lower financial and technical barriers to neuroimaging and reach underserved or disabled populations. However, the sensitivity of low-field MRI for MS lesions is unknown. We sought to determine if white matter lesions can be detected on a 64mT low-field MRI, compare automated lesion segmentations and total lesion burden between paired 3T and 64mT scans, and identify features that contribute to lesion detection accuracy. In this prospective, cross-sectional study, same-day brain MRI (FLAIR, T1, and T2) scans were collected from 36 adults (32 women; mean age, 50 ± 14 years) with known or suspected MS using 3T (Siemens) and 64mT (Hyperfine) scanners at two centers. Images were reviewed by neuroradiologists. MS lesions were measured manually and segmented using an automated algorithm. Statistical analyses assessed accuracy and variability of segmentations across scanners and systematic scanner biases in automated volumetric measurements. Lesions were identified on 64mT scans in 94% (31/33) of patients with confirmed MS. The smallest lesions manually detected were 5.7 ± 1.3 mm in maximum diameter at 64mT vs 2.1 ± 0.6 mm at 3T. Automated lesion burden estimates were highly correlated between 3T and 64mT scans (r = 0.89, p < 0.001). Bland-Altman analysis identified bias in 64mT segmentations (mean = 1.6 ml, standard error = 5.2 ml, limits of agreement = -19.0 to 15.9 ml), which over-estimated low lesion burden and under-estimated high burden (r = 0.74, p < 0.001). Visual inspection revealed over-segmentation was driven by flow-related hyperintensities in veins on 64mT FLAIR.
Lesion size drove segmentation accuracy, with 93% of lesions >1.0 ml and all lesions >1.5 ml being detected. These results demonstrate that in established MS, a portable 64mT MRI scanner can identify white matter lesions, and disease burden estimates are consistent with 3T scans.
Highlights
- Paired, same-day 3T and 64mT MRI studies were collected in 36 patients
- 64mT MRI showed 94% sensitivity for detecting any lesions in established MS cases
- The diameter of the smallest detected lesion was larger at 64mT compared to 3T
- Disease burden estimates were strongly correlated between 3T and 64mT scans
- Low-field MRI can detect white matter lesions, though smaller lesions may be missed | radiology and imaging |
10.1101/2022.03.11.22272039 | Clinical interpretation of machine learning models for prediction of diabetic complications using electronic health records | Type 2 diabetes is a massive public health issue that continues to grow. The rate of diabetic complication progression varies across individuals. Understanding factors that alter the rate of complication onset may uncover new clinical interventions and help prioritize individuals for more aggressive management. Here, we explore how various machine learning models and types of electronic health records can predict fast versus slow diabetic complication onset using only patient data prior to diabetes diagnosis. We find that optimized random forests generally perform best among the tested models and combining all data sources yields the best predictive performance. A key differentiator of our study is our model interpretation, which identifies specific patient metrics from each dataset that play a unique role in the progression of each complication. Overall, our clinical interpretation of machine learning models can identify patients at risk for poorer outcomes years in advance of their diabetic complication. | health informatics |
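One common way to identify which patient metrics "play a unique role" in a model's predictions, as the abstract above describes, is permutation importance: shuffle one feature's values across patients and measure the drop in predictive accuracy. A toy sketch; the model, features, and data below are invented for illustration and are not the study's EHR-based random forests:

```python
import random

# Permutation importance for a toy classifier: the accuracy drop when one
# feature column is shuffled across patients. A feature the model ignores
# gets importance 0. All data here are hypothetical.

def accuracy(model, X, y):
    return sum(model(x) == yi for x, yi in zip(X, y)) / len(y)

def permutation_importance(model, X, y, feature_idx, seed=0):
    """Accuracy drop when feature `feature_idx` is shuffled across rows."""
    rng = random.Random(seed)
    col = [x[feature_idx] for x in X]
    rng.shuffle(col)
    X_perm = [list(x) for x in X]
    for row, v in zip(X_perm, col):
        row[feature_idx] = v
    return accuracy(model, X, y) - accuracy(model, X_perm, y)

# Toy rule: predict fast complication onset (1) if HbA1c (feature 0) > 8.
model = lambda x: 1 if x[0] > 8 else 0
X = [[9, 120], [7, 180], [10, 140], [6, 160]]   # [HbA1c, weight]
y = [1, 0, 1, 0]
print(permutation_importance(model, X, y, feature_idx=0),
      permutation_importance(model, X, y, feature_idx=1))
```

The weight feature, which the toy model never reads, always scores zero importance, mirroring how the technique separates metrics that drive a prediction from those that do not.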
10.1101/2022.03.11.22272282 | Development and Implementation of a Simple and Rapid Extraction-Free Saliva SARS-CoV-2 RT-LAMP Workflow for Workplace Surveillance | Effective management of the COVID-19 pandemic requires widespread and frequent testing of the population for SARS-CoV-2 infection. Saliva has emerged as an attractive alternative to nasopharyngeal samples for surveillance testing as it does not require specialized personnel or materials for its collection and can be easily provided by the patient. We have developed a simple, fast, and sensitive saliva-based testing workflow that requires minimal sample treatment and equipment. After sample inactivation, RNA is quickly released and stabilized in an optimized buffer, followed by reverse transcription loop-mediated isothermal amplification (RT-LAMP) and detection of positive samples using a colorimetric and/or fluorescent readout. The workflow was optimized using 1,670 negative samples collected from 172 different individuals over the course of 6 months. Each sample was spiked with 50 copies/L of inactivated SARS-CoV-2 virus to monitor the efficiency of viral detection. Using pre-defined clinical samples, the test was determined to be 100% specific and 97% sensitive, with a limit of detection comparable to commercially available RT-qPCR-based diagnostics. The method was successfully implemented in a CLIA laboratory setting for workplace surveillance and reporting. From April 2021-February 2022, more than 30,000 self-collected samples from 755 individuals were tested and 85 employees tested positive, mainly during December and January, consistent with high infection rates in Massachusetts and nationwide. The rapid identification and isolation of infected individuals with trace viral loads before symptom onset minimized viral spread in the workplace. | infectious diseases |
10.1101/2022.03.11.22272185 | SEROLOGICAL TESTING OF BLOOD DONORS TO CHARACTERISE THE IMPACT OF COVID-19 IN MELBOURNE, AUSTRALIA, 2020 | Rapidly identifying and isolating people with acute SARS-CoV-2 infection has been a core strategy to contain COVID-19 in Australia, but a proportion of infections go undetected. We estimated SARS-CoV-2 specific antibody prevalence (seroprevalence) among blood donors in metropolitan Melbourne following a COVID-19 outbreak in the city between June and September 2020. The aim was to determine the extent of infection spread and whether seroprevalence varied demographically in proportion to reported cases of infection. The design involved stratified sampling of residual specimens from blood donors (aged 20-69 years) in three postcode groups defined by low (<3 cases/1,000 population), medium (3-7 cases/1,000 population) and high (>7 cases/1,000 population) COVID-19 incidence based on case notification data. All specimens were tested using the Wantai SARS-CoV-2 total antibody assay. Seroprevalence was estimated with adjustment for test sensitivity and specificity for the Melbourne metropolitan blood donor and residential populations, using multilevel regression and poststratification. Overall, 4,799 specimens were collected between 23 November and 17 December 2020. Seroprevalence for blood donors was 0.87% (90% credible interval: 0.25-1.49%). The highest estimates, of 1.13% (0.25-2.15%) and 1.11% (0.28-1.95%), respectively, were observed among donors living in the lowest socioeconomic areas (Quintiles 1 and 2) and lowest at 0.69% (0.14-1.39%) among donors living in the highest socioeconomic areas (Quintile 5). When extrapolated to the Melbourne residential population, overall seroprevalence was 0.90% (0.26-1.51%), with estimates by demography groups similar to those for the blood donors. 
The results suggest a lack of extensive community transmission and good COVID-19 case ascertainment based on routine testing during Victoria's second epidemic wave. Residual blood donor samples provide a practical epidemiological tool for estimating seroprevalence and information on population patterns of infection, against which the effectiveness of ongoing responses to the pandemic can be assessed. | infectious diseases |
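Adjusting an apparent seroprevalence for assay sensitivity and specificity, the correction the abstract above describes before its multilevel regression and poststratification step, is classically done with the Rogan-Gladen estimator. A sketch with hypothetical figures; the Wantai assay's actual operating characteristics are not assumed here:

```python
# Rogan-Gladen estimator: recover true prevalence from the apparent
# (test-positive) prevalence given test sensitivity and specificity.
# All inputs below are hypothetical, not the study's values.

def rogan_gladen(apparent_prev, sensitivity, specificity):
    """True prevalence from apparent prevalence, clamped to [0, 1]."""
    adjusted = (apparent_prev + specificity - 1) / (sensitivity + specificity - 1)
    return min(max(adjusted, 0.0), 1.0)

# Hypothetical: 1.2% of donors test positive on an assay with 95% sensitivity
# and 99.5% specificity.
print(f"{rogan_gladen(0.012, 0.95, 0.995):.4f}")  # -> 0.0074
```

Note how, at prevalences below 1%, even a 0.5% false-positive rate accounts for a large share of the apparent positives, which is why this adjustment matters for low-seroprevalence surveys like the one above.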
10.1101/2022.03.11.22272259 | Unsupervised clustering reveals phenotypes of AKI in ICU Covid19 patients | BackgroundAcute Kidney Injury (AKI) is a very frequent condition, occurring in about one in three patients admitted to an intensive care unit (ICU). AKI is a syndrome defined as a sudden decrease in glomerular filtration rate. However, this unified definition does not reflect the various mechanisms involved in AKI pathophysiology, each with its own characteristics and sensitivity to therapy. In this study, we aimed at developing an innovative machine learning based method able to subphenotype AKI according to its pattern of risk factors.
MethodsWe adopted a three-step pipeline of analyses. Firstly, we looked for factors associated with AKI using a generalized additive model. Secondly, we calculated the importance of each identified AKI-related factor in the estimated AKI risk to find the main risk factor for AKI at the single-patient level. Lastly, we clustered AKI patients according to their profile of risk factors and compared the clinical characteristics and outcome of every cluster. We applied this method to a cohort of severe Covid19 patients hospitalized in the ICU of Geneva University Hospitals.
ResultsAmong the 250 patients analyzed, we found ten factors associated with AKI development. Using the individual expression of these factors, we identified three groups of AKI patients, based on the use of lopinavir/ritonavir, prior history of diabetes mellitus, baseline eGFR, and ventilation. The three clusters expressed distinct characteristics in terms of AKI severity and recovery, metabolic patterns, and ICU mortality.
ConclusionWe propose here a new method to phenotype AKI patients according to their most important individual risk factors for AKI development. When applied to an ICU cohort of Covid19 patients, we were able to differentiate three groups of patients. Each expressed specific AKI characteristics and outcomes, which probably reflects a distinct pathophysiology. | intensive care and critical care medicine |
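The final step of the pipeline above, clustering patients by their profile of risk-factor importances, can be sketched with a plain k-means over per-patient importance vectors. The data, features, and fixed initial centroids below are invented for illustration; the abstract does not state which clustering algorithm was used:

```python
# Minimal k-means over hypothetical per-patient risk-factor importance
# vectors, with fixed initial centroids so the run is deterministic.

def kmeans(points, centroids, iters=20):
    """Plain k-means: assign points to nearest centroid, then recompute."""
    for _ in range(iters):
        groups = [[] for _ in centroids]
        for p in points:
            dists = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centroids]
            groups[dists.index(min(dists))].append(p)
        centroids = [
            tuple(sum(xs) / len(xs) for xs in zip(*g)) if g else c
            for g, c in zip(groups, centroids)
        ]
    return centroids, groups

# Each row: (importance of diabetes, importance of baseline eGFR) per patient.
patients = [(0.9, 0.1), (0.8, 0.2), (0.85, 0.15), (0.1, 0.9), (0.2, 0.8)]
centroids, groups = kmeans(patients, centroids=[(1.0, 0.0), (0.0, 1.0)])
print([len(g) for g in groups])  # -> [3, 2]
```

Patients whose AKI risk is dominated by the same factors land in the same cluster, which is the sense in which each cluster may "reflect a distinct pathophysiology".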
10.1101/2022.03.11.22269298 | Repetitive Transcranial Magnetic Stimulation for Tobacco Treatment in Cancer Patients: A Preliminary Report of a One-Week Treatment | BackgroundSmoking cessation represents a significant opportunity to improve cancer survival rates, reduce the risk of cancer treatment complications, and improve quality of life. However, about half of cancer patients who smoke continue to smoke despite the availability of several treatments. Previous studies demonstrate that repetitive transcranial magnetic stimulation (rTMS) over the left dorsolateral prefrontal cortex (DLPFC) decreases cue craving, reduces cigarette consumption, and increases the quit rate in tobacco use disorder. We investigated whether 5 sessions of rTMS can be safely and efficaciously used for smoking cessation in cancer patients.
MethodsWe enrolled 11 treatment-seeking smokers with cancer (> 5 cigarettes per day) in a randomized, double-blind, sham-controlled proof-of-concept study. Participants received 5 daily sessions of active 10Hz rTMS of the left DLPFC (3000 pulses per session) or sham rTMS, and were followed up for 1 month via phone assessments. Main outcomes included reductions in the number of smoked-cigarettes per day (primary) and craving (secondary). Adverse effects were reported daily by participants.
ResultsSeven of 11 participants completed 5 sessions of rTMS over one week. Compared to sham treatment (n = 4), the active rTMS group (n = 3) exhibited modest effects over time on smoking (Cohen's f2 effect size of 0.16) and large effects on cue craving (Cohen's f2 = 0.40). No serious side effects related to rTMS were reported during treatment.
ConclusionsFive sessions of daily rTMS over the left DLPFC might benefit cancer patients who smoke cigarettes. However, further evidence is needed to determine with more certainty its therapeutic effect and adverse effects for cancer patients who smoke cigarettes. | addiction medicine |
10.1101/2022.03.11.22272276 | Risk factors for SARS-CoV-2 infection after primary vaccination with ChAdOx1 nCoV-19 or BNT162b2 and after booster vaccination with BNT162b2 or mRNA-1273: a population-based cohort study (COVIDENCE UK) | BackgroundLittle is known about the relative influence of demographic, behavioural, and vaccine-related factors on risk of post-vaccination SARS-CoV-2 infection. We aimed to identify risk factors for SARS-CoV-2 infection after primary and booster vaccinations.
MethodsWe undertook a prospective population-based study in UK adults (≥16 years) vaccinated against SARS-CoV-2, including data from Jan 12, 2021, to Feb 21, 2022. We modelled risk of post-vaccination SARS-CoV-2 infection separately for participants who had completed a primary course of vaccination (two-dose or, in the immunosuppressed, three-dose course of either ChAdOx1 nCoV-19 [ChAdOx1] or BNT162b2) and for those who had additionally received a booster dose (BNT162b2 or mRNA-1273). Cox regression models were used to explore associations between sociodemographic, behavioural, clinical, pharmacological, and nutritional factors and breakthrough infection, defined as a self-reported positive result on a lateral flow or reverse transcription PCR (RT-PCR) test for SARS-CoV-2. Models were further adjusted for weekly SARS-CoV-2 incidence at the local (lower tier local authority) level.
Findings14,713 participants were included in the post-primary analysis and 10,665 in the post-booster analysis, with a median follow-up of 203 days (IQR 195-216) in the post-primary cohort and 85 days (66-103) in the post-booster cohort. 1051 (7.1%) participants in the post-primary cohort and 1009 (9.4%) participants in the post-booster cohort reported a breakthrough SARS-CoV-2 infection. A primary course of ChAdOx1 (vs BNT162b2) was associated with higher risk of infection, both in the post-primary cohort (adjusted hazard ratio 1.63, 95% CI 1.41-1.88) and in the post-booster cohort after boosting with mRNA-1273 (1.29 [1.03-1.61] vs BNT162b2 primary plus BNT162b2 booster). A lower risk of breakthrough infection was associated with older age (post-primary: 0.96 [0.96-0.97] per year; post-booster: 0.97 [0.96-0.98]), whereas a higher risk of breakthrough infection was associated with lower levels of education (post-primary: 1.66 [1.35-2.06] for primary or secondary vs postgraduate; post-booster: 1.36 [1.08-1.71]) and at least three weekly visits to indoor public places (post-primary: 1.38 [1.15-1.66] vs none; post-booster: 1.33 [1.10-1.60]).
ConclusionsVaccine type, socioeconomic status, age, and behaviours affect risk of breakthrough SARS-CoV-2 infection following a primary schedule and a booster dose.
Research in contextEvidence before this studyWe searched PubMed, medRxiv, and Google Scholar for papers published up to Feb 18, 2022, using the search terms (breakthrough OR post-vaccin*) AND (SARS-CoV-2 OR COVID) AND (disease OR infection) AND (determinant OR "risk factor" OR associat*), with no language restrictions. Existing studies on risk factors for breakthrough SARS-CoV-2 infection among vaccinated individuals have found associations with age, comorbidities, vaccine type, and previous infection; however, findings have been inconsistent across studies. Most studies have been limited to specific subgroups or have focused on severe outcomes, and very few have considered breakthrough infections after a booster dose or have adjusted for behaviours affecting exposure to other people.
Added value of this studyThis study is among the first to provide a detailed analysis of a wide range of risk factors for breakthrough SARS-CoV-2 infection, both after the primary course of vaccination and after a booster dose. Our large study size and detailed data have allowed us to investigate associations with various sociodemographic, clinical, pharmacological, and nutritional factors. Monthly follow-up data have additionally given us the opportunity to consider the effects of behaviours that may have changed across the pandemic, while adjusting for local SARS-CoV-2 incidence.
Implications of all the available evidenceOur findings add to growing evidence that risk factors for SARS-CoV-2 infection after primary or booster vaccinations can differ from those in unvaccinated populations, with effects attenuated for previously observed risk factors such as body-mass index and Asian ethnicity. The clear difference we observed between the efficacies of ChAdOx1 and BNT162b2 as the primary course of vaccination appears to have been reduced by the use of BNT162b2 boosters, but not by mRNA-1273 boosters. As more countries introduce booster vaccinations, future population-based studies with longer follow-up will be needed to investigate our findings further. | epidemiology
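The hazard ratios in this entry come from adjusted Cox proportional hazards models. As a simplified illustration of the underlying rate comparison, the sketch below computes a crude incidence rate ratio from hypothetical person-time data (all numbers are invented for illustration; a crude ratio is not the study's covariate-adjusted estimate):

```python
def incidence_rate_ratio(events_a, person_days_a, events_b, person_days_b):
    """Crude incidence rate ratio: event rate in group A over that in reference B."""
    rate_a = events_a / person_days_a
    rate_b = events_b / person_days_b
    return rate_a / rate_b

# Hypothetical person-time for ChAdOx1 vs BNT162b2 primary-course recipients:
irr = incidence_rate_ratio(events_a=652, person_days_a=800_000,
                           events_b=399, person_days_b=798_000)
print(round(irr, 2))  # 1.63
```

A Cox model additionally handles censoring and adjusts for covariates such as age, education, and visit frequency; the crude ratio above only conveys the direction and rough magnitude of an effect.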
10.1101/2022.03.11.22272285 | Optimal dosing of cefotaxime and desacetylcefotaxime for critically ill paediatric patients. Can we use microsampling? | SynopsisObjectivesThis study aimed to describe the population pharmacokinetics of cefotaxime and desacetylcefotaxime in critically ill paediatric patients and provide dosing recommendations. We also sought to evaluate the use of capillary microsampling to facilitate data-rich blood sampling.
MethodsPatients were recruited into a pharmacokinetic study, with cefotaxime and desacetylcefotaxime concentrations from plasma samples collected at 0, 0.5, 2, 4 and 6 h used to develop a population pharmacokinetic model using Pmetrics. Monte Carlo dosing simulations were performed across a range of estimated glomerular filtration rates (60, 100, 170 and 200 mL/min/1.73 m2) and body weights (4, 10, 15, 20 and 40 kg) to assess achievement of PK/PD targets, including 100% fT>MIC with an MIC breakpoint of 1 mg/L.
Results36 patients (0.2-12 y) provided 160 conventional samples for inclusion in the model. The pharmacokinetics of cefotaxime and desacetylcefotaxime were best described using a one-compartment model with first-order elimination. The clearance and volume of distribution for cefotaxime were 12.8 L/h and 39.4 L, respectively. The clearance for desacetylcefotaxime was 10.5 L/h. Standard dosing of 50 mg/kg Q6h was only able to achieve the PK/PD target of 100% fT>MIC in patients > 10 kg and with impaired renal function or patients of 40 kg with normal renal function.
ConclusionsDosing recommendations support the use of extended or continuous infusion to achieve cefotaxime exposure suitable for bacterial killing in critically ill paediatric patients, including those with severe or deep-seated infection. An external validation of capillary microsampling demonstrated that skin-prick sampling can facilitate data-rich pharmacokinetic studies. | infectious diseases
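As a sketch of how such Monte Carlo target-attainment simulations work, the one-compartment model below computes the fraction of a steady-state dosing interval spent above the MIC, assuming typical values of clearance ≈ 12.8 L/h and volume ≈ 39.4 L, and treating each dose as an instantaneous bolus (the actual analysis used Pmetrics, infusion inputs, and sampled between-subject variability):

```python
import math

def ft_above_mic(daily_dose_mg, n_doses, cl_l_h, v_l, mic_mg_l=1.0, dt=0.01):
    """Fraction of a steady-state dosing interval with concentration > MIC
    for a one-compartment IV bolus model with first-order elimination."""
    k = cl_l_h / v_l                      # elimination rate constant (1/h)
    dose = daily_dose_mg / n_doses
    tau = 24.0 / n_doses                  # dosing interval (h)
    # steady-state peak after repeated boluses
    c_peak = (dose / v_l) / (1.0 - math.exp(-k * tau))
    steps = int(tau / dt)
    above = sum(1 for i in range(steps)
                if c_peak * math.exp(-k * i * dt) > mic_mg_l)
    return above / steps

# Hypothetical 40 kg patient on 50 mg/kg Q6h (8000 mg/day over 4 doses):
print(ft_above_mic(8000, 4, cl_l_h=12.8, v_l=39.4))  # 1.0 -> 100% fT>MIC
```

Wrapping this in a loop that samples CL and V from their population distributions gives the probability of target attainment for each candidate regimen.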
10.1101/2022.03.13.22272311 | A Hybrid Pipeline for Covid-19 Screening Incorporating Lungs Segmentation and Wavelet Based Preprocessing of Chest X-Rays | We have developed a two-module pipeline for the detection of SARS-CoV-2 from chest X-rays (CXRs). Module 1 is a traditional convnet that generates masks of the lungs overlapping the heart and large vasa. Module 2 is a hybrid convnet that preprocesses CXRs and corresponding lung masks by means of the Wavelet Scattering Transform, and passes the resulting feature maps through an Attention block and a cascade of Separable Atrous Multiscale Convolutional Residual blocks to produce a class assignment as Covid or non-Covid. Module 1 was trained on a public dataset of 6395 CXRs with radiologist-annotated lung contours. Module 2 was trained on a dataset of 2362 non-Covid and 1435 Covid CXRs acquired at the Henry Ford Health System Hospital in Detroit. Six distinct cross-validation models were combined into an ensemble model that was used to classify the CXR images of the test set. An intuitive graphic interface allows for rapid Covid vs. non-Covid classification of CXRs, and generates high-resolution heat maps that identify the affected lung regions. | radiology and imaging
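The entry does not state how the six cross-validation models were combined; a common choice, assumed here, is soft voting, i.e. averaging each model's predicted Covid probability per image and thresholding the mean:

```python
def ensemble_predict(prob_lists, threshold=0.5):
    """Soft-voting ensemble: average per-model Covid probabilities per image
    and threshold the mean to assign a class label."""
    labels = []
    for probs in zip(*prob_lists):        # one tuple of per-model scores per image
        mean_p = sum(probs) / len(probs)
        labels.append("Covid" if mean_p >= threshold else "non-Covid")
    return labels

# Six hypothetical cross-validation models scoring two CXR images:
six_models = [[0.90, 0.20], [0.80, 0.40], [0.70, 0.10],
              [0.95, 0.30], [0.60, 0.20], [0.85, 0.40]]
print(ensemble_predict(six_models))  # ['Covid', 'non-Covid']
```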
10.1101/2022.03.12.22272083 | Response to COVID-19 booster vaccinations in seronegative people with MS. | BackgroundUncertainties remain about the benefit of a 3rd COVID-19 vaccine for people with an attenuated response to earlier vaccines. This is of particular relevance for people with multiple sclerosis (pwMS) treated with anti-CD20 therapies and fingolimod, who have substantially reduced antibody responses to the initial vaccine course.
MethodsPwMS taking part in a seroprevalence study without a detectable IgG response following COVID-19 vaccines 1&2 were invited to participate. Participants provided a dried blood spot sample, with or without a venous blood sample, 2-12 weeks following COVID-19 vaccine 3. Humoral and T cell responses to SARS-CoV-2 spike protein and nucleocapsid antigen were measured. The relationship between evidence of prior COVID-19 infection and immune response to COVID-19 vaccine 3 was evaluated using Fisher's exact test.
ResultsOf 81 participants, 79 provided a dried blood spot sample, of whom 38 also provided a whole blood sample; 2 provided only whole blood. Anti-SARS-CoV-2-spike IgG seroconversion post-COVID-19 vaccine 3 occurred in 26/79 (33%) participants; 26/40 (65%) had positive T-cell responses. Overall, 31/40 (78%) demonstrated either humoral or cellular immune response post-COVID-19 vaccine 3. There was no association between laboratory evidence of prior COVID-19 infection and anti-spike seroconversion following COVID-19 vaccine 3.
ConclusionsApproximately one third of pwMS who were seronegative after initial COVID-19 vaccination seroconverted after booster (third) vaccination, supporting the use of boosters in this group. Almost 8 out of 10 had a measurable immune response following a 3rd COVID-19 vaccine.
Key messagesWhat is already knownThe benefits of COVID vaccination are well described. It is unknown whether there is additional benefit afforded by a third COVID-19 vaccination in those people who have failed to mount a serological response to their initial vaccine course.
What this study addsApproximately one third of people with MS in our study, all of whom had failed to respond to the initial vaccine course, developed anti-spike antibodies following a third COVID-19 vaccine. Two-thirds of participants had a T cell response to vaccination. No people taking fingolimod appeared to mount a T cell response to vaccination.
How this study might influence practiceThese findings highlight potential benefits of booster vaccinations for a substantial proportion of immunosuppressed people who have failed to respond to the initial vaccination course. The clinical correlates of antibody and T-cell responses to COVID-19 remain uncertain but they are almost certainly associated with milder subsequent disease in the general population. | neurology
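The Fisher's exact test used in this study can be sketched from first principles; the minimal version below computes only the one-sided ("greater") p-value from the hypergeometric null (the study likely used a standard two-sided implementation, and the example table is generic rather than the study's data):

```python
from math import comb

def fisher_exact_greater(a, b, c, d):
    """One-sided (greater) Fisher's exact p-value, P(X >= a), for the
    2x2 table [[a, b], [c, d]] under the hypergeometric null."""
    n = a + b + c + d
    row1, col1 = a + b, a + c
    denom = comb(n, col1)
    return sum(comb(row1, k) * comb(n - row1, col1 - k)
               for k in range(a, min(row1, col1) + 1)) / denom

print(round(fisher_exact_greater(3, 1, 1, 3), 4))  # 0.2429
```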
10.1101/2022.03.12.22271020 | PRState: Incorporating Genetic Ancestry in Prostate Cancer Risk scores for African American Men | Prostate cancer (PrCa) is one of the most genetically driven solid cancers, with heritability estimates as high as 57%. African American men are at an increased risk of PrCa; however, current risk prediction models are based on European ancestry groups and may not be broadly applicable. In this study, we define an African ancestry group of 4,533 individuals to develop an African ancestry-specific PrCa polygenic risk score (PRState). We identified risk loci on chromosomes 3, 8, and 11 in the African ancestry group GWAS and constructed a polygenic risk score (PRS) from 10 African ancestry-specific PrCa risk SNPs, achieving an AUC of 0.61 [0.60-0.63] alone and 0.65 [0.64-0.67] when combined with age and family history. Performance dropped significantly when using ancestry-mismatched PRS models but remained comparable when using trans-ancestry models. Importantly, we validated the PRState score in the Million Veteran Program, demonstrating improved prediction of PrCa and metastatic PrCa in African American individuals. This study underscores the need for inclusion of individuals of African ancestry in gene variant discovery to optimize PRS. | oncology
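At its core, a PRS analysis like PRState rests on two computations: a weighted sum of risk-allele dosages, and the AUC used to judge discrimination. The sketch below uses invented dosages and weights (the real PRState weights come from the African ancestry GWAS), with AUC computed via its Mann-Whitney interpretation:

```python
def prs(dosages, weights):
    """Polygenic risk score: weighted sum of risk-allele dosages (0, 1 or 2)."""
    return sum(d * w for d, w in zip(dosages, weights))

def auc(case_scores, control_scores):
    """AUC as the probability a random case outscores a random control
    (Mann-Whitney U / (n_cases * n_controls)), ties counted as half."""
    wins = sum((cs > ct) + 0.5 * (cs == ct)
               for cs in case_scores for ct in control_scores)
    return wins / (len(case_scores) * len(control_scores))

# Hypothetical weights (log odds ratios) for 10 risk SNPs and one person:
weights = [0.12, 0.08, 0.31, 0.05, 0.22, 0.15, 0.09, 0.27, 0.18, 0.11]
dosages = [0, 1, 2, 1, 0, 2, 1, 0, 1, 2]
score = prs(dosages, weights)
```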
10.1101/2022.03.12.22272306 | Homologous recombination deficiency (HRD) can predict the therapeutic outcomes of immuno-neoadjuvant therapy in NSCLC patients | BackgroundNeoadjuvant immunotherapy is emerging as novel effective intervention in lung cancer but the study to prioritize effective surrogates indicating its therapeutic outcomes is limited. We investigated the genetic changes between patients with distinct response to neoadjuvant immunotherapy in non-small-cell lung cancer (NSCLC) for the derive of biomarkers with indicative capability in predicting outcomes.
MethodsIn this study, 3 adenocarcinoma and 11 squamous cell carcinoma NSCLC patients were treated with neoadjuvant immunotherapy under varied regimens, followed by surgical resection. Pre-therapy FFPE or fresh tissues and blood samples were analyzed by whole-exome sequencing (WES). Genetic alteration comparisons were conducted between patients with different responses. Multiple public cohorts were selected for validation.
ResultsDNA damage repair (DDR)-related InDel signatures and DDR-related gene mutations were enriched in better-responding patients, i.e. the major pathological response (MPR) group. In addition, MPR patients exhibited provoked genome instability and unique homologous recombination deficiency (HRD) events. On further inspection of the alteration status of homologous recombination (HR) pathway genes, clonal alterations were exclusively enriched in the MPR group. Additionally, associations between HR gene alterations, percent of viable tumor cells and HRD events were identified, which together shaped tumor mutational burden (TMB), mutational intratumor heterogeneity (ITH), somatic copy number alteration (SCNA) ITH and clonal neoantigen load in patients. Validations in public cohorts further supported the generality of our findings.
ConclusionsWe associated HRD events with enhanced neoadjuvant immunotherapy response in lung cancer. The power of HRD events for therapeutic stratification of patients persisted across multiple public cohorts. We propose that inspection of HR pathway gene status could serve as a novel additional indicator to direct immuno-neoadjuvant and immunotherapy treatment decisions for NSCLC patients. | oncology
10.1101/2022.03.13.22271253 | Detection of COVID-19 dysosmia with paired crushable odorant ampules | BackgroundSigns of anosmia can help detect COVID-19 infection when testing for viral positivity is not available. Inexpensive mass-produced disposable olfactory sensitivity tests suitable for worldwide use might serve not only as a screening tool for potential infection but also to identify cases at elevated risk of severe disease as anosmic COVID-19 patients have a better prognosis.
Methods and FindingsWe adopted paired crushable ampules with two concentrations of a standard test odorant (n-butanol) as standard of care in several clinics as community prevalence of COVID-19 infection waxed and waned. This was not a clinical trial; a chart review was undertaken to evaluate the operating characteristics and potential utility of the test device as RT-PCR testing became routine. The risk of anosmia was greater in COVID-19 patients. Olfactory sensitivity was concentration-dependent and decreased with aging. Hyposmia was detected across a wider age range than expected from the literature, and tests can be optimized to characterize different age groups.
Conclusionsn-Butanol at 0.32 and 3.2% in crushable ampules can be used to characterize olfactory function quickly and inexpensively and thus has potential benefits in pandemic screening, epidemiology, and clinical decision-making. | otolaryngology |
10.1101/2022.03.13.22272308 | Duration of mRNA vaccine protection against SARS-CoV-2 Omicron BA.1 and BA.2 subvariants in Qatar | The SARS-CoV-2 Omicron (B.1.1.529) variant has two subvariants, BA.1 and BA.2, that are genetically quite divergent. We conducted a matched, test-negative, case-control study to estimate duration of protection of mRNA COVID-19 vaccines, after the second dose and after a third/booster dose, against BA.1 and BA.2 infections in Qatar's population. BNT162b2 effectiveness against symptomatic BA.1 infection was highest at 46.6% (95% CI: 33.4-57.2%) in the first three months after the second dose, but then declined to ~10% or below thereafter. Effectiveness rapidly rebounded to 59.9% (95% CI: 51.2-67.0%) in the first month after the booster dose, but then started to decline again. BNT162b2 effectiveness against symptomatic BA.2 infection was highest at 51.7% (95% CI: 43.2-58.9%) in the first three months after the second dose, but then declined to ~10% or below thereafter. Effectiveness rapidly rebounded to 43.7% (95% CI: 36.5-50.0%) in the first month after the booster dose, but then declined again. Effectiveness against COVID-19 hospitalization and death was in the range of 70-80% any time after the second dose, and was greater than 90% after the booster dose. Similar patterns of protection were observed for the mRNA-1273 vaccine. mRNA vaccines provide only moderate and short-lived protection against symptomatic Omicron infections, with no discernable differences in protection against either the BA.1 or BA.2 subvariants. Vaccine protection against COVID-19 hospitalization and death is strong and durable after the second dose, but is more robust after a booster dose. | epidemiology
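In a test-negative design like this one, effectiveness is estimated as one minus the odds ratio of vaccination among test-positive cases versus test-negative controls. A crude, unmatched sketch with hypothetical counts (the study itself used matching and adjustment):

```python
def vaccine_effectiveness(cases_vacc, cases_unvacc, controls_vacc, controls_unvacc):
    """VE = 1 - OR, where OR compares the odds of vaccination in
    test-positive cases vs test-negative controls."""
    odds_ratio = (cases_vacc / cases_unvacc) / (controls_vacc / controls_unvacc)
    return 1.0 - odds_ratio

# Hypothetical counts chosen to give VE around 46.6%:
ve = vaccine_effectiveness(534, 1000, 1000, 1000)
print(f"{ve:.1%}")  # 46.6%
```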
10.1101/2022.03.13.22272304 | Quantitative Trend Analysis of SARS-CoV-2 RNA in Municipal Wastewater Exemplified with Sewershed-Specific COVID-19 Clinical Case Counts | We present and demonstrate a quantitative statistical linear trend analysis (QTA) approach to analyze and interpret SARS-CoV-2 RNA wastewater surveillance results concurrently with clinical case data. This demonstration is based on the work completed under the Ontario (Canada) Wastewater Surveillance Initiative (WSI) by two laboratories in four large sewersheds within the Toronto Public Health (TPH) jurisdiction. The sewersheds were sampled over a 9-month period and data were uploaded to the Ontario Wastewater Surveillance Data and Visualization Hub (Ontario Dashboard) along with clinical case counts, both on a sewershed-specific basis. The data from the last 5 months, representing a range of high and low cases, were used for this demonstration. The QTA was conducted using a sewershed-specific approach based on the recommendations for public health interpretation and use of wastewater surveillance data by the United States Centers for Disease Control and Prevention (US CDC). The interpretation of the QTA results was based on the integration of both clinical and wastewater virus signals using an integration matrix in an interim draft guide by the Public Health Agency of Canada (PHAC). The key steps in the QTA consisted of (i) the calculation of Pepper Mild Mottle Virus (PMMoV), flow and flow-PMMoV-normalized virus loads; (ii) computation of the linear trends including interval estimation to identify the key inflection points using a segmented linear regression method and (iii) integrated interpretations based on consideration of both the cases and wastewater signals, as well as end user actionability.
This approach is considered a complementary tool to commonly used qualitative analyses of SARS-CoV-2 RNA in wastewater and is intended to directly support public health decisions using a systematic quantitative approach. | epidemiology |
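The segmented linear regression step described above can be sketched as a single-breakpoint search: fit ordinary least squares on each side of every candidate split and keep the split minimising total squared error (the WSI analysis also computed interval estimates around trends, omitted here):

```python
def ols_fit(xs, ys):
    """Slope and sum of squared errors of an ordinary least-squares line."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sxx
    intercept = my - slope * mx
    sse = sum((y - (intercept + slope * x)) ** 2 for x, y in zip(xs, ys))
    return slope, sse

def best_breakpoint(xs, ys, min_pts=3):
    """Single-breakpoint segmented regression: choose the split index that
    minimises the summed SSE of the two OLS segments."""
    best = None
    for i in range(min_pts, len(xs) - min_pts + 1):
        b1, s1 = ols_fit(xs[:i], ys[:i])
        b2, s2 = ols_fit(xs[i:], ys[i:])
        if best is None or s1 + s2 < best[1]:
            best = (i, s1 + s2, b1, b2)
    return best  # (split index, total SSE, slope before, slope after)

# Synthetic flat-then-rising load series; the breakpoint is recovered near index 6:
xs = list(range(12))
ys = [1.0] * 6 + [3.0, 5.0, 7.0, 9.0, 11.0, 13.0]
idx, total_sse, slope_before, slope_after = best_breakpoint(xs, ys)
```

In practice, ys would be the PMMoV- and flow-normalized viral loads, and the before/after slopes (with their confidence intervals) would be classified as increasing, decreasing, or plateauing.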
10.1101/2022.03.13.22272176 | Vaccination against SARS-CoV-2 in UK school-aged children and young people decreases infection rates and reduces COVID-19 symptoms | BackgroundWe aimed to explore the effectiveness of one-dose BNT162b2 vaccination upon SARS-CoV-2 infection, its effect on COVID-19 presentation, and post-vaccination symptoms in children and young people (CYP) in the UK during periods of Delta and Omicron variant predominance.
MethodsIn this prospective longitudinal cohort study, we analysed data from 115,775 CYP aged 12-17 years, proxy-reported through the Covid Symptom Study (CSS) smartphone application. We calculated post-vaccination infection risk after one dose of BNT162b2, and described the illness profile of CYP with post-vaccination SARS-CoV-2 infection, compared to unvaccinated CYP, and post-vaccination side-effects.
FindingsBetween August 5, 2021 and February 14, 2022, 25,971 UK CYP aged 12-17 years received one dose of BNT162b2 vaccine. Vaccination reduced (proxy-reported) infection risk (-80·4% and -53·7% at 14-30 days with Delta and Omicron variants respectively, and -61·5% and -63·7% after 61-90 days). The probability of remaining infection-free diverged soon after vaccination, and was greater in CYP with prior SARS-CoV-2 infection. Vaccinated CYP who contracted SARS-CoV-2 during the Delta period had milder disease than unvaccinated CYP; during the Omicron period this was only evident in children aged 12-15 years. Overall disease profile was similar in both vaccinated and unvaccinated CYP. Post-vaccination local side-effects were common, systemic side-effects were uncommon, and both resolved quickly.
InterpretationOne dose of BNT162b2 vaccine reduced risk of SARS-CoV-2 infection for at least 90 days in CYP aged 12-17 years. Vaccine protection varied for SARS-CoV-2 variant type (lower for Omicron than Delta variant), and was enhanced by pre-vaccination SARS-CoV-2 infection. Severity of COVID-19 presentation after vaccination was generally milder, although unvaccinated CYP also had generally mild disease. Overall, vaccination was well-tolerated.
FundingUK Government Department of Health and Social Care, Chronic Disease Research Foundation, The Wellcome Trust, UK Engineering and Physical Sciences Research Council, UK Research and Innovation London Medical Imaging & Artificial Intelligence Centre for Value Based Healthcare, UK National Institute for Health Research, UK Medical Research Council, British Heart Foundation and Alzheimer's Society, and ZOE Limited.
Research in context
Evidence before this studyWe searched PubMed database for peer-reviewed articles and medRxiv for preprint papers, published between January 1, 2021 and February 15, 2022 using keywords ("SARS-CoV-2" OR "COVID-19") AND (child* OR p?ediatric* OR teenager*) AND ("vaccin*" OR "immunization campaign") AND ("efficacy" OR "effectiveness" OR "symptoms") AND ("delta" or "omicron" OR "B.1.617.2" OR "B.1.1.529"). The PubMed search retrieved 36 studies, of which fewer than 30% specifically investigated individuals <18 years.
Eleven studies explored SARS-CoV-2 viral transmission: seroprevalence in children (n=4), including age-dependency of susceptibility to SARS-CoV-2 infection (n=1), SARS-CoV-2 transmission in schools (n=5), and the effect of school closure on viral transmission (n=1).
Eighteen documents reported clinical aspects, including manifestation of infection (n=13), symptomatology, disease duration, and severity in children. Other studies estimated emergency department visits, hospitalization, need for intensive care, and/or deaths in children (n=4), and explored prognostic factors (n=1).
Thirteen studies explored vaccination-related aspects, including vaccination of children within specific paediatric co-morbidity groups (e.g., children with Down syndrome, inflammatory bowel disease, and cancer survivors, n=4), mRNA vaccine efficacy in children and adolescents from the general population (n=7), and the relation between vaccination and severity of disease and hospitalization cases (n=2). Four clinical trials were conducted using mRNA vaccines in minors, also exploring side effects. Sixty percent of children were found to have side effects after BNT162b2 vaccination, and especially after the second dose; however, most symptoms were mild and transient apart from rare uncomplicated skin ulcers. Two studies focused on severe adverse effects and safety of SARS-CoV-2 vaccines in children, reporting on myocarditis episodes and two cases of Guillain-Barré syndrome. All other studies were beyond the scope of our research.
Added value of this studyWe assessed multiple components of the UK vaccination campaign in a cohort of children and young people (CYP) aged 12-17 years drawn from a large UK community-based citizen-science study, who received a first dose of BNT162b2 vaccine. We describe a variant-dependent protective effect of the first dose against both Delta and Omicron, with additional protective effect of pre-vaccination SARS-CoV-2 infection on post-vaccination re-infection. We compare the illness profile in CYP infected post-vaccination with that of unvaccinated CYP, demonstrating overall milder disease with fewer symptoms for vaccinated CYP. We describe local and systemic side-effects during the first week following first-dose vaccination, confirming that local symptoms are common, systemic symptoms uncommon, and both usually transient.
Implications of all the available evidenceOur data confirm that first-dose BNT162b2 vaccination in CYP reduces risk of infection by SARS-CoV-2 variants, with generally local and brief side-effects. If infected after vaccination, COVID-19 is milder, if manifest at all. The study aims to contribute quantitative evidence to the risk-benefit evaluation of vaccination in CYP, to inform discussion of the rationale for their vaccination and the design of national immunisation campaigns for this age group; and applies citizen-science approaches in the conduct of epidemiological surveillance and data collection in the UK community.
Importantly, this study was conducted during Delta and Omicron predominance in UK; specificity of vaccine efficacy to variants is also illustrated; and results may not be generalizable to future SARS-CoV-2 strains. | epidemiology |
10.1101/2022.03.12.22272300 | Did COVID-19 Vaccines Go to the Whitest Neighborhoods First? Racial Inequities in Six Million Phase 1 Doses Shipped to Pennsylvania | Research on racial disparities in COVID-19 vaccination rates has focused primarily on vaccine hesitancy. However, vaccine hesitancy research is increasingly unable to account for racial disparities in vaccination rates in the U.S., which have shrunk rapidly over the past year. This and other evidence suggests that inequities in vaccine allocation and access may have contributed to vaccination rate disparities in the U.S. But to our knowledge, no previously published research has examined whether the geographic distribution of COVID-19 vaccines has led to greater access for White Americans than for Black Americans.
Here, we link neighborhood-level data on vaccine allocation to data on racial demographics to show that in the first 17 weeks of Pennsylvania's COVID-19 vaccine rollout (Phase 1), White people were 25% more likely than Black people to live in neighborhoods (census tracts) that received vaccine shipments. In the 17 weeks of Pennsylvania's de jure restrictions on vaccine eligibility, de facto geographic restrictions on vaccine access disproportionately disadvantaged Black people and favored White people. In revealing these vaccine inequities, our work builds on prior work to develop a theory-driven, evidence-based, reproducible framework for studying racial inequities in the distribution of COVID-19 vaccines. | public and global health
10.1101/2022.03.13.22272309 | Effectiveness of Community-Based Rehabilitation (CBR) centres for improving physical fitness for community-dwelling older adults: a systematic review protocol | IntroductionThe increasing ageing population has become a substantial challenge for both health care and social services in many Asian countries. There is a high incidence of chronic diseases and comorbidities in older populations, leading to impairments and functional disability. Functional disability may result in loss of independence, reduced quality of life and increased care needs. Community-based rehabilitation (CBR) aims to promote equality of opportunity and improve the social inclusion of individuals living with disability. CBR also provides rehabilitation to improve physical, mental, and social outcomes. However, there is limited evidence regarding the effectiveness of CBR for improving older adults' physical fitness. The aim of this systematic review is to synthesise the evidence for the effectiveness of interventions delivered by CBR centres on physical fitness of community-dwelling older adults in Asian countries.
Methods and analysisA search of four English databases (CINAHL, Medline, Scopus and Proquest) and two Chinese databases (China National Knowledge Internet and Wanfang Database) will be conducted, from inception to 15 November 2021. Both English and Chinese publications will be included. Experimental and quasi-experimental studies using any type of control group will be included. The primary outcome is physical fitness (capacity to perform activities and tasks). Secondary outcomes are performance of activities of daily living and health-related quality of life. The quality of all included studies will be assessed using the Joanna Briggs Institute (JBI) standardised critical appraisal tools. Two reviewers will independently complete study screening, selection, quality appraisal, and data extraction. Quantitative data will, where possible, be pooled in statistical meta-analysis. All statistical analyses will be performed using Review Manager (RevMan) V.5.3 software.
Ethics and disseminationEthical approval is not required for this review. The findings of this review will be disseminated electronically through a peer-reviewed publication and conference presentations.
PROSPERO registration numberCRD42021292088
Strengths and limitationsFindings and evidence in this review will be summarised and graded using the Grading of Recommendations, Assessment, Development and Evaluation Pro (GRADEPro) approach.
A comprehensive literature search using both English and Chinese language databases will be conducted.
Studies included in the review may measure different outcomes, which may limit pooling in meta-analysis.
Differences in populations and interventions delivered in the included studies may result in high levels of heterogeneity, leading to less certainty about the recommendations from the review. | rehabilitation medicine and physical therapy
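Where pooling is feasible, the core computation RevMan performs is inverse-variance weighting. A minimal fixed-effect sketch (the review may equally use random-effects models, which add a between-study variance term):

```python
from math import sqrt

def pool_fixed_effect(estimates, std_errs):
    """Inverse-variance fixed-effect pooled estimate with a 95% CI."""
    weights = [1.0 / se ** 2 for se in std_errs]
    pooled = sum(w * e for w, e in zip(weights, estimates)) / sum(weights)
    se_pooled = sqrt(1.0 / sum(weights))
    return pooled, (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)

# Two hypothetical studies reporting mean differences with standard errors:
pooled, ci = pool_fixed_effect([2.0, 4.0], [1.0, 1.0])
```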
10.1101/2022.03.13.22272179 | Experience of Adults with Upper-limb Difference and their Views on Sensory Feedback for Prostheses: A Mixed Methods Study | AO_SCPLOWBSTRACTC_SCPLOWO_ST_ABSBackgroundC_ST_ABSUpper-limb prostheses are regularly abandoned, in part due to the mismatch between user needs and prostheses performance. Sensory feedback is among several technological advances that have been proposed to reduce device abandonment rates. While it has already been introduced in some high-end commercial prostheses, limited data is available about user expectations in relation to sensory feedback. The aim of this study is thus to use a mixed methods approach to provide a detailed insight of users perceptions and expectations of sensory feedback technology, to ensure the addition of sensory feedback is as acceptable, engaging and ultimately as useful as possible for users and, in turn, reduce the reliance on compensatory movements that lead to overuse syndrome.
MethodsThe study involved an online survey (N=37) and video call interviews (N=15) where adults with upper-limb differences were asked about their experience with limb difference and prosthesis use (if applicable) and their expectations about sensory feedback to prostheses. The survey data were analysed quantitatively and descriptively to establish the range of sensory feedback needs and their variations across the different demographics. Reflective thematic analysis was performed on the interview data, and data triangulation was used to understand key behavioural issues to generate actionable guiding principles for the development of sensory feedback systems.
ResultsThe survey provided a list of practical examples and suggestions that did not vary with the different causes of limb difference or prosthesis use. The interviews showed that although sensory feedback is a desired feature, it must prove to have more benefits than drawbacks. The key benefit mentioned by participants was increasing trust, which requires a highly reliable system that provides input from several areas of the hand rather than just the fingertips. The feedback system should also complement existing implicit feedback sources without causing confusion or discomfort. Further, the effect sensory feedback has on users' psychological wellbeing was highlighted as an important consideration that varies between individuals and should therefore be discussed. The results obtained were used to develop guiding principles for the design and implementation of sensory feedback systems.
ConclusionsThis study provided the first mixed-methods research on the sensory feedback needs of adults with upper-limb differences, enabling a deeper understanding of their expectations and worries. Guiding principles were developed based on the results of a survey and interviews to inform the development and assessment of sensory feedback for upper-limb prostheses. | rehabilitation medicine and physical therapy |