id stringlengths 16-27 | title stringlengths 18-339 | abstract stringlengths 95-38.7k | category stringlengths 7-44 |
---|---|---|---|
10.1101/2022.03.13.22272179 | Experience of Adults with Upper-limb Difference and their Views on Sensory Feedback for Prostheses: A Mixed Methods Study | BackgroundUpper-limb prostheses are regularly abandoned, in part due to the mismatch between user needs and prosthesis performance. Sensory feedback is among several technological advances that have been proposed to reduce device abandonment rates. While it has already been introduced in some high-end commercial prostheses, limited data are available about user expectations in relation to sensory feedback. The aim of this study is thus to use a mixed methods approach to provide detailed insight into users' perceptions and expectations of sensory feedback technology, to ensure the addition of sensory feedback is as acceptable, engaging and ultimately as useful as possible for users and, in turn, reduce the reliance on compensatory movements that lead to overuse syndrome.
MethodsThe study involved an online survey (N=37) and video call interviews (N=15) where adults with upper-limb differences were asked about their experience with limb difference and prosthesis use (if applicable) and their expectations about sensory feedback to prostheses. The survey data were analysed quantitatively and descriptively to establish the range of sensory feedback needs and their variations across the different demographics. Reflective thematic analysis was performed on the interview data, and data triangulation was used to understand key behavioural issues to generate actionable guiding principles for the development of sensory feedback systems.
ResultsThe survey provided a list of practical examples and suggestions that did not vary with the different causes of limb difference or prosthesis use. The interviews showed that although sensory feedback is a desired feature, it must prove to have more benefits than drawbacks. The key benefit mentioned by participants was increasing trust, which requires a highly reliable system that provides input from several areas of the hand rather than just the fingertips. The feedback system should also complement existing implicit feedback sources without causing confusion or discomfort. Further, the effect sensory feedback has on the users' psychological wellbeing was highlighted as an important consideration that varies between individuals and should therefore be discussed. The results obtained were used to develop guiding principles for the design and implementation of sensory feedback systems.
ConclusionsThis study provided the first mixed-methods research on the sensory feedback needs of adults with upper-limb differences, enabling a deeper understanding of their expectations and worries. Guiding principles were developed based on the results of a survey and interviews to inform the development and assessment of sensory feedback for upper-limb prostheses. | rehabilitation medicine and physical therapy |
10.1101/2022.03.10.22271304 | ACE2 gene expression and inflammatory conditions in periodontal microenvironment of COVID-19 patients with and without diabetes evaluated by qPCR | ObjectiveChronic periodontitis has been proposed to be linked to coronavirus disease (COVID-19) on the basis of its inflammation mechanism. We aimed to evaluate this association by investigating the expression of Angiotensin Converting Enzyme-2 (ACE2) in periodontal compartments, which contain dysbiosis-associated pathogenic bacteria, and how it can be directly or indirectly involved in exacerbating inflammation in periodontal tissue.
Material and MethodsThis observational clinical study included 23 adult hospitalized patients admitted to Universitas Indonesia Hospital with PCR-confirmed COVID-19, while 6 non-COVID-19 participants attending the periodontal clinic were included as controls. Using real-time PCR (qPCR) and gingival crevicular fluid (GCF) samples from COVID-19 patients with and without diabetes and periodontitis, we assessed the mRNA expression of angiotensin-converting enzyme 2 (ACE2), IL-6, IL-8, complement C3, and LL-37, as well as the relative proportions of Porphyromonas gingivalis, Fusobacterium nucleatum, and Veillonella parvula to represent the dysbiosis condition in the periodontal microenvironment. All analyses were performed to determine their relationship.
ResultsACE2 mRNA expression was detected in the GCF of periodontitis-COVID-19 patients with and without diabetes. However, only periodontitis-COVID-19 patients with diabetes showed a positive relationship between ACE2 expression and inflammatory conditions in the periodontal microenvironment. In addition, the interplay between pro-inflammatory cytokine (IL-6) and complement C3 could be used as a predictor of the severity of periodontal inflammation in COVID-19 patients with diabetes.
ConclusionThe study data show that the SARS-CoV-2 entry gene is expressed in the GCF of patients with COVID-19, and its expression correlates with inflammatory markers. | dentistry and oral medicine |
10.1101/2022.03.10.22272184 | Does a postpartum "Green Star" family planning decision aid for adolescent mothers reduce decisional conflict? : A quasi-experimental study | AimTo our knowledge, there are still no studies in Tanzania regarding decision aids on long-acting reversible contraception. We evaluated the effects of our postpartum "Green Star" family planning decision aid on decisional conflict, knowledge, satisfaction, and uptake of long-acting reversible contraception among pregnant adolescents in Tanzania.
MethodsWe used a facility-based quasi-experimental design with control. The participants were purposively recruited and randomly assigned (intervention, n = 33; control, n = 33). The intervention group received the routine family planning counseling and the decision aid. The control group received only the routine family planning counseling. The primary outcome was change in decisional conflict measured using the validated Decisional Conflict Scale (DCS). The secondary outcomes were knowledge, satisfaction, and contraception uptake. We hypothesized that pregnant adolescents who used the decision aid would have a lower DCS score.
ResultsWe recruited 66 pregnant adolescents; 62 completed the study. Participants in the intervention had a lower mean difference score in the DCS than participants in the control (intervention: -24.7 [SD 7.99] vs. control: -11.6 [SD 10.9], t = -5.53, p < 0.001). The mean difference score in knowledge was significantly higher in the intervention than in the control (intervention: 4.53 [SD 2.54] vs. control: 2.0 [SD 1.45], t = 4.88, p < 0.001). The mean score of satisfaction was significantly higher in the intervention than in the control (intervention: 100 [SD 0.0] vs. control: 55.8 [SD 30.7], t = 8.112, p < 0.001). Choice of contraception was significantly higher in the intervention [29 (45.3%)] than in the control [13 (20.3%)] (χ² = 17.73, p < 0.001).
ConclusionThe postpartum "Green Star" family planning decision aid was useful as it lowered decisional conflict, improved knowledge and satisfaction with decision making, and enhanced contraception uptake. The decision aid demonstrated positive applicability and affordability for pregnant adolescents in Tanzania. | sexual and reproductive health |
10.1101/2022.03.10.22272193 | Presence of Mediastinal Lymphadenopathy in Hospitalized Covid-19 Patients in a Tertiary Care Hospital in Pakistan - A cross-sectional study | BackgroundThe aim of this study was to investigate the presence of mediastinal lymphadenopathy in hospitalized Covid-19 patients in a tertiary care hospital in the metropolitan city of Lahore, Pakistan from September 2020 till July 2021.
MethodsWe retrospectively collected data on Covid-19 patients hospitalized from September 2020 till July 2021. Only those patients who tested PCR positive through a nasopharyngeal swab were enrolled in the study. Patients whose data were missing were excluded from this study. Our exclusion criteria included patients who tested negative on Covid-19 PCR, and patients with comorbidities that may cause mediastinal lymphadenopathy, such as haemophagocytic lymphohistiocytosis, neoplasia, tuberculosis, sarcoidosis or a systemic disease. The extent of lung involvement in Covid-19 patients was quantified by using a 25-point visual quantitative assessment called the Chest Computed Tomography Score. This score was then correlated with the presence of mediastinal lymphadenopathy.
FindingsOf the 210 hospitalized patients included in the study, 131 (62.4%) had mediastinal lymphadenopathy. The mean and median Severity Scores of Covid-19 patients with mediastinal lymphadenopathy (mean: 17.1, SD: 5.7; median: 17, IQR: 13-23) were higher than those of patients without mediastinal lymphadenopathy (mean: 12.3, SD: 5.4; median: 12, IQR: 9-16).
InterpretationOur study documents a high prevalence of mediastinal lymphadenopathy in hospitalized patients with Covid-19, with a higher severity score in its presence representing a more severe course of disease. | infectious diseases |
10.1101/2022.03.11.22272061 | PTPRG activates m6A methyltransferase VIRMA to block mitochondrial autophagy mediated neuronal death in Alzheimer's disease | In Alzheimer's disease (AD), neuronal death is one of the key pathologies. However, the initiation of neuronal death in AD is far from clear, and new targets are urgently needed to develop effective therapeutic methods. This study analyzed sequencing data from single-cell RNAseq and spatialomics, and revealed the impact of global single-cell mapping and cell spatial distribution relationships in the early stage of AD. We found that microglia subpopulation Mic_PTPRG can anchor neurons based on ligand-receptor interaction pairs and cause ectopic expression of PTPRG in neurons during AD progression. PTPRG in neurons can bind and upregulate VIRMA expression, which in turn increases the level of m6A methylation, enhances PRKN transcript degradation and represses translation. Repressed PRKN expression blocks the clearance of damaged mitochondria in neurons, which in turn reprograms neuronal energy and nutrient metabolic pathways and leads to neuronal death during AD progression. This study elucidates novel mechanisms by which the PTPRG-dependent microglia-synaptic modification may play a role in AD, providing a new scientific basis for potential therapeutic targets for AD. | neurology |
10.1101/2022.03.13.22272317 | A shared genetic signature for common chronic pain conditions and its impact on biopsychosocial traits | BackgroundThe multifactorial heterogeneous nature of chronic pain with its multiple comorbidities presents a formidable challenge in disentangling the aetiology underlying patient symptoms.
MethodsHere, we performed genome-wide association studies (GWAS) of eight types of regional chronic pain using UK Biobank data (N=4,037-79,089 cases; N=239,125 controls), followed by bivariate linkage disequilibrium-score regression and latent causal variable analyses to determine (respectively) their genetic correlations and genetic causal proportion (GCP) parameters with 1,492 other complex traits.
FindingsWe report evidence of a shared genetic signature in common regional chronic pain types, with their genetic correlations and causal directions being broadly consistent across a wide array of biopsychosocial traits. Furthermore, we identified 5,942 significant genetic correlations, of which 570 trait pairs showed evidence of a likely causal association (|GCP| > 0·6; 5% false discovery rate), including 488 traits contributing to chronic pain while 82 were affected by pain. A range of somatic pathologies (e.g., musculoskeletal, visceral), psychiatric factors (e.g., depression, trauma, anxiety, mania, bipolar disorder), socioeconomic factors (e.g., occupation) and other medical comorbidities (e.g., cardiovascular disease) contributed to an increased risk of regional chronic pain. Conversely, each chronic pain type contributed to various other traits such as increased medication use (e.g., analgesics) and risk of ischaemic heart disease and depression.
InterpretationFindings of this data-driven study, through comprehensive analysis of a vast range of biopsychosocial factors, are indicative of a common genetic signature underlying eight regional chronic pain types. We identify a broad range of traits with genetic causal effects upon chronic pain, which may inform development of novel diagnostic and therapeutic strategies, as well as provide convergent support for various existing approaches. | pain medicine |
10.1101/2022.03.12.22272299 | Integrative genomics analysis implicates decreased FGD6 expression underlying risk of intracranial aneurysm rupture | BackgroundThe genetic determinants and mechanisms underlying intracranial aneurysm rupture (rIA) are largely unknown. Given the ~50% mortality rate of rIA, approaches to identify patients at high risk will inform screening, diagnostic, and preventative measures.
ObjectiveOur goal was to identify and characterize the genetic basis of rIA.
MethodsWe perform a genome-wide association study (GWAS) and use functional genomics approaches to identify and characterize rIA-associated loci and genes. We perform a meta-analysis across 24 published GWAS of rIA. Single nucleotide polymorphism (SNP) analysis, gene-burden analysis, and functional genomics identify and characterize genetic risk factors for rIA.
ResultsOur cohort contains 84,353 individuals (7,843 rIA cases and 76,510 controls). We identify 5 independent genetic loci reaching genome-wide significance (p<5.0x10-8) for rIA including rs12310399 (FGD6, OR=1.16), which to our knowledge, has not been implicated in prior GWAS of rIA. We then quantified gene-level mutation-burden across [~]20,000 genes, and only FGD6 (containing 21 rIA-associated SNPs) reached transcriptome-wide significance. Expression quantitative trait loci (eQTL) mapping indicates that rs12310399 causes decreased FGD6 gene expression in arterial tissue. Next, we utilized publicly available single-cell RNA sequencing of normal human cerebrovascular cells obtained during resection surgery and identify high expression of FGD6 in 1 of 3 arterial lineages but absent in perivascular cells. These data suggest how alterations in FGD6 may confer risk to rIA.
ConclusionWe identify and characterize a previously unknown risk locus for rIA containing FGD6. Elucidation of high-risk genetic loci may instruct population-genetic screening and clinical-genetic testing strategies to identify patients predisposed to rIA.
FundingNo funding sources were used for the material presented herein. | genetic and genomic medicine |
10.1101/2022.03.13.22272312 | Why does purpose in life predict mortality in older adults? | BackgroundPrevious work documents a strong association between a higher sense of life purpose and lower all-cause mortality risk even when controlling for baseline health and proposes that life purpose intervention may provide a low-cost lever to improve health and longevity. Causation, however, is less clear--lower purpose may cause poorer health and decreased longevity, or poorer health may cause decreased longevity and lower purpose. We examine the extent to which (1) more comprehensive health metrics and (2) follow-up horizon mitigate or strengthen the relation between purpose and mortality risk, to better understand causation.
MethodsProspective cohort sample of 8 425 individuals aged 50 and older who were eligible to participate in the 2006 Health and Retirement Study Psychosocial and Lifestyle questionnaire. Individuals were followed for three subsequent four-year periods: 2006-2010, 2010-2014, and 2014-2018. A total of 1 597 individuals were excluded in the initial four-year period due to lack of follow-up, sample weights, or covariates, leaving an initial sample of 6 828 individuals. For the second and third four-year periods, an additional 168 and 349 respondents were lost to follow-up, respectively. Cox models were estimated to examine the relation between life purpose and mortality for three horizons (years 1-4, 5-8, and 9-12) with more comprehensive measures of current health. Covariates included age, sex, education, race, marital status, smoking status, exercise, alcohol, BMI, and functional health score.
FindingsThe relation between life purpose and mortality was substantially attenuated or disappeared at longer horizons or when using more comprehensive measures of current health.
InterpretationMuch of the documented relation between life purpose and longevity arises from poor health causing higher mortality risk and lower purpose (i.e., reverse causation). As a result, life purpose intervention is likely to be less effective than the previous evidence suggests.
FundingNone.
Research in contextEvidence before this studyWe searched PubMed and Google Scholar with no language or date restriction for the term "life purpose" and found four comprehensive reviews of the life purpose or psychological well-being (which included life purpose in the set of psychological well-being metrics) literatures in the last three years and a 2016 meta-analysis of the relation between life purpose and mortality. Although acknowledging it is possible that reverse causation plays a role in linking life purpose levels to subsequent morbidity and mortality, the prevalent view appears to be that even when controlling for current health levels, higher life purpose causes behavioral, biological, or stress buffering changes that, in turn, cause lower future morbidity and mortality.
Added value of this studyWe demonstrate that the relation between life purpose levels and mortality is substantially attenuated or eliminated when better controlling for current health or focusing on a longer horizon. Both results suggest that the relation between life purpose levels and future mortality risk primarily arises from life purpose proxying for current health levels. The evidence suggests poorer health causes lower life purpose rather than lower life purpose causing poorer health.
Implications of all the available evidenceAlthough life purpose intervention--either at the provider level or in public policy--may have benefits, it is unlikely to cause greater longevity. | health policy |
10.1101/2022.03.13.22272117 | Asexual stage synchronicity in symptomatic and asymptomatic falciparum malaria | Andrade et al. have reported that P. falciparum parasitised erythrocytes circulate for longer in persistent asymptomatic infections than in symptomatic malaria. This radical suggestion, attributed to in-vivo adaptation by the parasite population to reduced cytoadherence, is based largely on in-vivo transcriptomic data from 24 Malian children: 12 with acute falciparum malaria and 12 with asymptomatic parasitaemia. We show that the reported analysis generated erroneous results because of data formatting issues. We also show that the algorithm used to estimate the average asexual parasite developmental stage (hours post-invasion) from in-vivo transcriptomic data breaks down when applied to asynchronous infections. We argue that comparisons of asexual parasite developmental stage distributions between asymptomatic and symptomatic malaria are confounded by differences in synchronicity and gametocytaemia, and also by selection bias (because schizogony often precipitates clinical presentation). There is no convincing evidence of an adaptive delayed cytoadherence phenotype in chronic P. falciparum infections. | infectious diseases |
10.1101/2022.03.11.22272251 | Biomedical consequences of elevated cholesterol-containing lipoproteins and apolipoproteins | AimsTo provide a comprehensive evaluation of the biomedical effects of circulating concentrations of cholesterol-containing lipoproteins and apolipoproteins.
Methods and ResultsNuclear magnetic resonance (NMR) spectroscopy was used to measure the cholesterol content of high-density (HDL-C), very low-density (VLDL-C), intermediate-density (IDL-C), and low-density (LDL-C) lipoprotein fractions; apolipoproteins Apo-A1 and Apo-B; as well as total triglycerides (TG), remnant-cholesterol (Rem-chol) and total cholesterol (TC). The causal effects of these exposures were assessed against 33 cardiovascular as well as non-cardiovascular outcomes using two-sample univariable and multivariable Mendelian randomization (MVMR). We observed that most cholesterol-containing lipoproteins and apolipoproteins affected coronary heart disease (CHD), cIMT, carotid plaque, CRP and blood pressure. Through MVMR we showed that many of these exposures acted independently of the more commonly measured blood lipids: HDL-C, LDL-C and TG. We furthermore found that heart failure (HF) risk was increased by higher concentrations of TG, VLDL-C, Rem-Chol and Apo-B, often independently of LDL-C, HDL-C or TG. Finally, a smaller subset of these exposures could be robustly mapped to non-CVD traits such as Alzheimer's disease (HDL-C, LDL-C, IDL-C, Apo-B), type 2 diabetes (VLDL-C, IDL-C, LDL-C), and inflammatory bowel disease (LDL-C, IDL-C).
ConclusionThe cholesterol content of a wide range of lipoproteins and apolipoproteins affected measures of atherosclerosis and CHD, implicating subfractions beyond LDL-C. Many of the observed effects acted independently of LDL-C, HDL-C, and TG, supporting the potential for additional, non-LDL-C, avenues to disease prevention. | cardiovascular medicine |
10.1101/2022.03.11.22272265 | Predicted serotype distribution in invasive pneumococcal disease (IPD) among children less than five years prior to the introduction of the Pneumococcal Conjugate Vaccine (PCV) in Nigeria | BackgroundThe 10-valent pneumococcal conjugate vaccine (PCV10) was introduced in Nigeria without any baseline data on serotype distribution in invasive pneumococcal disease (IPD). To estimate the proportion of IPD attributable to different serotypes, in children aged <5 years, we used statistical models based on the serotype-specific nasopharyngeal carriage prevalence and invasive capacity (IC).
MethodsWe used carriage data from one urban and one rural setting in Nigeria, collected within five months of PCV10 introduction (2016). For Model A, we used serotype-specific adult case-fatality ratios from Denmark as a proxy for IC. For the other two models, we used the ratio of IPD proportions to carriage prevalence (case-carrier ratios) from Kenya (Model B) and the ratio of IPD incidence to carriage acquisition (attack rates) from the UK (Model C) as measures of serotype IC.
ResultsThe models predict that serotypes with high carriage prevalence (6A, 6B, 19F and 23F) will dominate IPD. Additionally, predictions from Models B and C emphasize serotypes 1, 4, 5, and 14, which were not prevalent in carriage but had high IC estimates. Non-PCV10 serotypes 6A and 19A also dominated IPD predictions across models and settings. The predicted proportion of IPD attributed to PCV10 serotypes varied between 56% and 74% by model and setting.
ConclusionCarriage data can provide preliminary insights into IPD serotypes in settings that lack robust IPD data. The predicted PCV10-serotype coverage for IPD was moderately high. However, predictions for non-PCV10 serotypes indicate that higher-valency PCVs that cover serotypes 6A and 19A may have a larger impact on IPD reductions. | epidemiology |
10.1101/2022.03.11.22272258 | International travel as a risk factor for carriage of extended-spectrum β-lactamase-producing Escherichia coli in a large sample of European individuals - The AWARE Study | Antibiotic resistance (AR) is currently a major threat to global health, calling for a One Health approach to be properly understood, monitored, tackled, and managed. Potential risk factors for AR are often studied in specific high-risk populations, but are still poorly understood in the general population. Our aim was to explore, describe, and characterize potential risk factors for carriage of extended-spectrum beta-lactamase-producing E. coli (ESBL-EC) in a large sample of European individuals aged between 16 and 67 years recruited from the general population in Southern Germany, the Netherlands, and Romania. Questionnaire and stool sample collection for this cross-sectional study took place from September 2018 to March 2020. Selective culture of participants' stool samples was performed for detection of ESBL-EC. A total of 1,183 participants were included in the analyses: 333 from Germany, 689 from the Netherlands, and 161 from Romania. Travel to Northern Africa (aOR 4.03, 95% CI 1.67-9.68), Sub-Saharan Africa (aOR 4.60, 95% CI 1.60-13.26), and Asia (aOR 4.08, 95% CI 1.97-8.43) were identified as independent risk factors for ESBL-EC carriage. Therefore, travel to these regions should continue to be routinely asked about as a risk factor in clinical practice when considering antibiotic therapy. | epidemiology |
10.1101/2022.03.10.22272188 | "I wanna live and not think about the future" What place for advance care planning for people living with severe multiple sclerosis and their families? A qualitative study | BackgroundLittle is known about how people with multiple sclerosis and their families comprehend advance care planning (ACP) and its relevance in their lives.
AimTo explore under what situations, with whom, how, and why people with MS and their families engage in ACP.
MethodsWe conducted a qualitative study comprising interviews with people living with MS and their families followed by an ethical discussion group with five health professionals representing specialties working with people affected by MS and their families. Twenty-seven people with MS and 17 family members were interviewed between June 2019 and March 2020. Interviews and the ethical discussion group were audio-recorded and transcribed verbatim. Data were analysed using the framework approach.
ResultsParticipants' narratives focused on three major themes: (i) planning for an uncertain future; (ii) perceived obstacles to engaging in ACP, which included uncertainty concerning MS disease progression, negative previous experiences of ACP discussions and prioritising symptom management over future planning; (iii) preferences for engagement in ACP, which included a trusting relationship with a health professional and that information then be shared across services. Health professionals' accounts from the ethical discussion group departed from viewing ACP as a formal document to that of an ongoing process of seeking preferences and values. They voiced similar concerns to people with MS about uncertainty and when to initiate discussions. Some shared concerns of a lack of confidence when having ACP discussions.
ConclusionThese findings support the need for a whole systems strategic approach where information about the potential benefits of ACP in all its forms can be shared with people with MS. Moreover, they highlight the need for health professionals to be skilled and trained in engaging in ACP discussions and where information is contemporaneously and seamlessly shared across services. | palliative medicine |
10.1101/2022.03.14.22272273 | Understanding adherence to self-isolation in the first phase of COVID-19 response | ObjectiveTo gain a better understanding of decisions around adherence to self-isolation advice during the first phase of the COVID-19 response in England.
DesignA mixed-methods cross sectional study.
Setting: England. Participants: COVID-19 cases and contacts who were contacted by Public Health England (PHE) during the first phase of the response in England (January-March 2020).
ResultsOf 250 respondents who were advised to self-isolate, 63% reported not leaving home at all during their isolation period, 20% reported leaving only for lower risk activities (dog walking or exercise) and 16% reported leaving for potentially higher-risk reasons (shopping, medical appointments, childcare, meeting family or friends). Factors associated with adherence to never going out included: the belief that following isolation advice would save lives, experiencing COVID-19 symptoms, being advised to stay in their room (rather than just "inside"), having help from outside and having regular contact by text message from PHE. Factors associated with non-adherence included being angry about the advice to isolate, being unable to get groceries delivered and concerns about losing touch with friends and family. Interviews highlighted that a sense of duty motivated people to adhere to isolation guidance and where people did leave their homes, these decisions were based on rational calculations of the risk of transmission - people would only leave their homes when they thought they were unlikely to come into contact with others.
ConclusionsMeasures of adherence should be nuanced to allow for the adaptations people make to their behaviour during isolation. Understanding adherence to isolation and associated reasoning during the early stages of the pandemic is an essential part of pandemic preparedness for future emerging infectious diseases.
Strengths and limitations of this studyOur participants were contacted directly by Public Health England during the first three months of the pandemic - the only cohort of cases and contacts who experienced self-isolation during this early phase of the pandemic.
Results may not be directly generalisable to wider populations or later phases of pandemic response.
We classified reasons for leaving the home as higher or lower contact, as a proxy for potential risk of transmission; however, further research published since we conducted our research has refined our understanding of transmission risk, highlighting the need for more in-depth research on adherence behaviour and transmission risk.
The mixed methods approach combined quantitative measures of adherence with an exploration of how and why these decisions were being made in the same people.
Our study provides unique insights into self-isolation during the earliest stages of the pandemic, against a background of uncertainty and lack of information that will recur, inevitably, in the face of future pandemics and similar threats. | public and global health |
10.1101/2022.03.10.22272238 | Nocturnal Respiratory Rate Dynamics Enable Early Recognition of Impending Hospitalizations | The days and weeks preceding hospitalization are poorly understood because they transpire before patients are seen in conventional clinical care settings. Home health sensors offer opportunities to learn signatures of impending hospitalizations and facilitate early interventions; however, the relevant biomarkers are unknown. Nocturnal respiratory rate (NRR) is an activity-independent biomarker that can be measured by adherence-independent sensors in the home bed. Here, we report automated longitudinal monitoring of NRR dynamics in a cohort of high-risk recently hospitalized patients using non-contact mechanical sensors under patients' home beds. Since the distribution of nocturnal respiratory rates in populations is not well defined, we first quantified it in 2,000 overnight sleep studies from the NHLBI Sleep Heart Health Study. This revealed that interpatient variability was significantly greater than intrapatient variability (NRR variances of 11.7 brpm² and 5.2 brpm² respectively, n=1,844,110 epochs), which motivated the use of patient-specific references when monitoring longitudinally. We then performed adherence-independent longitudinal monitoring in the home beds of 34 high-risk patients and collected raw waveforms (sampled at 80 Hz) and derived quantitative NRR statistics and dynamics across 3,403 patient-nights (n=4,326,167 epochs). We observed 23 hospitalizations for diverse causes (a 30-day hospitalization rate of 20%). Hospitalized patients had significantly greater NRR deviations from baseline compared to those who were not hospitalized (NRR variances of 3.78 brpm² and 0.84 brpm² respectively, n=2,920 nights). These deviations were concentrated prior to the clinical event, suggesting that NRR can identify impending hospitalizations. We analyzed alarm threshold tradeoffs and demonstrated that nominal values would detect 11 of the 23 clinical events while only alarming 2 times in non-hospitalized patients. Taken together, our data demonstrate that NRR dynamics change days to weeks in advance of hospitalizations, with longer prodromes associating with volume overload and heart failure, and shorter prodromes associating with acute infections (pneumonia, septic shock, and COVID-19), inflammation (diverticulitis), and GI bleeding. In summary, adherence-independent longitudinal NRR monitoring has potential to facilitate early recognition and management of pre-symptomatic disease. | respiratory medicine |
10.1101/2022.03.14.22272000 | HormoneBayes: a novel Bayesian framework for the analysis of pulsatile hormone dynamics. | The hypothalamus is the central regulator of reproductive hormone secretion. Pulsatile secretion of gonadotropin releasing hormone (GnRH) is fundamental to physiological stimulation of the pituitary gland to release luteinizing hormone (LH) and follicle stimulating hormone (FSH). Furthermore, GnRH pulsatility is altered in common reproductive disorders such as polycystic ovary syndrome (PCOS) and hypothalamic amenorrhea (HA). LH is measured routinely in clinical practice using an automated chemiluminescent immunoassay method and is the gold standard surrogate marker of GnRH. LH can be measured at frequent intervals (e.g., 10 minutely) to assess GnRH/LH pulsatility. However, this is rarely done in clinical practice because it is resource intensive, and there is no user-friendly and accessible method for computational analysis of the LH data available to clinicians. Here we present hormoneBayes, a novel open-access Bayesian framework that can be easily applied to reliably analyze serial LH measurements to assess LH pulsatility. The framework utilizes parsimonious models to simulate hypothalamic signals that drive LH dynamics, together with state-of-the-art (sequential) Monte-Carlo methods to infer key parameters and latent hypothalamic dynamics. We show that this method provides estimates for key pulse parameters including inter-pulse interval, secretion and clearance rates and identifies LH pulses in line with the current gold-standard deconvolution method. We show that these parameters can distinguish LH pulsatility in different clinical contexts including in reproductive health and disease in men and women (e.g., healthy men, healthy women before and after menopause, women with HA or PCOS). A further advantage of hormoneBayes is that our mathematical approach provides a quantified estimation of uncertainty. Our framework will complement methods enabling real-time in-vivo hormone monitoring and therefore has the potential to assist translation of personalized, data-driven, clinical care of patients presenting with conditions of reproductive hormone dysfunction. | sexual and reproductive health |
10.1101/2022.03.13.22272320 | Plasma p-tau181/Aβ1-42 ratio predicts Aβ-PET status and correlates with CSF-p-tau181/Aβ1-42 and future cognitive decline. | BackgroundIn Alzheimer's disease, plasma Aβ1-42 and p-tau predict high amyloid status from Aβ-PET; however, the extent to which the combination of both plasma assays predicts this remains unknown.
MethodsPrototype Simoa assays were used to measure plasma samples from cognitively normal (CN) and symptomatic adults in the Australian Imaging, Biomarkers and Lifestyle (AIBL) study.
ResultsThe p-tau181/Aβ1-42 ratio showed the best prediction of Aβ-PET across all participants (AUC=0.905, 95%CI:0.86-0.95) and in CN (AUC=0.873; 0.80-0.94), and symptomatic (AUC=0.908; 0.82-1.00) adults. Plasma p-tau181/Aβ1-42 ratio correlated with CSF-p-tau181 (Elecsys®, Spearman's ρ=0.74, P<0.0001) and predicted abnormal CSF Aβ (AUC=0.816, 0.74-0.89). The p-tau181/Aβ1-42 ratio also predicted future rates of cognitive decline assessed by AIBL PACC or CDR-SOB (P<0.0001).
DiscussionPlasma p-tau181/Aβ1-42 ratio predicted both Aβ-PET status and cognitive decline, demonstrating potential as both a diagnostic aid and as a screening and prognostic assay for preclinical Alzheimer's disease trials. | neurology |
10.1101/2022.03.14.22272286 | The French Early Breast Cancer Cohort (FRESH): a resource for breast cancer research and evaluations of oncology practices based on the French National Healthcare System Database (SNDS) | BackgroundBreast cancer (BC) is the most frequent cancer and the leading cause of cancer-related death in women. The French National Cancer Institute has created a national cancer cohort to promote cancer research and improve our understanding of cancer using the National Health Data System (SNDS). This cohort amalgamates all cancer sites, with no detailed separate data for early BC.
ObjectivesWe describe the French Early Breast Cancer Cohort (FRESH).
MethodsAll French women aged 18 years or over, with early-stage BC newly diagnosed between January 1, 2011 and December 31, 2017, treated by surgery and registered in the general health insurance coverage plan were included in the cohort. Patients with suspected locoregional or distant metastases at diagnosis were excluded. BC treatments (surgery, chemotherapy, targeted therapy, radiotherapy, endocrine therapy), and diagnostic procedures (biopsy, cytology, imaging) were extracted from hospital discharge reports, outpatient care notes or pharmacy drug delivery data. BC subtype was inferred from the treatments received.
ResultsWe included 235,368 patients with early BC in the cohort (median age: 60 years). BC subtype distribution was as follows: luminal (80.2%), triple-negative (TNBC, 9.5%), HER2+ (10.3%), or unidentifiable (n=44,388, 18.9% of the cohort). Most patients underwent radiotherapy (n=200,685, 85.3%) and endocrine therapy (n=165,655, 70.4%), and 38.3% (n=90,252) received chemotherapy. Treatments and care pathways are described.
ConclusionThe FRESH Cohort is an unprecedented population-based resource facilitating future large-scale real-life studies aiming to improve care pathways and quality of care for BC patients. | oncology |
10.1101/2022.03.11.22272206 | Genome-by-Trauma Interaction Effects in Depression | BackgroundSelf-reported trauma exposure has consistently been found to be a risk factor for Major Depressive Disorder (MDD) and several studies have reported interactions with genetic liability. To date, most studies have examined interaction effects with trauma exposure using genome-wide variants (single nucleotide polymorphisms SNPs) or polygenic scores, both typically capturing less than 3% of phenotypic risk variance. We sought to re-examine genome-by-trauma interaction effects using genetic measures utilising all available genotyped data and thus, maximising accounted variance.
MethodsMeasures of self-reported depression, neuroticism and trauma exposure for 148 129 participants with whole genome SNP data are available from the UK Biobank study. Here, we used a mixed-model statistical approach utilising genetic, trauma exposure and genome-by-trauma exposure interaction similarity matrices to explore sources of variation in depression and neuroticism.
FindingsOur approach estimated the heritability of MDD to be approximately 0·160 [SE 0·016]. Subtypes of self-reported trauma exposure (catastrophic, adult, childhood and full trauma) accounted for a significant proportion of the variance of each trait, ranging from 0·056 [SE 0·013] to 0·176 [SE 0·025]. The proportion of MDD risk variance accounted for by significant genome-by-trauma interaction ranged from 0·074 [SE 0·006] to 0·201 [SE 0·009]. Results from sex-specific analyses found genome-by-trauma interaction variance estimates approximately 5-fold greater for MDD in males than in females.
InterpretationThis is the first study to utilise an approach combining all genome-wide SNP data when exploring genome-by-trauma interaction effects in MDD and present evidence that interaction effects are influential in depression manifestation. This effect accounts for greater trait variance within males which points to potential differences in depression aetiology between the sexes. The methodology utilised in this study can be extrapolated to other environmental factors to identify modifiable risk environments and at-risk groups to target with interventions.
Research in contextEvidence before this studyWe searched PubMed up to January 30th 2022, with the following terms: ("gene environment interaction" OR "gene environment" OR "genome wide by environment" OR "GWEIS" OR "polygenic environment" OR ("gene" AND "environment")) AND ("polygenic risk score" OR "polygenic score" OR "genomic relationship matrix" OR "GRM") AND ("trauma" OR "environmental adversity" OR "stressful life events") AND ("depression" OR "major depressive disorder" OR "MDD" OR "depressive symptoms"). Date or language restrictions were not applied. We further reviewed the reference lists of identified articles. This search was supplemented by reviewing related articles identified by Google Scholar. We identified 12 relevant articles. Studies to date have not explored genome-by-environment interaction effects in depression using genomic similarity matrices; however, these effects have been explored using individual single nucleotide polymorphisms (SNPs) from genome-wide studies and polygenic scores (PGSs). Some findings suggest genome-by-environment interaction effects increase risk of depression. However, replication attempts have produced either inconsistent or null findings. Taken together, it is evident that findings have failed to provide consistent evidence of substantial interaction effects. These findings may be a result of limited statistical power in analyses due to genome-wide variants and PGSs failing to capture the polygenic nature of depression with sufficient precision.
Added value of this studyThis study is the first to explore genome-by-trauma interaction effects on MDD through the estimation of variance components using relationship matrices. Genomic relationship matrices (GRMs) utilise all available genotyped variants, thus, capturing a greater proportion of the trait variance and potentially providing greater power to detect genetic effects in comparison to PGSs. Additional relationship matrices capturing trauma exposure, and genome-by-trauma exposure similarity are computed and included into mixed linear models. We found evidence for substantial genome-by-trauma (including subtypes of trauma) exposure interaction effects on depression manifestation. Estimated genome-by-trauma interaction effects were larger in males than in females.
Implications of all the available evidenceOur findings are the first to show substantial genome-by-trauma effects on depression using whole genome methods. These findings highlight that the role of trauma exposure on depression manifestation may be non-additive and different between sexes. Exploring these effects in depth may yield important insight into various mechanisms, which may explain prevalence differences observed between males and females. Future work can build upon the framework we propose to explore genome-by-trauma interaction effects and the underlying molecular sites and mechanisms which are involved in depression manifestation. | genetic and genomic medicine |
10.1101/2022.03.11.22272278 | Effectiveness of gamified team competition in the context of mHealth intervention for medical interns: a micro-randomized trial | BackgroundTwin revolutions in wearable technologies and smartphone-delivered digital health interventions have significantly expanded the accessibility and uptake of personalized interventions in multiple domains of health sciences. Gamification, the application of gaming elements to increase enjoyment and engagement, has the potential to improve the effectiveness of digital health interventions. However, the effectiveness of competition gamification components remains poorly understood, challenging informed decisions on the potential adoption of these components in future studies and trial designs. We aimed to evaluate the effect of smartphone-based gamified team competition intervention on daily step count and sleep duration via a micro-randomized trial.
MethodsWe recruited first-year medical residents (interns) in the US, who downloaded the study app, provided consent, wore a wearable device, and completed a baseline survey. Teams were formed based on participating residents' institutions and specialties, and subsequently randomized weekly to the competition or non-competition arms. In the competition arm, opponent teams and competition type (step count or sleep duration) were also randomly selected. Competition participants had access to the current competition scoreboard and competition history via the study app; they also received scheduled competition-related push notifications in a competition week. We estimated the main and moderated causal effects of competition on proximal daily step count and sleep duration. This trial is registered with ClinicalTrials.gov (NCT05106439).
FindingsBetween April and June 2020, we enrolled 2,286 medical interns from 263 institutions, of whom 1,936 were formed into 191 teams that met the criteria for participation in competitions between July 6 and September 27, 2020. 1,797 participants who had pre-internship baseline information were included in the analysis. Relative to the no competition arm, competition intervention significantly increased the mean daily step count by 111·5 steps (SE 40·4, p=0·01), while competition did not significantly affect the mean daily sleep minutes (p=0·69). Secondary moderator analyses indicated that, for each additional week-in-study, the causal effects of competition on daily step count and sleep minutes decreased by 9·1 (11·6) steps (p=0·43) and 1·9 (0·6) minutes (p=0·003), respectively. Intra-institutional competition negatively moderated the causal effect of competition upon daily step count by -114·9 (93·7) steps (p=0·22).
InterpretationGamified competition delivered via mobile app significantly increased daily physical activity which suggests that team competition can function as a mobile health intervention tool to increase short-term physical activity level.
Research in contextEvidence before this studyWe searched PubMed for studies of mobile health intervention with gamified components: ("mobile health intervention", "mHealth intervention", "mobile health gamification"). We evaluated studies published before November 30, 2021. The search was not limited by language. Previous work affirmed that in mobile health interventions, gamification is effective for improving users' physical activity and mental health. Most previous work used feedback, rewards, and progress bars as game mechanics, while none has rigorously examined the effectiveness of gamified team competition.
Added value of this studyThis study provides evidence that the gamified team competition has a positive effect on physical activity. The data that was intensively collected as part of this study can be used for further investigation.
Implications of all the available evidenceThe results of this study indicate that gamified team competition has the potential to improve the effectiveness of and engagement with mobile health interventions. | health informatics |
10.1101/2022.03.12.22272175 | Bayesian Hierarchical Model for Dose Finding Trial Incorporating Historical Data | The Multiple Comparison Procedure and Modelling (MCPMod) approach has been shown to be a powerful statistical tool that can significantly improve the design and analysis of dose finding studies under model uncertainty. Due to its frequentist nature, however, it is difficult to incorporate into MCPMod information from historical trials on the same drug. Recently, a Bayesian version of MCPMod has been introduced by Fleischer et al. (2022) to resolve this issue, which is particularly tailored to the situation where there is information about the placebo dose group from historical trials. In practice, information may also be available on active dose groups from early phase trials, e.g., a dose escalation trial and a preceding small Proof of Concept trial with only a placebo and a high dose. To address this issue, we introduce a Bayesian hierarchical framework capable of incorporating historical information about both placebo and active dose groups with the flexibility of modelling both prognostic and predictive between-trial heterogeneity. Our method is particularly useful in the situation where the effect sizes of two trials are different. Our goal is to reduce the necessary sample size in the dose finding trial while maintaining its target power. | health informatics |
10.1101/2022.03.13.22272313 | Health shocks and changes in life purpose: Understanding the link between purpose and longevity | BackgroundThe negative correlation between life purpose levels and subsequent morbidity and mortality is interpreted as evidence that a higher sense of life purpose causes healthier and longer lives. Causation, however, could run the other direction as a decline in health is, by definition, associated with greater morbidity and mortality risk and may also cause a decline in life purpose. We examine the relation between objective measures of changes in health and changes in purpose to better understand the causal mechanisms linking purpose to health and mortality.
MethodsProspective cohort sample of 12 745 individuals aged 50 and older who were eligible to participate in the 2006, 2010, or 2014 Health and Retirement Study Psychosocial and Lifestyle questionnaire. The final sample consists of 15 034 observations measured over three four-year periods from 5 147 individuals. Controlling for standard covariates, we examined the relation between changes in purpose and 14 contemporaneous and subsequent objectively measured changes in health--lung function, grip strength, walking speed, balance, and physician diagnoses of hypertension, diabetes, cancer, lung disease, heart condition, stroke, psychiatric problem, arthritis, dementia, and Alzheimer's disease.
FindingsThere is strong evidence that negative health shocks cause a decline in life purpose as individuals who suffer a negative health shock experience a statistically meaningful contemporaneous decline in life purpose for 12 of the 14 changes in health metrics. In contrast, there is relatively weak evidence that a decline in purpose contributes to a deterioration of future health.
InterpretationMuch of the relation between life purpose levels and mortality risk arises from reverse causation--a decline in health causes both increased mortality risk and lower life purpose. There is little evidence that life purpose interventions would alter future morbidity or mortality.
FundingNone.
Research in contextEvidence before this studyWe searched PubMed and Google Scholar with no language or date restriction for the term "life purpose" and found four comprehensive reviews of the life purpose or psychological well-being (which included life purpose in the set of psychological well-being metrics) literatures in the last three years and a 2016 meta-analysis of the relation between life purpose and mortality. Although acknowledging the possibility that reverse causation plays a role in linking life purpose levels to subsequent morbidity and mortality, the prevalent view appears to be that even when controlling for current health levels, higher life purpose causes behavioral, biological, or stress buffering changes that, in turn, cause lower future morbidity and mortality.
Added value of this studyBy focusing on changes in health, changes in life purpose, and a longer horizon, we find strong evidence that changes in health cause changes in life purpose, but, contrary to the conclusions of most previous work, there is little evidence changes in life purpose cause changes in behavior, biology, or stress-buffering that, in turn, cause changes in future health.
Implications of all the available evidenceAlthough life purpose intervention--either at the provider level or in public policy--may have benefits, there is little evidence to suggest it will cause greater longevity or lower future illness. | health policy |
10.1101/2022.03.14.22272210 | HIV-1 infection induces functional reprogramming of female plasmacytoid dendritic cells associated with enhanced TLR7 expression | Plasmacytoid dendritic cells (pDCs) express TLR7, an ssRNA sensor encoded on the X chromosome, which escapes X chromosome inactivation (XCI) in females. pDCs are specialized in the production of type 1 interferons (IFN-I) through TLR7 activation, which mediates both immune cell activation and reactivation of latent HIV-1. The effect of HIV-1 infection in women under antiretroviral therapy (ART) on pDC functional responses remains poorly understood. Here, we show that pDCs from HIV/ART women exhibit exacerbated production of IFN-α and TNF-α as compared to uninfected controls (UC) upon TLR7 activation. Because TLR7 can escape XCI in female pDCs, we measured the contribution of TLR7 allelic expression using SNP haplotypic markers to rigorously tag the allele of origin of the TLR7 gene at single-cell resolution. Herein, we provide evidence that the functional reprogramming of pDCs in HIV/ART women is associated with enhanced transcriptional activity of the TLR7 locus from both X chromosomes, rather than differences in the frequency of TLR7 bi-allelic cells. These data reinforce the interest in targeting the HIV-1 reservoir using TLR7 agonists in women. | hiv aids
10.1101/2022.03.11.22272214 | C allele of rs479200 of the host EGLN1 gene - a risk factor for severe COVID-19 (pilot study) | BackgroundCoronavirus disease-2019 (COVID-19) symptoms can range from asymptomatic, moderate to severe manifestations that result in an overall global case fatality rate of 2-7%. While each variant has had its challenges, and some variants are more severe than others, risk factors for severe COVID-19 are still under investigation. In this context, the host genetic predisposition is also a crucial factor to investigate. In the present study, we investigated host genotypes of the SNP rs479200 of the host EGLN1 gene, previously implicated in high altitude pulmonary edema (HAPE), some of whose symptoms such as hypoxia profoundly overlap with severe COVID-19.
MethodsAfter informed consent, 158 RT-PCR confirmed COVID-19 patients (March 2020 to June 2021) were enrolled in the study. Based on their clinical manifestations, disease severity was categorized by the clinical team. Blood samples were drawn and DNA was extracted from the clot to infer different genotypes of the SNP rs479200 of the host EGLN1 gene. PCR-RFLP analysis of the SNP rs479200 (C > T) was performed with an amplicon size of 367 bp. Various genotypes (TT, TC and CC) were assigned based on the presence/absence of a restriction site (T/GTACA) for restriction enzyme BsrGI. Allele frequencies, Hardy-Weinberg Equilibrium (HWE) and multinomial logistic regression were performed using statistical tool SPSS version 23 (IBM).
FindingsWe observed that the severe COVID-19 category was composed of comparatively younger patients with a mean age of 34.9±15.6 years, compared to the asymptomatic and moderate categories, whose mean ages were 49.7±17.9 and 54.3±15.7 years, respectively. A preponderance of males and high heterozygosity (TC) were observed across the clinical categories. Notably, the frequency of the C allele (0.664) was 2-fold higher than the T allele (0.336) in severe COVID-19 patients, whereas the allele frequencies were similar in the asymptomatic and moderate categories of COVID-19 patients. Multinomial logistic regression showed an association of genotypes with increasing clinical severity; for the CC genotype in the severe category of COVID-19, the adjusted OR was 11.414 (2.564-50.812) and the unadjusted OR was 6.214 (1.84-20.99). Interestingly, the TC genotype was also positively associated with severe outcome (unadjusted OR 5.816 (1.489-22.709)), indicating that the C allele imparts risk of a severe outcome.
InterpretationThe study provides strong evidence that the presence of C allele of SNP (rs479200) of the EGLN1 gene associates with severity in COVID-19 patients. Thus, the presence of C allele may be a risk factor for COVID-19 severity. This study opens new avenues towards risk assessment that include EGLN1 (rs479200) genotype testing and identifying patients with C allele who might be prioritized for critical care. | infectious diseases |
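The analysis above reports allele frequencies, Hardy-Weinberg equilibrium, and genotype-severity associations for rs479200. The sketch below shows a generic allele-frequency and Hardy-Weinberg chi-square calculation from genotype counts; the counts are hypothetical and the function is not the authors' code.

```python
import numpy as np
from scipy.stats import chi2

def allele_freq_and_hwe(n_TT, n_TC, n_CC):
    """Allele frequencies and a 1-df chi-square test for Hardy-Weinberg equilibrium
    from genotype counts of a biallelic SNP (T/C)."""
    n = n_TT + n_TC + n_CC
    p_T = (2 * n_TT + n_TC) / (2 * n)
    p_C = 1.0 - p_T
    expected = np.array([p_T**2, 2 * p_T * p_C, p_C**2]) * n
    observed = np.array([n_TT, n_TC, n_CC])
    stat = ((observed - expected) ** 2 / expected).sum()
    p_value = chi2.sf(stat, df=1)   # 3 genotype classes - 1 - 1 estimated allele frequency
    return p_T, p_C, stat, p_value

# Hypothetical genotype counts for one clinical category.
print(allele_freq_and_hwe(n_TT=20, n_TC=40, n_CC=16))
```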
10.1101/2022.03.13.22272281 | Broad Cross-reactive IgA and IgG Against Human Coronaviruses in Milk Induced by COVID-19 Vaccination and Infection | It is currently unclear if SARS-CoV-2 infection or mRNA vaccination can also induce IgG and IgA against common human coronaviruses (HCoVs) in lactating parents. Here we prospectively analyzed human milk (HM) and blood samples from lactating parents to measure the temporal patterns of anti-SARS-CoV-2 specific and anti-HCoV cross-reactive IgA and IgG responses. Two cohorts were analyzed: a vaccination cohort (n=30) who received mRNA-based vaccines for COVID-19 (mRNA-1273 or BNT162b2), and an infection cohort (n=45) with COVID-19 disease. Longitudinal HM and fingerstick blood samples were collected pre- and post-vaccination or, for infected subjects, at 5 time-points 14-28 days after confirmed diagnosis. The anti-spike (S) and anti-nucleocapsid (N) IgA and IgG antibody levels against SARS-CoV-2 and HCoVs were measured by multiplex immunoassay (mPlex-CoV). We found that vaccination significantly increased the anti-S IgA and IgG levels in HM. In contrast, while IgG levels increased after a second vaccine dose, blood and HM IgA started to decrease. Moreover, HM and blood anti-S IgG levels were significantly correlated, but anti-S IgA levels were not. Acute SARS-CoV-2 infection elicited anti-S IgG and IgA that showed much higher correlations between HM and blood compared to vaccination. Vaccination and infection increased the broadly cross-reactive IgG recognizing HCoVs in HM and blood to a greater extent than the corresponding IgA antibodies. In addition, the broader cross-reactivity of IgG in HM versus blood indicates that COVID-19 vaccination and infection might provide passive immunity through HM for breastfed infants not only against SARS-CoV-2 but also against common cold coronaviruses. | infectious diseases
IMPORTANCEIt is unknown if COVID-19 mRNA vaccination and infection in lactating mothers results in cross-reactive antibodies against other common human coronaviruses. Our study demonstrates that mRNA vaccination and COVID-19 infection increase anti-spike SARS-CoV-2 IgA and IgG in both blood and milk. IgA and IgG antibody concentrations in milk were more tightly correlated with concentrations in blood after infection compared to mRNA vaccination. Notably, both infection and vaccination resulted in increased IgG against common seasonal β-coronaviruses. This suggests that SARS-CoV-2 vaccination or infection in a lactating parent may result in passive immunity against SARS-CoV-2 and seasonal coronaviruses for the recipient infant. | infectious diseases
10.1101/2022.03.11.22272208 | Monocytes are the main source of STING-mediated IFN-alpha production | BackgroundType I interferon (IFN-I) production by plasmacytoid dendritic cells (pDCs) occurs during viral infection, in response to Toll-like receptor 7 (TLR7) stimulation, and is more vigorous in females than in males. Whether this sex bias persists in ageing people is currently unknown. In this study, we investigated the effect of sex and aging on IFN-α production induced by PRR agonist ligands.
MethodsIn a large cohort of individuals from 19 to 97 years old, we measured the production of IFN-α and inflammatory cytokines in whole-blood upon stimulation with either R-848, ODN M362 CpG-C, or cGAMP, which activate the TLR7/8, TLR9 or STING pathways, respectively. We further characterized the cellular sources of IFN-α.
FindingsWe observed a female predominance in IFN-α production by pDCs in response to TLR7 or TLR9 ligands. The higher TLR7-driven IFN-α production in females was robustly maintained across ages, including the elderly. The sex-bias in TLR9-driven interferon production was lost after age 60, which correlated with the decline in circulating pDCs. By contrast, STING-driven IFN-α production was similar in both sexes, preserved with aging, and correlated with circulating monocyte numbers. Indeed, monocytes were the primary cellular source of IFN-α in response to cGAMP.
InterpretationWe show that the sex bias in the TLR7-induced IFN-I production is strongly maintained through ages, and identify monocytes as the main source of IFN-I production via STING pathway.
FundingThis work was supported by grants from Region Occitanie/Pyrenees-Mediterranee (#12052910, Inspire Program #1901175), University Paul Sabatier, and the European Regional Development Fund (MP0022856).
Research in context. Evidence before this study: Type I interferon (IFN-I) production by plasmacytoid dendritic cells (pDCs) occurs during infection with viruses, including SARS-CoV-2, in response to Toll-like receptor 7 (TLR7) stimulation. Early type I IFN production by pDCs in the respiratory tract through TLR7 activation is protective in severe COVID-19. The capacity of female pDCs to produce higher levels of interferon-α (IFN-α) in response to TLR7 ligands, compared to those of males, is one immune characteristic that robustly distinguishes the two sexes in middle-aged adults. It is currently unknown whether the superior ability of female pDCs to produce IFN-I upon TLR7 stimulation is maintained with age. In this study, we investigated the impact of sex and aging on the release of innate cytokines (IFN-α, IFN-γ, IL-1β, IL-6, IL-8, TNF-α, MCP-1) in a whole-blood assay from 310 healthy volunteers (145 males and 165 females) from 19 to 97 years old, upon stimulation with either TLR7 or TLR9 ligands or with cGAMP, the natural product of cGAS, which activates STING (Stimulator of Interferon Genes) and has been reported to exhibit potent anti-tumor and adjuvant effects through induction of IFN-I by ill-defined cellular sources.
Added value of this studyWe observed that IFN-α responses to TLR7 and TLR9 ligands were the only whole blood assay variables exhibiting sex differences among all 21 variables investigated (seven analytes analyzed after stimulation by three different ligands). Our results show that the accrued female response in the TLR7-induced IFN-α production was robustly maintained over ages, including elderly subjects >80. In contrast, STING-induced IFN-I production was similar in both sexes and was maintained with aging possibly as a consequence of the age-related increase in circulating monocyte numbers. Indeed, we demonstrate for the first time that monocytes represent the main cellular source of IFN-I upon cGAMP stimulation of PBMCs.
Implications of all the available evidenceThis study demonstrates that the heightened TLR7 ligand-induced IFN-α secretion by blood pDCs from females, compared to those from males, is maintained in elderly women, supporting the hypothesis that this pathway could contribute to enhanced protection against virus infections such as SARS-CoV-2 in females. This work also shows that cGAMP can promote IFN-I production by targeting monocytes, whose numbers increase with aging, suggesting that STING ligands may be useful for vaccine design in the elderly in both sexes. | allergy and immunology
10.1101/2022.03.11.22272280 | Association of left atrial structure and function with cognitive function among adults with metabolic syndrome | BackgroundAtrial fibrillation has been associated with cognitive impairment. Whether subclinical abnormalities in atrial function and substrate predisposing to atrial fibrillation impact cognitive function has received limited attention.
MethodsWe tested associations of echocardiographic markers of atrial structure and function with cognitive functioning and decline among 510 participants with overweight/obesity and metabolic syndrome (mean age [standard deviation] of 64.4 [5.2] years in men and 66.5 [3.9] in women). Left atrial markers (volume index, emptying fraction, strain, function index, and stiffness index) were estimated based on transthoracic echocardiography. General cognitive functioning (Mini-mental examination), verbal ability (verbal fluency test), memory and attention (Digit Span Tests), and processing speed and executive function (Trail-Making Tests A and B) were assessed at baseline and at the two-year follow-up. Multiple linear regression was used to test associations of atrial markers (modeled in standard deviation units) with baseline and two-year changes in cognitive scores adjusting for demographic and health covariates.
ResultsLeft atrial structure and function were not associated with cognitive function at baseline. Larger left atrial volume index (standardized β [95% confidence interval] = -0.13 [-0.22, -0.03]), lower peak longitudinal strain (-0.11 [-0.20, -0.01]), and higher stiffness index (-0.18 [-0.28, -0.08]) were associated with 2-year worsening in Trail-Making Test A. Strain measurements were also associated with 2-year change in the Controlled Oral Word Association Test.
ConclusionOverall, adverse markers of left atrial structure and function were associated with 2-year detrimental executive function-related cognitive changes in a sample of participants at high risk for cardiovascular disease, highlighting left atrial substrate as a potential risk factor for cognitive decline and dementia. | cardiovascular medicine |
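The atrial markers above are modelled in standard deviation units. One minimal way to obtain such standardized coefficients is to z-score the predictor before an ordinary least-squares fit, as sketched below with simulated data (variable names and values are hypothetical, not the study data).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
age = rng.normal(65, 5, n)
la_volume_index = rng.normal(30, 6, n)        # hypothetical left atrial marker
# Simulated 2-year change in a cognitive score, loosely depending on both variables.
cog_change = -0.1 * (la_volume_index - 30) / 6 + 0.01 * (65 - age) + rng.normal(0, 1, n)

def zscore(x):
    return (x - x.mean()) / x.std(ddof=1)

# Design matrix: intercept, z-scored atrial marker, covariate (age).
X = np.column_stack([np.ones(n), zscore(la_volume_index), age])
beta, *_ = np.linalg.lstsq(X, cog_change, rcond=None)
print(f"standardized beta for the atrial marker: {beta[1]:.3f}")
```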
10.1101/2022.03.13.22272267 | Lower Risk of Multisystem Inflammatory Syndrome in Children (MIS-C) with the Delta and Omicron variants of SARS-CoV-2 | Paediatric Inflammatory Multisystem Syndrome (PIMS-TS, also known as MIS-C) typically occurs 2-6 weeks after exposure to SARS-CoV-2. Early estimates suggested a risk of PIMS-TS of 1 in 3-4000 infected children. Whether this risk is sustained with new SARS-CoV-2 variants remains unknown.
We utilised prospective data from the NHS South Thames Paediatric Network (STPN), which manages all cases of PIMS-TS amongst 1.5 million children in South-East England, to assess trends over time. We compared PIMS-TS cases with two independent SARS-CoV-2 infection datasets. We used publicly available UK Health Security Agency case numbers weighted to child population distributions according to area population estimates from the Office for National Statistics (ONS). To avoid bias due to evolving testing behaviour, we also compared PIMS-TS cases to community infection rates, obtained from the ONS COVID-19 Infection Survey, which randomly selects individuals for fortnightly PCR tests. All three datasets were normalised to the peak of the Alpha wave, and plotted against time. PIMS-TS cases were plotted 40 days prior to hospitalisation, corresponding to the best fit of rising SARS-CoV-2 infection and PIMS-TS cases during the Alpha wave.
Compared with the Alpha wave, we found fewer cases of PIMS-TS relative to SARS-CoV-2 infections during both initial and subsequent Delta waves. This relative reduction continued into the Omicron wave.
Re-infection rates with the Alpha or Delta variants and vaccination rates were very low during the Delta wave. As a result, lower PIMS-TS rate relative to SARS-CoV-2 infections during the Delta wave is unlikely to be explained by population level immunity from prior infection or vaccination. It is most likely due to viral mutations in key antigenic epitopes responsible for triggering the hyperinflammatory response observed with PIMS-TS. | pediatrics |
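The comparison above normalises each data series to its Alpha-wave peak and shifts PIMS-TS admissions back by 40 days. The sketch below illustrates that kind of alignment with synthetic daily counts; only the 40-day lag is taken from the abstract, everything else is assumed.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
dates = pd.date_range("2020-12-01", periods=200, freq="D")
t = np.arange(200)
# Synthetic daily series: infections peak first, PIMS-TS admissions peak ~40 days later.
infections = pd.Series(rng.poisson(1000 * np.exp(-((t - 40) / 25) ** 2)), index=dates)
pims_admissions = pd.Series(rng.poisson(5 * np.exp(-((t - 80) / 25) ** 2)), index=dates)

alpha_window = slice("2020-12-01", "2021-03-01")   # assumed Alpha-wave window

def normalise_to_alpha_peak(series):
    return series / series.loc[alpha_window].max()

# Shift PIMS-TS admissions back 40 days so they line up with the triggering infections.
aligned = pd.DataFrame({
    "infections": normalise_to_alpha_peak(infections),
    "pims_ts": normalise_to_alpha_peak(pims_admissions).shift(-40, freq="D"),
})
print(aligned.dropna().head())
```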
10.1101/2022.03.13.22272048 | Unplanned ICU transfer from post-anesthesia care unit following cerebral surgery: A retrospective study | BackgroundUnplanned transfers to the intensive care unit (ICU) lead to reduced trust of patients and their families in medical staff and challenge medical staff to allocate scarce ICU resources. This study aimed to explore the incidence and risk factors of unplanned transfer to the ICU during emergence from general anesthesia after cerebral surgery, and to provide guidelines for preventing unplanned transfer from the post-anesthesia care unit (PACU) to the ICU following cerebral surgery.
MethodsThis was a retrospective case-control study and included patients with unplanned transfer from PACU to ICU following cerebral surgery between January 2016 and December 2020. The control group comprised patients matched (2:1) for age ({+/-}5 years), sex, and operation date ({+/-}48 hours) as those in the case group. Stata14.0 was used for statistical analysis, and p <0.05 indicated statistical significance.
ResultsA total of 11,807 patients were cared for in the PACU following cerebral surgery during the study period. Of these 11,807 operations, 81 unscheduled ICU transfers occurred (0.686%). Finally, 76 patients were included in the case group, and 152 in the control group. The following factors were identified as independent risk factors for unplanned ICU admission after neurosurgery: low mean blood oxygen (OR=1.57, 95%CI: 1.20-2.04), low mean albumin (OR=1.14, 95%CI: 1.03-1.25), slow mean heart rate (OR=1.04, 95%CI: 1.00-1.08), blood transfusion (OR=2.78, 95%CI: 1.02-7.58), emergency surgery (OR=3.08, 95%CI: 1.07-8.87), lung disease (OR=2.64, 95%CI: 1.06-6.60), and high mean blood glucose (OR=1.71, 95%CI: 1.21-2.41).
ConclusionWe identified independent risk factors for unplanned transfer from PACU to ICU after cerebral surgery based on electronic medical records. Early identification of patients who may undergo unplanned ICU transfer after cerebral surgery is important to provide guidance for accurately implementing a patients level of care. | primary care research |
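The study above reports adjusted odds ratios from a matched case-control analysis. As a simplified illustration (an unconditional logistic regression on simulated data, not the authors' matched analysis), the sketch below fits a logistic model and converts coefficients to odds ratios with confidence intervals.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 228                                  # 76 cases + 152 controls, as in the record above
emergency_surgery = rng.binomial(1, 0.2, n)
mean_spo2 = rng.normal(97.0, 2.0, n)
# Simulated outcome: unplanned ICU transfer, more likely with emergency surgery / low SpO2.
logit = -0.8 + 1.0 * emergency_surgery - 0.4 * (mean_spo2 - 97.0)
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([emergency_surgery, mean_spo2]))
fit = sm.Logit(y, X).fit(disp=False)
odds_ratios = np.exp(fit.params[1:])     # skip the intercept
ci = np.exp(fit.conf_int()[1:])
print("ORs (emergency surgery, mean SpO2):", odds_ratios)
print("95% CIs:", ci)
```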
10.1101/2022.03.11.22272270 | Performing moderate to severe activity is safe and tolerable for healthy youth while wearing a cloth facemask | ObjectiveTo investigate if wearing a cloth facemask could affect physiological and perceptual responses to exercise at distinct exercise intensities in healthy young individuals.
MethodsIn a crossover design, 9 participants (sex, female/male: 6/3; age: 13±1 years; BMI: 18.4±2.1 kg/m2; sexual maturity rating, I/II/III/IV: 0/3/4/2; VO2peak: 44.5±5.5 mL/kg/min) underwent a progressive square-wave test at four intensities: (1) at 80% of the ventilatory anaerobic threshold (VAT), (2) at VAT, and (3) at 40% between the VAT and VO2peak, wearing a triple-layered cloth facemask or not. These stages represented moderate, heavy, and very heavy domains and corresponded to 46±8%, 57±10% and 87±8% of VO2peak. Participants then completed a final stage (severe) to exhaustion at a running speed equivalent to the maximum achieved during the cardio-respiratory exercise test (Peak). Physiological, metabolic, and perceptual measures were analysed.
ResultsThe mask did not affect spirometry (forced vital capacity [FVC], peak expiratory flow [PEF], forced expiratory volume in one second [FEV1]; all p > 0.27; Figure 1), respiratory (inspiratory capacity [IC], end-expiratory lung volume to forced vital capacity ratio [EELV/FVC], EELV, respiratory frequency [Rf], tidal volume [VT], Rf/VT, end-tidal carbon dioxide pressure [PetCO2], ventilatory equivalent for carbon dioxide [VE/VCO2]; all p > 0.196), hemodynamic (heart rate [HR], systolic and diastolic blood pressure [SBP; DBP]; all p > 0.41), rated perceived exertion (RPE; p = 0.04) or metabolic measures (lactate; p = 0.78; Figure 2) at rest or at any exercise intensity. In both conditions, the same number of children (4 out of 9) were unable to finish the Peak stage, whereas one child did not complete exercise at the heavy domain while wearing no mask.
Figure 1. Spirometry data at rest with or without wearing a facemask. All data are expressed as mean and standard deviations. No significant differences between conditions were found.
Figure 2. Ratings of perceived exertion (Panel A) and blood lactate concentration (Panel B) during a progressive square-wave test with or without wearing a facemask. Data expressed as means and standard deviations and individual data.
ConclusionsThis study shows that performing moderate to severe activity is safe and tolerable for healthy youth while wearing a cloth facemask. ClinicalTrials.gov: NCT04887714 | sports medicine
10.1101/2022.03.14.22272340 | Modifiable lifestyle activities affect cognition in cognitively healthy middle-aged individuals at risk for late-life Alzheimer's Disease. | It is now acknowledged that Alzheimers Disease (AD) processes are present decades before the onset of clinical symptoms, but it remains unknown whether lifestyle factors can protect against these early AD processes in mid-life. We asked whether modifiable lifestyle activities impact cognition in middle-aged individuals who are cognitively healthy, but at risk for late life AD. Participants (40-59 years) completed cognitive and clinical assessments at baseline (N = 210) and two years follow-up (N = 188). Mid-life activities were measured with the Lifetime of Experiences Questionnaire. We assessed the impact of lifestyle activities, known risk factors for sporadic late-onset AD (Apolipoprotein E ε4 allele status, family history of dementia, and the Cardiovascular Risk Factors Aging and Dementia score), and their interactions on cognition. More frequent engagement in physically, socially and intellectually stimulating activities was associated with better cognition (verbal, spatial and relational memory), at baseline and follow-up. Critically, more frequent engagement in these activities was associated with stronger cognition (verbal and visuospatial functions, and conjunctive short-term memory binding) in individuals with family history of dementia. Impaired visuospatial function is one of the earliest cognitive deficits in AD and has previously been associated with increased AD risk in this cohort. Additionally, conjunctive memory functions have been found impaired in the pre-symptomatic stages of AD. These findings suggest that modifiable lifestyle activities offset cognitive decrements due to AD risk in mid-life and support the targeting of modifiable lifestyle activities for the prevention of Alzheimers Disease. | public and global health
10.1101/2022.03.14.22272134 | Acidity of expiratory aerosols controls the infectivity of airborne influenza virus and SARS-CoV-2 | Enveloped viruses are prone to inactivation when exposed to strong acidity levels characteristic of atmospheric aerosol. Yet, the acidity of expiratory aerosol particles and its effect on airborne virus persistence has not been examined. Here, we combine pH-dependent inactivation rates of influenza A virus and SARS-CoV-2 with microphysical properties of respiratory fluids under indoor conditions using a biophysical aerosol model. We find that particles exhaled into indoor air become mildly acidic (pH ≈ 4), rapidly inactivating influenza A virus within minutes, whereas SARS-CoV-2 requires days. If indoor air is enriched with non-hazardous levels of nitric acid, aerosol pH drops by up to 2 units, decreasing 99%-inactivation times for both viruses in small aerosol particles to below 30 seconds. Conversely, unintentional removal of volatile acids from indoor air by filtration may elevate pH and prolong airborne virus persistence. The overlooked role of aerosol pH has profound implications for virus transmission and mitigation strategies.
One Sentence SummaryRespiratory viruses are sensitive to aerosol pH, an unidentified factor in the mitigation of airborne virus transmission | infectious diseases |
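The inactivation times above follow from pH-dependent first-order decay of infectious virus. The sketch below computes a 99%-inactivation time from a first-order rate constant; the rate constants used are hypothetical placeholders, not the values measured in the study.

```python
import numpy as np

def time_to_99pct_inactivation(k_per_min):
    """First-order decay N(t) = N0 * exp(-k t); 99% inactivation when N/N0 = 0.01."""
    return np.log(100.0) / k_per_min

# Hypothetical pH-dependent inactivation rate constants (per minute); not the measured values.
k_by_ph = {4.0: 1.5, 5.0: 0.15, 6.0: 0.015}
for ph, k in k_by_ph.items():
    print(f"pH {ph}: 99% inactivation in {time_to_99pct_inactivation(k):.1f} min")
```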
10.1101/2022.03.14.22272334 | Obesity, hypertension and tobacco use associated with left ventricular remodelling and hypertrophy in South African Women: Birth to Twenty Plus Cohort | BackgroundLeft ventricular hypertrophy (LVH) is a known marker of increased risk of developing future life-threatening CVD, though it is unclear how health risk factors, such as obesity, blood pressure and tobacco use, associate with left ventricular (LV) remodelling and LVH across generations of urban African populations.
MethodsBlack female adults (n=123; age: 29-68 years) and their children (n=64; age: 4-10; 55% female) were recruited from the Birth to Twenty Plus Cohort in Soweto, South Africa. Tobacco and alcohol use, physical activity, presence of diabetes mellitus, heart disease and medication were self-reported. Height, weight, and blood pressure were measured in triplicate. Echocardiography was used to assess LV mass at end-diastole, perpendicular to the long axis of the LV and indexed to body surface area to determine LVH.
ResultsHypertension and obesity prevalence were 35.8% and 59.3% for adults and 45.3% and 6.3% for children. Self-reported tobacco use in adults was 22.8%. LVH prevalence was 35.8% (n=44) in adults (75% eccentric; 25% concentric), and 6.3% (n=4) in children (all eccentric). Prevalence of concentric remodelling was 15.4% (n=19) in adults and observed in one child. In adults, obesity (OR: 2.54 (1.07-6.02); p=0.02) and hypertension (OR: 3.39 (1.08-10.62); p=0.04) significantly increased the odds of LVH, specifically eccentric LVH, while concentric LVH was associated with self-reported tobacco use (OR: 4.58 (1.18-17.73); p=0.03; n=11). Although no logistic regression was run within children, of the four children with LVH, three had elevated blood pressure and the child with normal blood pressure was overweight.
ConclusionsThe association between obesity, hypertension, tobacco use and LVH in adults, and the 6% prevalence of LVH in children, calls for stronger public health efforts to control risk factors and monitor children who are at risk. | cardiovascular medicine
10.1101/2022.03.14.22272327 | Independent effect and joint effect of polygenic liabilities for schizophrenia and bipolar disorder on cognitive aging and education attainment | To elucidate the specific and shared genetic background of schizophrenia (SCZ) and bipolar disorder (BPD) to better understand their nosology, this study explored the independent and joint effects of polygenic liabilities for SCZ and BPD on cognitive aging and educational attainment among a collection of 80318 unrelated community participants from the Taiwan Biobank. Using the Psychiatric Genomics Consortium meta-analysis as a discovery sample, we calculated the polygenic risk score (PRS) for SCZ (PRSSCZ) and BPD (PRSBPD), shared PRS between SCZ and BPD (PRSSCZ+BPD), and SCZ-specific, differentiated from BPD, PRS (PRSSCZvsBPD). Based on the sign concordance of the susceptibility variants with SCZ and BPD, PRSSCZ was split into PRSSCZ_concordant and PRSSCZ_discordant and PRSBPD was split into PRSBPD_concordant and PRSBPD_discordant. Linear regression models were used to estimate the association with cognitive aging as measured by the Mini-Mental State Examination (MMSE) in individuals aged [≥] 60 years. Ordinal logistic regression models were used to estimate the association with educational attainment. PRSSCZ was negatively associated with MMSE (beta=-0.063, p<0.001), while PRSBPD was positively associated with MMSE (beta=0.04, p=0.01). A larger difference between PRSSCZ and PRSBPD was associated with lower MMSE scores (beta=-0.052, p<0.001). Both PRSSCZ_concordant and PRSSCZ_discordant were negatively associated with MMSE scores, without a synergistic effect. There was a complex interaction between PRSBPD_concordant and PRSBPD_discordant on the MMSE scores. PRSSCZ+BPD (beta=-0.09, p=0.01) and PRSSCZvsBPD (beta=-0.13, p<0.001) predicted a decrease in MMSE scores during the follow-up. PRSSCZ, PRSBPD, and PRSSCZ+BPD were positively associated with educational attainment, whereas PRSSCZvs BPD was negatively associated with educational attainment. This study provides evidence for the contrasting effect of polygenic liabilities for SCZ and BPD on cognitive aging and partially supports the hypothesis that the heterogeneity of SCZ and the positive association of polygenic liability for SCZ with education might be attributed to the shared part with BPD. | epidemiology |
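The analysis above is built on polygenic risk scores and on splitting them by the sign concordance of SNP effects between SCZ and BPD. The sketch below shows a generic PRS calculation (weighted sum of risk-allele dosages) and the concordant/discordant split, using simulated dosages and effect sizes rather than the study data.

```python
import numpy as np

rng = np.random.default_rng(3)
n_people, n_snps = 1000, 50
dosages = rng.binomial(2, 0.3, size=(n_people, n_snps)).astype(float)  # 0/1/2 risk alleles
beta_scz = rng.normal(0, 0.05, n_snps)   # hypothetical SCZ per-allele effects (log odds)
beta_bpd = rng.normal(0, 0.05, n_snps)   # hypothetical BPD per-allele effects

prs_scz = dosages @ beta_scz
prs_bpd = dosages @ beta_bpd

# Split the SCZ PRS by sign concordance of SNP effects with BPD, as described above.
concordant = np.sign(beta_scz) == np.sign(beta_bpd)
prs_scz_concordant = dosages[:, concordant] @ beta_scz[concordant]
prs_scz_discordant = dosages[:, ~concordant] @ beta_scz[~concordant]
print(prs_scz[:3], prs_scz_concordant[:3], prs_scz_discordant[:3])
```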
10.1101/2022.03.14.22272283 | Migrants' primary care utilisation before and during the COVID-19 pandemic in England: An interrupted time series | BackgroundHow international migrants access and use primary care in England is poorly understood. We aimed to compare primary care consultation rates between international migrants and non-migrants in England before and during the COVID-19 pandemic (2015- 2020).
MethodsUsing linked data from the Clinical Practice Research Datalink (CPRD) GOLD and the Office for National Statistics, we identified migrants using country-of-birth, visa-status or other codes indicating international migration. We ran a controlled interrupted time series (ITS) using negative binomial regression to compare rates before and during the pandemic.
FindingsIn 262,644 individuals, pre-pandemic consultation rates per person-year were 4.35 (4.34-4.36) for migrants and 4.6 (4.59-4.6) for non-migrants (RR:0.94 [0.92-0.96]). Between 29 March and 26 December 2020, rates reduced to 3.54 (3.52-3.57) for migrants and 4.2 (4.17-4.23) for non-migrants (RR:0.84 [0.8-0.88]). Overall, this represents an 11% widening of the pre-pandemic difference in consultation rates between migrants and non-migrants during the first year of the pandemic (RR:0.89, 95%CI:0.84-0.94). This widening was greater for children, individuals whose first language was not English, and individuals of White British, White non-British and Black/African/Caribbean/Black British ethnicities.
InterpretationMigrants were less likely to use primary care before the pandemic and the first year of the pandemic exacerbated this difference. As GP practices retain remote and hybrid models of service delivery, they must improve services and ensure they are accessible and responsive to migrants healthcare needs.
FundingThis study was funded by the Medical Research Council (MR/V028375/1) and Wellcome Clinical Research Career Development Fellowship (206602). | primary care research |
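The analysis above uses a controlled interrupted time series with negative binomial regression to compare consultation rates. The sketch below fits a similar count model with a person-time offset and a group-by-period interaction on simulated data; it is a generic illustration, not the CPRD analysis.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 4000
migrant = rng.binomial(1, 0.5, n)
pandemic = rng.binomial(1, 0.4, n)               # 1 = observation during the pandemic period
person_years = rng.uniform(0.5, 1.5, n)
# Simulated consultation counts: lower rate for migrants, with an extra drop during the pandemic.
log_rate = np.log(4.5) - 0.06 * migrant - 0.09 * pandemic - 0.11 * migrant * pandemic
counts = rng.poisson(np.exp(log_rate) * person_years)

X = sm.add_constant(np.column_stack([migrant, pandemic, migrant * pandemic]))
model = sm.GLM(counts, X, family=sm.families.NegativeBinomial(),
               offset=np.log(person_years))
res = model.fit()
print(np.exp(res.params[1:]))   # rate ratios: migrant, pandemic period, interaction
```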
10.1101/2022.03.11.22271853 | Risk of ESKD in related older living kidney donors | Living kidney donors who are biologically related to the recipient have higher risk for end-stage kidney disease (ESKD) compared with those who are unrelated to the recipient. This risk is greater for first-degree relatives than more distant relatives. To understand if this holds true for older donors, who were cleared for donation and might be past the peak-age for hereditary disease, we used donor data (SRTR) linked to ESKD registry data (CMS) and stratified donors by age (younger vs older [[≥]50 years]) and race (black, Hispanic, and white). Younger related donors of all racial groups had higher risk of ESKD compared with younger unrelated donors; however, only older related white and Hispanic donors had higher risk of ESRD compared with unrelated older donors (2.3-fold for white full-siblings and 1.9-fold for white parents/offspring; 3.3-fold for Hispanic full-siblings and 2.0-fold for Hispanic parents/offspring). Older related black donors did not have higher risk compared to older unrelated black donors (0.8-fold for black full-siblings and 0.5-fold for black parents/offspring). Our study points to an earlier age of onset of kidney disease in black donors with a family history of ESKD. Our findings call for programs that promote living donation among related older black donor candidates. | transplantation |
10.1101/2022.03.14.22272316 | Prevalence, determinants, and impact on general health and working capacity of post-acute sequelae of COVID-19 six to 12 months after infection: a population-based retrospective cohort study from southern Germany | BackgroundPost-acute sequelae of SARS-CoV-2 infection have commonly been described after COVID-19, but few population-based studies have examined symptoms six to 12 months after acute SARS-CoV-2 infection and their associations with general health recovery and working capacity.
MethodsThis population-based retrospective cohort study in four geographically defined regions in southern Germany included persons aged 18-65 years with PCR confirmed SARS-CoV-2 infection between October 2020 and March 2021. Symptom frequencies (six to 12 months after versus before acute infection, expressed as prevalence differences [PD] and prevalence ratios [PR]), symptom severity and clustering, risk factors and associations with general health recovery, and working capacity were analysed.
FindingsAmong a total of 11 710 subjects (mean age 44.1 years, 59.8% females, 3.5% previously admitted with COVID-19, mean follow-up time 8.5 months) the most prevalent symptoms with PDs >20% and PRs >5% were rapid physical exhaustion, shortness of breath, concentration difficulties, chronic fatigue, memory disturbance, and altered sense of smell. Female sex and severity of the initial infection were the main risk factors. Prevalence rates, however, appeared substantial among both men and women who had a mild course of acute infection, and PCS also considerably affected younger subjects. Fatigue (PD 37.2%) and neurocognitive impairment (PD 31.3%) as symptom clusters contributed most to reduced health recovery and working capacity, but chest symptoms, anxiety/depression, headache/dizziness and pain syndromes were also prevalent and relevant for working capacity, with some differences according to sex and age. When considering new symptoms with at least moderate impairment of daily life and ≤80% recovered general health or working capacity, the overall estimate for post-COVID syndrome was 28.5% (age- and sex-standardised rate 26.5%).
InterpretationThe burden of self-reported post-acute symptoms and possible sequelae, notably fatigue and neurocognitive impairment, remains considerable six to 12 months after acute infection even among young and middle-aged adults after mild acute SARS-CoV-2 infection, and impacts general health and working capacity.
Research in context. Evidence before this study: Previous studies have shown that post-acute sequelae of COVID-19 are common, in particular among patients who had been admitted to hospital for COVID-19. Post-acute self-reported complaints and symptoms often are diverse, nonspecific and sometimes of unknown severity and functional relevance. We searched PubMed and medRxiv for studies published between January 2021 and February 2022, using search terms describing "long covid, post-acute sequelae of COVID-19, prevalence, and systematic review", with no language restrictions. Searches with the terms "long covid", "post-acute sequelae of COVID-19", "post-covid condition" and "post-covid syndrome" were also done in PROSPERO, and we screened the website of the UK Office for National Statistics (www.ons.gov.uk) for long covid studies. We found more than 20 systematic reviews summarising post-acute symptom patterns among adults and a prevalence of "any" or "defined" symptoms (such as respiratory symptoms or symptoms related to mental health) or of medically assessed functional impairment (pulmonary or cardiac or neurocognitive function). Two reviews reported on health-related quality of life assessments. The prevalence of post-acute sequelae of COVID-19 or long covid/post-covid syndromes ranged between <10 to >70%, in part due to lack of uniform and clear case definitions, variable follow-up times, and non-inclusion of outpatients with initially mild disease. Most papers reviewed presented high heterogeneity and had a short follow-up, and there were very few papers estimating the prevalence of post-covid syndrome beyond six months after acute infection. The studies with the largest number of subjects either included only patients after hospital admission, used online surveys of subjects with self-reported suspected and confirmed COVID-19, or relied on electronic medical records only. We found one (small but) comprehensive population-based study from Switzerland assessing post-covid syndrome prevalence and associations with quality of life and health recovery with a follow-up time ranging from six to 10 months. Two further population-based studies from Switzerland and Norway investigated long covid symptoms among subgroups with ≥6 months (n=498) and 11 to 12 months (n=170) of follow-up after acute infection, respectively.
Added value of this studyWith this large population-based study, we provide evidence of persistence of new symptom clusters (not present before acute infection) such as fatigue, neurocognitive impairment, chest symptoms, smell or taste disorder, and anxiety/depression beyond six months after acute infection, with a prevalence of >20% for each of these five clusters. We show that the three most frequent clusters (fatigue, neurocognitive impairment, chest symptoms) are often interfering with daily life and activities, often co-occur, and that both fatigue and neurocognitive impairment have the largest impact on working capacity, while long-term smell and taste disorders are reported relatively independent of other complaints. Age in this 18-65-year old adult population was not a major determinant of symptom prevalence, but we confirm severity of the initial infection and female sex as consistent risk factors for various manifestations of medium-term post-COVID syndrome, and age as risk factor for self-reported reduced working capacity, which overall and at population level exceeded 10%.
Implications of all the available evidenceFuture research should include the medical validation of the key symptom clusters of post-COVID syndrome, determine the possible causes, and urgently address prognostic factors and therapeutic options. The described key symptom clusters contributed most to reduced general health status and working capacity in middle-aged adults. The findings of this study may also help develop a more consistent and relevant definition of post-COVID syndrome with major implications for research and medical practice. | infectious diseases |
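The symptom results above are expressed as prevalence differences (PD) and prevalence ratios (PR) comparing the period six to 12 months after infection with the period before infection. The sketch below computes both quantities from hypothetical before/after symptom indicators.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 11710                                  # cohort size from the record above
# Hypothetical 0/1 indicators of one symptom before infection and 6-12 months afterwards.
before = rng.binomial(1, 0.10, n)
after = np.maximum(before, rng.binomial(1, 0.30, n))   # symptom persists or is newly reported

prev_before = before.mean()
prev_after = after.mean()
prevalence_difference = prev_after - prev_before
prevalence_ratio = prev_after / prev_before
print(f"PD = {prevalence_difference:.1%}, PR = {prevalence_ratio:.2f}")
```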
10.1101/2022.03.14.22272336 | Glucose and lipid profiles in men with non-obstructive azoospermia | The impact of a low testosterone level among men with non-obstructive azoospermia with various testicular histopathological patterns on the regulation of glucose and lipid metabolism is less well known than among the general population. The aim of this retrospective study was to examine the association between testicular histopathology and components of the metabolic profile among men with non-obstructive azoospermia. Participants were divided into two groups: men with non-obstructive azoospermia and men with obstructive azoospermia. Testicular biopsies were performed among those with non-obstructive azoospermia. We included 115 patients in this study: 83 (72.2%) had non-obstructive azoospermia and 32 (27.8%) had obstructive azoospermia. The plasma glucose concentration, glycated hemoglobin level, and lipid profile were similar between patients with non-obstructive azoospermia and those with obstructive azoospermia. Upon subgroup analysis of patients with non-obstructive azoospermia, those with Sertoli-cell-only syndrome had the lowest testosterone (431 ± 238 ng/dL; P=0.039) and highest follicle-stimulating hormone (23.4 ± 18.2 mIU/mL; P=0.002) concentrations. They also had the highest triglyceride concentration (163 ± 114 mg/dL; P=0.001). Interestingly, patients with Sertoli-cell-only syndrome had a lower fasting plasma glucose concentration (92 ± 11 mg/dL; P<0.001) and glycated hemoglobin level (5.9 ± 0.8%; P=0.022) than those with histopathological patterns of maturation arrest and hypospermatogenesis. In conclusion, differences in glucose and lipid metabolism are evident between men with non-obstructive azoospermia with different spermatogenesis patterns. | urology
10.1101/2022.03.14.22272130 | Mid-term subclinical myocardial injury detection in patients recovered from COVID-19 according pulmonary lesion severity. | BackgroundSevere acute respiratory syndrome coronavirus 2 (SARS-CoV-2) may cause damage to the cardiovascular system during the acute phase of infection. However, recent studies have described mid- and long-term subtle cardiac injuries after recovery from acute Coronavirus disease 19 (COVID-19). The aim of this study was to determine the relationship between the severity of chest computed tomography (CT) lesions and the persistence of subtle myocardial injuries at mid-term follow-up of patients recovered from COVID-19 infection.
MethodsAll COVID-19 patients were enrolled prospectively in this study. High-sensitivity troponin T (hsTnT) measurement and chest CT scanning were performed in all patients in the acute phase of COVID-19 infection. At mid-term follow-up, conventional transthoracic echocardiography was performed, and global longitudinal strain (GLS) of the left and right ventricles (LV and RV) was determined and compared between patients with chest CT scan lesions of less than 50% (Group 1) and those with severe chest CT scan lesions of 50% or greater (Group 2).
ResultsThe mean age was 55 ± 14 years. Both LV GLS and RV GLS were significantly decreased in Group 2 (p=0.013 and p=0.011, respectively). An LV GLS value greater than -18% was noted in 43% of all the patients, and an RV GLS value greater than -20% was observed in 48% of them. The group with severe chest CT scan lesions included more patients with reduced LV GLS and reduced RV GLS than the group with mild chest CT scan lesions (G1: 29% vs. G2: 57%, p=0.002, and G1: 36% vs. G2: 60%, p=0.009, respectively).
ConclusionPatients with severe chest CT scan lesions are more likely to develop subclinical myocardial damage. TTE could be recommended in patients recovering from COVID-19 to detect subtle LV and RV lesions.
Trial registrationThe cohort of patients is a part of the research protocol (IORG 00093738 N°102/OMB 0990-0279) approved by the Hospital Ethics Committee. | cardiovascular medicine
10.1101/2022.03.14.22272372 | Prospectively validated augmented intelligence for disease-agnostic predictions of clinical success for novel therapeutics | Despite numerous superhuman achievements in complex challenges1,2, standalone AI does not free life science from the long-term bottleneck of linearly extracting new knowledge from exponentially growing new data3, severely limiting the success rate of drug discovery4. Inspired by the state-of-the-art AI training methods, we trained a human-centric hybrid augmented intelligence5 (HAI) to learn a foundation model6 that extracts all-encompassing knowledge of human physiology and diseases. To evaluate the quality of HAIs extracted knowledge, we designed the public, prospective prediction of pivotal ongoing clinical trial outcomes at large scale (PROTOCOLS) challenge to benchmark HAIs real-world performance of predicting drug clinical efficacy without access to human data. HAI achieved a 10.5-fold improvement from the baseline with 99% confidence7 in the PROTOCOLS validation, readily increasing the average clinical success rate of investigational new drugs from 7.9%8 to 90% for almost any human diseases9,10. The validated HAI confirms that exponentially extracted knowledge alone is sufficient for accurately predicting drug clinical efficacy, effecting a total reversal of Erooms law4. HAI is also the worlds first clinically validated model of human aging that could substantially speed up the discovery of preventive medicine for all age-related diseases11. Our results demonstrate that disruptive breakthroughs necessitate the smallest team size to attain the largest HAI for optimal knowledge extraction from high-dimensional low-quality data space, thus establishing the first prospective proof of the previous discovery that small teams disrupt12. The global adoption of training HAI provides a well-beaten economic path to mass-produce scientific and technological breakthroughs via exponential knowledge extraction and better designs of data labels for training better-performing AI13. | pharmacology and therapeutics |
10.1101/2022.03.15.22272348 | The impact of COVID-19 on pregnant and recently pregnant women in Malawi: A national facility-based cohort | ObjectiveTo describe the demographic characteristics, clinical manifestations, and clinical outcomes of hospitalised pregnant and recently pregnant women with COVID-19 in Malawi, a low-income country in Sub-Saharan Africa. This study responds to a critical gap in the global COVID-19 data.
MethodsA national surveillance platform was established in Malawi by the Ministry of Health to record the impact of COVID-19 on pregnant and recently pregnant women and provide real-time data for decision making. We report this facility-based cohort that includes all pregnant and recently pregnant hospitalised women in Malawi suspected of having COVID-19 between 2nd June 2020 and 1st December 2021.
Results398 women were admitted to hospital with suspected COVID-19 based on presenting symptoms and were tested; 246 (62%) were confirmed to have COVID-19. In women with COVID-19, the mean age was 27 ± 7 years.
The most common presenting symptoms were cough (74%), breathlessness (45%), fever (42%), headache (17%), and joint pain (10%). 53% of the women had COVID-19 symptoms severe enough to warrant admission.
31% (76/246) of women admitted with COVID-19 suffered a severe maternal outcome, 47/246 (19%) died, and 29/246 (12%) had a near-miss event. 9/111 (8%) of recorded births were stillbirths, and 12/101 (12%) of the live births resulted in early neonatal death.
ConclusionA national electronic platform provided real-time information on the characteristics and outcomes of pregnant and recently pregnant women with COVID-19 admitted to Malawian government hospitals. These women had much higher rates of adverse outcomes than those suggested in the current global data. These findings may reflect differences in the severity of disease required for women to present and be admitted to Malawian hospitals, limited access to intensive care and the pandemic's disruption to the health system.
SUMMARY BOX
What is already known?
- In pregnant and recently pregnant women, COVID-19 is associated with increased complications such as admission to an intensive care unit, invasive ventilation, and maternal death.
- In pregnant women with confirmed COVID-19, the current global estimate of all-cause mortality is 0.02%.
- Most countries in Africa rely on paper-based systems to collect key maternal health indicators such as maternal deaths and severe morbidity, which does not enable timely actions.

What are the new findings?
- Maternal mortality and adverse perinatal outcomes are alarmingly high in a cohort of pregnant and recently pregnant women admitted to Malawian healthcare facilities located in a low-income country in Africa.
- A national facility-based maternal surveillance platform can be implemented during a pandemic and provide real-time data to aid policymakers in understanding its impacts on this key population.

What do the new findings imply?
- In low-income countries in Sub-Saharan Africa, pregnant and recently pregnant women with COVID-19 admitted to hospital require enhanced care and a renewed focus on their needs to avert these adverse health outcomes.
- Global and national surveillance systems must specifically record outcomes for pregnant, recently pregnant women and their infants to understand the impact of public health emergencies on these groups, as they may be disproportionately affected and may require special considerations. | public and global health
10.1101/2022.03.15.22272192 | Laboratory changes associated with medication non-adherence in patients with hypertension over six months of the COVID-19 pandemic | The coexistence of multiple diseases is common in the elderly and often accompanied by medication non-adherence. This study investigated the relationship between medication non-adherence and laboratory findings in patients with hypertension and hypertensive comorbidities (i.e., diabetes and nephropathy) in southern Taiwan during 6 months of the coronavirus disease pandemic. This was a panel study and involved outpatients from three hospitals classified as regional hospitals or above. Questionnaires were used to collect information on patient demographics, diet, medication adherence, and laboratory data at the time of recruitment and 6 months after. A total of 140 patients with only hypertension and 98 patients with hypertension and comorbidities were recruited, and the changes in blood pressure and laboratory data were assessed after 6 months. Analyses performed with generalized estimating equations showed that patients who had not forgotten to take medication had a higher estimated glomerular filtration rate. Moreover, patients who did not change their medication time arbitrarily had lower low-density lipoprotein levels. Furthermore, patients who did not stop or interrupt their medication arbitrarily had lower diastolic blood pressures and low-density lipoprotein levels. Overall, patients with better medication adherence had better estimated glomerular filtration rates, lower low-density lipoprotein levels, and lower diastolic blood pressures. | public and global health
10.1101/2022.03.15.22272350 | Immunogenicity of the third and fourth BNT162b2 mRNA COVID-19 boosters and factors associated with immune response in systemic lupus erythematosus and rheumatoid arthritis patients | ObjectivesTo evaluate the safety and immunogenicity of third and fourth BNT162b2 boosters in systemic lupus erythematosus (SLE) and rheumatoid arthritis (RA) patients.
MethodsSLE and RA patients aged 18-65 years who completed a series of inactivated, adenoviral vector, or heterogenous adenoviral vector/mRNA vaccines for at least 28 days were enrolled. Immunogenicity assessment was done before and on day 15 after each booster vaccination. The third BNT162b2 booster was administered on day 1. Patients with suboptimal humoral response to the third booster dose (anti-receptor binding domain (RBD) IgG on day 15 < 2,360 BAU/mL) were given a fourth BNT162b2 booster on day 22.
ResultsSeventy-one SLE and 29 RA patients were enrolled. The third booster raised anti-RBD IgG 15-fold, and the proportion of patients with positive neutralizing activity against the Omicron variant increased from 0% to 42%. The proportion of patients with a positive cellular immune response also increased from 55% to 94%. High immunosuppressive load and initial inactivated vaccine were associated with lower anti-RBD IgG titer.
Fifty-four patients had suboptimal humoral responses to the third booster and 28 received a fourth booster dose. Although anti-RBD IgG increased a further 7-fold, no significant change in neutralizing activity against the Omicron variant was observed. There were 2 severe SLE flares that occurred shortly after the fourth booster dose.
ConclusionsThe third BNT162b2 booster significantly improved humoral and cellular immunogenicity in SLE and RA patients. The benefit of a short interval fourth booster in patients with suboptimal humoral response was unclear.
Key messages
What is already known about this subject?
- The SARS-CoV-2 Omicron variant (B.1.1.529) has multiple mutations that have resulted in greater escape from immune protection elicited by COVID-19 vaccines.
- More attenuated immune response to SARS-CoV-2 vaccination has been observed in patients with autoimmune rheumatic diseases. The additional third dose of SARS-CoV-2 vaccine has been recommended in immunocompromised populations.
- Some immunocompromised patients have a suboptimal humoral response to a third booster dose. Factors associated with poor immune response have not been adequately studied.
- Administration of more than 3 doses has been shown to enhance immune response in some severely immunocompromised patients.
What does this study add?- The third BNT162b2 booster was well tolerated, and significantly improved both humoral and cellular immunogenicity in SLE and RA patients previously vaccinated with either inactivated, adenoviral vector, or heterogenous adenoviral vector/mRNA vaccines.
- High intensity of immunosuppressive therapy and initial inactivated vaccine were associated with lower humoral immune response to the third BNT162b2 booster.
- Administration of a fourth BNT162b2 booster in poor humoral immune responders may not offer additional protection against the omicron variant, and flares were observed in SLE patients.
How might this impact on clinical practice or future developments?- This study supported a third BNT162b2 booster dose administration in SLE and RA patients to enhance immune protection against the Omicron variant.
- Patients who receive a high dose of immunosuppressive therapy or initial inactivated vaccine could be unprotected from SARS-CoV-2 infection. Benefits and risks of additional boosters or second generation of SARS-CoV-2 vaccine should be further studied. | rheumatology |
10.1101/2022.03.14.22272365 | Reduced norepinephrine transporter binding in Parkinson's disease with dopa responsive freezing gait | Freezing of gait (FOG) is a major cause of falling and leads to loss of independence in Parkinson's disease (PD). The pathophysiology of FOG is poorly understood, although there is a hypothesized link with norepinephrine (NE) systems. PD-FOG can present in levodopa-responsive and levodopa-unresponsive forms.
We examined NE transporter (NET) binding via brain positron emission tomography (PET) to evaluate changes in NET density associated with FOG using the high affinity selective NET antagonist radioligand [11C]MeNER (2S,3S)(2-[α-(2-methoxyphenoxy)benzyl]morpholine) in N=52 parkinsonian patients. We used a rigorous levodopa challenge paradigm to characterize patients as non-freezing PD (NO-FOG, N=16), levodopa-responsive freezing (OFF-FOG, N=10), levodopa-unresponsive freezing (ONOFF-FOG, N=21), and primary progressive freezing of gait (PP-FOG, N=5).
Linear mixed models identified significant reductions in whole brain NET binding in the OFF-FOG group compared to the NO-FOG group (-16.8%, P=0.021). Additional contrasts tested post-hoc identified trends toward increased NET expression in ONOFF-FOG vs. OFF-FOG (~10%; P=0.123). Linear mixed models with interaction terms identified significantly reduced NET binding in right thalamus in the OFF-FOG group (P=0.038). A linear regression analysis identified an association between reduced NET binding and more severe NFOG-Q score only in the OFF-FOG group (P=0.022).
This is the first study to examine brain noradrenergic innervation using NET-PET in PD patients with and without FOG. Based on the normal regional distribution of noradrenergic innervation and pathological studies in the thalamus of PD patients, the implications of our findings suggest that noradrenergic limbic pathways may play a key role in OFF-FOG in PD. This finding could have implications for clinical subtyping of FOG as well as development of therapies. | neurology |
10.1101/2022.03.14.22271705 | The genomic landscape across 474 surgically accessible epileptogenic human brain lesions | Understanding the exact molecular mechanisms involved in the etiology of epileptogenic pathologies with or without tumor activity is essential for improving treatment of drug-resistant focal epilepsy. Here, we characterize the landscape of somatic genetic variants in resected brain specimens from 474 individuals with drug-resistant focal epilepsy using deep whole-exome sequencing (>350x) and whole-genome genotyping. Across the exome, we observe a greater number of somatic single-nucleotide variants (SNV) in low-grade epilepsy-associated tumors (LEAT; 7.92 ± 5.65 SNV) than in brain tissue from malformations of cortical development (MCD; 6.11 ± 4 SNV) or hippocampal sclerosis (HS; 5.1 ± 3.04 SNV). Tumor tissues also had the largest number of likely pathogenic variant carrying cells. LEAT had the highest proportion of samples with one or more somatic copy number variants (CNV; 24.7%), followed by MCD (5.4%) and HS (4.1%). Recurring somatic whole chromosome duplications affecting chromosome 7 (16.8%), chromosome 5 (10.9%), and chromosome 20 (9.9%) were observed among LEAT. For germline variant-associated MCD genes such as TSC2, DEPDC5, and PTEN, germline SNV were frequently identified within large loss of heterozygosity regions, supporting the recently proposed second hit disease mechanism in these genes. We detect somatic variants in twelve established lesional epilepsy genes and demonstrate exome-wide statistical support for three of these in the etiology of LEAT (e.g., BRAF) and MCD (e.g., SLC35A2 and MTOR). We also identify novel significant associations for PTPN11 with LEAT and NRAS Q61 mutated protein with a complex MCD characterized by polymicrogyria and nodular heterotopia. The variants identified in NRAS are known from cancer studies to lead to hyperactivation of NRAS, which can be targeted pharmacologically. We identify large recurrent 1q21-q44 duplication including AKT3 in association with focal cortical dysplasia type 2a with hyaline astrocytic inclusions, another rare and possibly under-recognized brain lesion. The clinical genetic analyses showed that the numbers of somatic SNV across the exome and the fraction of affected cells were positively correlated with the age at seizure onset and surgery in individuals with LEAT. Finally, we report that identifying a likely pathogenic variant enabled us to refine or reclassify previous histopathological classifications post hoc in 20.5% of diagnoses. In summary, our comprehensive genetic screen sheds light on the genome-scale landscape of genetic variants in epileptic brain lesions, informs the design of gene panels for clinical diagnostic screening, and guides future directions for clinical implementation of epilepsy surgery genetics. | genetic and genomic medicine
10.1101/2022.03.14.22272354 | Estimating Cuff-less Continuous Blood Pressure from Fingertip Photoplethysmogram Signals with Deep Neural Network Model | ObjectiveBlood pressure (BP) is an important physiological index reflecting cardiovascular function. Continuous blood pressure monitoring helps to reduce the prevalence and mortality of cardiovascular diseases. In this study, we aim to estimate systolic blood pressure (SBP) and diastolic blood pressure (DBP) values continuously based on fingertip photoplethysmogram (PPG) waveforms using deep neural network models.
MethodsTwo models were proposed and both models consisted of three stages. The only difference between them was the method of extracting features from PPG signals in the first stage. Model 1 adopted Bidirectional Long Short-Term Memory (BiLSTM), while the other used a convolutional neural network. Then, a residual connection was applied to multiple stacked LSTM layers in the second stage, followed by the third stage with two fully connected layers.
ResultsOur proposed models outperformed other methods based on similar datasets or frameworks, while in our proposed models, model 2 was superior to model 1. It satisfied the standard of the Association for the Advancement of Medical Instrumentation (AAMI) and obtained grade A for SBP and DBP estimation according to the British Hypertension Society (BHS) standard. The mean error (ME) and standard deviation (STD) for SBP and DBP estimations were 0.21 ± 6.40 mmHg and 0.19 ± 4.71 mmHg, respectively.
ConclusionOur proposed models could extract important features of fingertip PPG waveforms automatically and realize cuff-less continuous BP monitoring, which can be helpful in the identification and early treatment of abnormal blood pressure, thus may reduce the occurrence of cardiovascular malignant events. | health informatics |
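A minimal PyTorch sketch of the second (CNN-based) architecture summarised above is given below: stage 1 extracts features from the raw PPG window with 1-D convolutions, stage 2 stacks LSTM layers joined by a residual connection, and stage 3 maps the last hidden state to [SBP, DBP] through two fully connected layers. The layer sizes, kernel widths, window length, and sampling rate are assumptions for illustration only, not the authors' reported configuration.

```python
import torch
import torch.nn as nn

class PPG2BP(nn.Module):
    def __init__(self, hidden=64):
        super().__init__()
        self.cnn = nn.Sequential(                              # stage 1: CNN feature extractor
            nn.Conv1d(1, 32, kernel_size=7, padding=3), nn.ReLU(),
            nn.Conv1d(32, hidden, kernel_size=5, padding=2), nn.ReLU(),
        )
        self.lstm1 = nn.LSTM(hidden, hidden, batch_first=True)  # stage 2: stacked LSTMs
        self.lstm2 = nn.LSTM(hidden, hidden, batch_first=True)
        self.head = nn.Sequential(                              # stage 3: two fully connected layers
            nn.Linear(hidden, 32), nn.ReLU(), nn.Linear(32, 2),
        )

    def forward(self, x):                                       # x: (batch, 1, samples)
        feats = self.cnn(x).transpose(1, 2)                     # -> (batch, time, hidden)
        h1, _ = self.lstm1(feats)
        h2, _ = self.lstm2(h1)
        h = h1 + h2                                             # residual connection across LSTM stages
        return self.head(h[:, -1, :])                           # predict [SBP, DBP]

model = PPG2BP()
dummy = torch.randn(4, 1, 625)                                  # e.g., 5-s windows at 125 Hz (assumed)
print(model(dummy).shape)                                       # torch.Size([4, 2])
```

In practice, such a network would be trained to minimise the SBP/DBP regression error against cuff or arterial-line reference measurements.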
10.1101/2022.03.14.22272381 | CDC (Cindy and David's Conversations) Game: Advising President to Survive Pandemic | ABSTRACT: Ongoing debates on anti-COVID19 policies have been focused on coexistence vs. zero-out strategies, which can be simplified as "always open (AO)" vs. "always closed (AC)." We postulate that the middle ground between the two extremes, dubbed LOHC (low-risk open and high-risk closed), is likely more favorable, precluding obviously irrational HOLC (high-open-low-closed). From a meta-strategy perspective, these four policies cover the full spectrum of anti-pandemic policies. We argue that, among numerous factors influencing strategic policy-making, the competence of an advisory body such as a CDC chief scientist (say, Cindy) and politics in a decision-making body such as a president (David), and their cooperation/communication, can be critical. Here we investigate anti-pandemic policy-making by harnessing the power of evolutionary game theory in modeling competition/cooperation/communication (three critical processes underlying biological and social evolutions). Specifically, we apply the Sir Philip Sidney (SPS) game, a 4x4 signaler-responder evolutionary game with 16 strategic interactions, which was devised to investigate the reliability of communication that can modulate competition and cooperation, to capture rich idiosyncrasies surrounding today's anti-pandemic policies. By emulating the reality of anti-pandemic policies today, the study aims to identify possible cognitive gaps and traps. The extended SPS, dubbed the CDC (Cindy and David's Conversations) game, offers a powerful cognitive model for investigating the coexistence/zero-out dichotomy and possible alternatives. The rigorous analytic solutions and extensive simulations suggest a take-home message--keep it persistently simple and rational: while the apparently preferred middle-ground LOHC counter-intuitively seems to be a small-probability (~0.05) event, the AO and AC policies appear to be large-probability (~0.41-0.53) events.
LAY SUMMARY: Ongoing debates on anti-COVID19 policies have been focused on coexistence-with vs. zero-out (virus) strategies, which can be simplified as "always open (AO)" vs. "always closed (AC)." We postulate that the middle ground, dubbed LOHC (low-risk-open and high-risk-closed), is likely more favorable, precluding obviously irrational HOLC (high-risk-open and low-risk-closed). From a meta-strategy perspective, these four policies cover the full spectrum of anti-pandemic policies. By emulating the reality of anti-pandemic policies today, the study aims to identify possible cognitive gaps and traps by harnessing the power of evolutionary game-theoretic analysis and simulations, which suggest that (i) AO and AC seem to be "high-probability" events (~0.41-0.53); (ii) counter-intuitively, the middle ground--LOHC--seems to be a small-probability event (~0.05), possibly due to its undue complexity, mirroring its wide-range failures in practice. | health policy
10.1101/2022.03.15.22272370 | Durability of second-line anti-retroviral therapy in a resource-limited setting: an 11-year analytical cohort | BackgroundIt is projected that up to 19.6% of patients on ART in Sub-Saharan Africa will need second-line treatment by 2030, but the durability of such therapy remains unclear. This study investigated the durability of second-line ART and the factors associated with the viral rebound among patients on second-line ART in Uganda.
MethodsA retrospective dynamic cohort of adults initiated on second-line ART after confirmed virological failure of first-line ART. Patients who had taken second-line ART for ≥6 months between 2007 and 2017 were included. Patients were followed until they experienced a viral rebound (viral load ≥200 copies/ml). The cumulative probability of viral rebound and factors associated with viral rebound were determined using Kaplan-Meier methods and Cox proportional hazard models, respectively.
Findings1101 participants were enrolled. At baseline, 64% were female, the median age was 37 years (IQR 31-43), the median duration on first-line ART was 44 months (IQR 27-67), and the median CD4 count and viral load were 128 cells/μl (IQR 58-244) and 45,978 copies/ml (IQR 13,827-139,583), respectively. During 4,757.21 person-years of follow-up, the incidence density of viral rebound was 74.62 (95% CI 67.25-82.80) per 1000 person-years. The probability of a viral rebound at 5 and 10 years was 0.29 (95% CI 0.26-0.32) and 0.623 (95% CI 0.55-0.69), respectively. The median survival without experiencing a viral rebound was 8.7 years. Young adults aged 18-24 years (aHR 2.31, 95% CI 1.25-4.27), a high viral load at switch (≥100,000 copies/ml; aHR 1.53, 95% CI 1.23-1.91) and ATV/r-based second-line therapy (aHR 1.53, 95% CI 1.18-2.00) were associated with an increased risk of viral rebound.
InterpretationSecond-line regimens are fairly durable for about eight years, followed by a rapid increase in the incidence of viral rebound. A high viral load at switch, ATV/r-based second-line therapy, and young adulthood are risk factors associated with viral rebound, which underscores the need for differentiated care services. | hiv aids
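For readers unfamiliar with the survival workflow referenced above (Kaplan-Meier estimates of time to viral rebound and a Cox proportional hazards model for adjusted hazard ratios), the following Python sketch using the lifelines package illustrates the general approach; the data frame, column names, and toy values are invented for demonstration and are far smaller than any real cohort.

```python
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

# Toy cohort: time on second-line ART, rebound indicator, and two binary covariates
df = pd.DataFrame({
    "years_on_second_line": [1.2, 3.5, 8.7, 2.1, 6.4, 9.9, 4.3, 7.8],
    "rebound":              [1,   0,   1,   1,   0,   0,   1,   0],   # 1 = viral load >= 200 copies/ml
    "age_18_24":            [1,   0,   0,   1,   0,   1,   0,   0],
    "switch_vl_ge_100k":    [1,   1,   0,   0,   1,   0,   1,   0],
})

km = KaplanMeierFitter()
km.fit(df["years_on_second_line"], event_observed=df["rebound"])
print(km.median_survival_time_)      # time by which half the cohort has rebounded

cph = CoxPHFitter()
cph.fit(df, duration_col="years_on_second_line", event_col="rebound")
cph.print_summary()                  # hazard ratios adjusted for the listed covariates
```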
10.1101/2022.03.14.22272351 | Super-spreaders of novel coronaviruses that cause SARS, MERS and COVID-19 : A systematic review | OBJECTIVEMost index cases with novel coronavirus infections transmit disease to just 1 or 2 other individuals, but some individuals super-spread - they are infection sources for many secondary cases. Understanding common factors that super-spreaders may share could inform outbreak models.
METHODSWe conducted a comprehensive search in MEDLINE, Scopus and preprint servers to identify studies about persons who were each documented as transmitting SARS, MERS or COVID-19 to at least nine other persons. We extracted data from and applied quality assessment to eligible published scientific articles about super-spreaders to describe them demographically: by age, sex, location, occupation, activities, symptom severity, any underlying conditions and disease outcome. We included scientific reports published by mid June 2021.
RESULTSThe completeness of data reporting was often limited, which meant we could not identify traits such as patient age, sex, occupation, etc. Where demographic information was available, for these coronavirus diseases, the most typical super-spreader was a male age 40+. Most SARS or MERS super-spreaders were very symptomatic and died in hospital settings. In contrast, COVID-19 super-spreaders often had a very mild disease course and most COVID-19 super-spreading happened in community settings.
CONCLUSIONAlthough SARS and MERS super-spreaders were often symptomatic, middle- or older-age adults who had a high mortality rate, COVID-19 super-spreaders often had a mild disease course and were documented to be any adult age (from 18 to 91 years old). More outbreak reports should be published with anonymised but useful demographic information to improve understanding of super-spreading, super-spreaders, and the settings that super-spreading happens in. | infectious diseases |
10.1101/2022.03.14.22272368 | Area-level social and structural inequalities determine mortality related to COVID-19 diagnosis in Ontario, Canada: a population-based explanatory modeling study of 11.8 million people | ImportanceSocial determinants of health (SDOH) play an important role in COVID-19 outcomes. More research is needed to quantify this relationship and understand the underlying mechanisms.
ObjectivesTo examine differential patterns in COVID-19-related mortality by area-level SDOH accounting for confounders; and to compare these patterns to those for non-COVID-19 mortality, and COVID-19 case fatality (COVID-19-related death among those diagnosed).
Design, setting, and participantsPopulation-based retrospective cohort study including all community living individuals aged 20 years or older residing in Ontario, Canada, as of March 1, 2020 who were followed through to March 2, 2021.
ExposureSDOH variables derived from the 2016 Canada Census at the dissemination area-level including: median household income; educational attainment; proportion of essential workers, racialized groups, recent immigrants, apartment buildings, and high-density housing; and average household size.
Main outcomes and measuresCOVID-19-related death was defined as death within 30 days following, or 7 days prior to a positive SARS-CoV-2 test. Cause-specific hazard models were employed to examine the associations between SDOH and COVID-19-related mortality, treating non-COVID-19 mortality as a competing risk.
ResultsOf 11,810,255 individuals included, 3,880 (0.03%) died related to COVID-19 and 88,107 (0.75%) died without a positive test. After accounting for demographics, baseline health, and other SDOH, the following SDOH were associated with increased hazard of COVID-19-related death (hazard ratios [95% confidence intervals] comparing the most to least vulnerable group): lower income (1.30[1.09-1.54]), lower educational attainment (1.27[1.10-1.47]), higher proportion essential workers (1.28[1.10-1.50]), higher proportion racialized groups (1.42[1.16-1.73]), higher proportion apartment buildings (1.25[1.11-1.41]), and larger vs. medium household size (1.30[1.13-1.48]). In comparison, areas with higher proportion racialized groups were associated with a lower hazard of non-COVID-19 mortality (0.88[0.85-0.92]). With the exception of income, SDOH were not independently associated with COVID-19 case fatality.
Conclusions and relevanceArea-level social and structural inequalities determine COVID-19-related mortality after accounting for individual demographic and clinical factors. COVID-19 has reversed the pattern of lower non-COVID-19 mortality by racialized groups. Pandemic responses should include prioritized and community-tailored intervention strategies to address SDOH that mechanistically underpin disproportionate acquisition and transmission risks and shape barriers to the reach of, and access to prevention interventions.
Key points
QuestionAre area-level social determinants of health factors independently associated with coronavirus disease 2019 (COVID-19)-related mortality after accounting for demographics and clinical factors?
FindingsIn this population-based cohort study of 11.8 million adults residing in Ontario, Canada, in which 3,880 COVID-19-related deaths occurred between Mar 1, 2020 and Mar 2, 2021, we found that areas characterized by lower SES (including lower income, lower educational attainment, and higher proportion essential workers), greater ethnic diversity, more apartment buildings, and larger vs. medium household size were associated with increased hazard of COVID-19-related mortality compared to their counterparts, even after accounting for individual-level demographics, baseline health, and other area-level SDOH.
MeaningPandemic responses should include prioritized and community-tailored intervention strategies to address SDOH that mechanistically underpin inequalities in acquisition and transmission risks, and in the reach of, and access to prevention interventions. | epidemiology |
10.1101/2022.03.15.22272252 | Long-term survival analysis of HIV patients on antiretroviral therapy in Congo: a 14 years retrospective cohort analysis, 2003-2017 | BackgroundThe long-term survival of patients on antiretroviral treatment in Congo remains less documented. Our study aimed to analyze the long-term survival of adults living with HIV on ART (Antiretroviral Therapy).
MethodsWe conducted a historical cohort study of 2,309 adult PLHIV (people living with HIV) followed between January 1, 2003 and December 31, 2017 whose viral load and date of initiation of ART were known. The Kaplan-Meier method was used to estimate the probability of survival and the Cox regression model to identify factors associated with death.
ResultsThe median age was 49 years, and the majority (68.56%) were female. The probability of survival at 14 years was 83% (95% confidence interval (CI) [78-87]). When those lost to follow-up were assumed to have died (worst-case scenario), survival was 66% (95% CI [62-70]). Stratified Cox regression analysis showed that being male (adjusted hazard ratio (AHR) = 1.65, 95% CI [1.26-2.17]) was significantly associated with death (p-value <0.0001). Furthermore, having a viral load >1,000 copies/ml (AHR = 2.56, 95% CI [1.93-3.40]) and being in an advanced WHO clinical stage, in particular stage II (AHR = 4.07, 95% CI [2.36-7.01]), stage III (AHR = 13.49, 95% CI [8.99-20.27]) and stage IV (AHR = 34.45, 95% CI [23.74-50]), were also significantly associated with death (p-value <0.0001).
ConclusionThe long-term survival of PLHIV is worrying despite the offer of ARVs. | epidemiology |
10.1101/2022.03.14.22272341 | People underestimate the change of airborne Corona virus exposure when changing distance to an infected person: On interpersonal distance, exposure time, face masks and perceived virus exposure. | Participants judged airborne coronavirus exposure following a change of interpersonal distance and time of a conversation with an infected person with and without a face mask. About 75% of the participants underestimated how much virus exposure changes when the distance to an infected person changes. The smallest average face-to-face distance from an infected person without a mask that a participant judged as sufficiently safe was about 12 feet (3.67 m). Correlations showed that the more a person underestimated the effects of a change of distance on exposure, the shorter was that person's own safety distance. On average, the effects of different lengths of a conversation on exposure were judged correctly, but those who judged the effects of time as smaller tended to select longer safety distances. Worry about one's own COVID-19 infection correlated with protective behaviors: keeping longer safety distances, avoiding public gatherings, and postponing meetings with friends. The results showed that the protective effects of both distancing and wearing a face mask were underestimated by a majority of the participants. Implications of these results are discussed. | epidemiology
10.1101/2022.03.14.22272359 | Biomarkers Selection for Population Normalization in SARS-CoV-2 Wastewater-based Epidemiology | Wastewater-based epidemiology (WBE) has been one of the most cost-effective approaches to track SARS-CoV-2 levels in communities since the COVID-19 outbreak in 2020. Normalizing SARS-CoV-2 concentrations by population biomarkers in wastewater can be critical for interpreting the viral loads, comparing epidemiological trends among sewersheds, and identifying vulnerable communities. In this study, five population biomarkers, pepper mild mottle virus (pMMoV), creatinine (CRE), 5-hydroxyindoleacetic acid (5-HIAA), caffeine (CAF) and its metabolite paraxanthine (PARA), were investigated for their utility in normalizing SARS-CoV-2 loads through developed direct and indirect approaches. Their utility in assessing the real-time population contributing to the wastewater was also evaluated. The best-performing candidate was further tested for its capacity to improve the correlation between normalized SARS-CoV-2 loads and the clinical cases reported in the City of Columbia, Missouri, a university town with a constantly fluctuating population. Our results showed that, except for CRE, the direct and indirect normalization approaches using biomarkers allow accounting for the changes in wastewater dilution and differences in relative human waste input over time regardless of flow volume and population at any given WWTP. Among the selected biomarkers, PARA is the most reliable population biomarker in determining the SARS-CoV-2 load per capita due to its high accuracy, low variability, and high temporal consistency in reflecting the change in population dynamics and dilution in wastewater. It also demonstrated its excellent utility for real-time assessment of the population contributing to the wastewater. In addition, the viral loads normalized by the PARA-estimated population significantly improved the correlation (rho=0.5878, p<0.05) between SARS-CoV-2 load per capita and case numbers per capita. This chemical biomarker offers an excellent alternative to the currently CDC-recommended pMMoV genetic biomarker to help us understand the size, distribution, and dynamics of local populations for forecasting the prevalence of SARS-CoV-2 within each sewershed.
HIGHLIGHTS
- Paraxanthine (PARA), a metabolite of caffeine, is a more reliable population biomarker in SARS-CoV-2 wastewater-based epidemiology studies than the currently recommended pMMoV genetic marker.
- SARS-CoV-2 load per capita could be directly normalized using regression functions derived from the correlation between paraxanthine and population, without flowrate and population data.
- Normalizing SARS-CoV-2 levels with the chemical marker PARA significantly improved the correlation between viral loads per capita and case numbers per capita.
- The chemical marker PARA demonstrated its excellent utility for real-time assessment of the population contributing to the wastewater. | epidemiology
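A simplified sketch of the indirect, biomarker-based normalization idea described above follows: the contributing population is first estimated from the paraxanthine (PARA) mass load via a previously fitted regression and then used to express SARS-CoV-2 as a load per capita. The regression coefficients, measurements, and function names below are invented placeholders, not values from the study.

```python
def estimate_population(para_ng_per_l, flow_l_per_day, slope=2.1e-7, intercept=0.0):
    """Assumed linear fit: population ~ slope * PARA mass load (ng/day) + intercept."""
    para_load = para_ng_per_l * flow_l_per_day          # ng of PARA entering the plant per day
    return slope * para_load + intercept

def sars_cov2_load_per_capita(virus_gc_per_l, para_ng_per_l, flow_l_per_day):
    population = estimate_population(para_ng_per_l, flow_l_per_day)
    viral_load = virus_gc_per_l * flow_l_per_day         # gene copies per day
    return viral_load / population                        # gene copies per person per day

print(sars_cov2_load_per_capita(virus_gc_per_l=5e4,
                                para_ng_per_l=800,
                                flow_l_per_day=4.5e7))
```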
10.1101/2022.03.14.22272344 | Neonatal AKI profile using KDIGO guidelines: A cohort study in tertiary care hospital ICU of Lahore, Pakistan | BackgroundAKI is witnessed in sick neonates and is associated with poor outcomes. Our cohort represents the profile of neonates who were diagnosed with AKI using KDIGO guidelines during intensive care unit stay.
MethodologyA cohort study was conducted in the NICU of FMH from June 2019 to May 2021. Data were collected on a standardized proforma. Serum creatinine was measured within 24 hours after enrollment in the study by cytometric analysis using the Roche c311 machine and subsequently after 24 to 48 hours. Data analysis was done using SPSS v 20.0. All continuous variables were not normally distributed and were expressed as the median and interquartile range (IQR). Categorical variables were analyzed by proportional differences with either the Pearson chi-square test or Fisher's exact test. A multinomial logistic regression model was used to explore the independent risk factors of AKI. Time to the event (death) and survival curves for the cohort were plotted using the Cox proportional hazards model.
ResultsAKI occurred in 473 (37.6%) of neonates, and 15.7%, 16.3% and 5.6% had stage 1, 2 and 3, respectively. Outborn birth (p 0.000, AOR 3.987, 95%CI 2.564 - 6.200), birth asphyxia (p 0.000, AOR 3.567, 95%CI 2.093 - 6.080), inotropic agents (p 0.000, AOR 2.060, 95%CI 1.436 - 2.957), antenatal steroids (p 0.002, AOR 1.721, 95%CI 1.213 - 2.443), central lines (p 0.005, AOR 1.630, 95%CI 1.155 - 2.298), IVH/ICH/DIC (p 0.009, AOR 1.580, 95%CI 1.119 - 2.231) and NEC (p 0.054, AOR 1.747, 95%CI 0.990 - 3.083) were independently associated with AKI. Protective factors against neonatal AKI were normal sodium levels, maternal diabetes mellitus, as well as Hb >10.5 g/dl. Duration of stay (7 vs 9 days) and mortality rates (3.9% vs 16.5%) were significantly higher in neonates with AKI (p <0.001).
ConclusionAbout one-third of critically sick neonates had AKI. Significant risk factors for AKI were outborn birth (298%), birth asphyxia (256%), inotropic agents (106%), NEC (74.7%), antenatal steroids (72%), central lines (63%) and IVH/ICH/DIC (58%). AKI prolongs the duration of stay and reduces the survival of sick neonates. | pediatrics
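For orientation, the sketch below encodes the serum-creatinine arm of the neonatal-modified KDIGO staging referred to above. The thresholds follow the commonly cited neonatal modification (rise ≥0.3 mg/dL within 48 h or fold-change relative to the lowest prior value, with stage 3 for a ≥3-fold rise, SCr ≥2.5 mg/dL, or dialysis); they are shown for illustration only and may differ from the study's exact operational definitions.

```python
def kdigo_neonatal_stage(scr_now, scr_baseline, rise_48h, on_dialysis=False):
    """Return AKI stage 0-3 from serum creatinine (mg/dL) alone (illustrative thresholds)."""
    if on_dialysis or scr_now >= 2.5 or scr_now >= 3.0 * scr_baseline:
        return 3
    if scr_now >= 2.0 * scr_baseline:
        return 2
    if rise_48h >= 0.3 or scr_now >= 1.5 * scr_baseline:
        return 1
    return 0

print(kdigo_neonatal_stage(scr_now=1.1, scr_baseline=0.5, rise_48h=0.2))   # -> 2
```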
10.1101/2022.03.14.22272345 | Health behaviours and their determinants among immigrants residing in Italy | BackgroundThe mechanisms that influence the uptake of risky behaviours among immigrants are influenced by the interrelation between characteristics operating in different phases of their migratory experience. Characterizing their behavioural risk profile is needed to prioritize actions for prevention and health services organization. We therefore analysed health behaviours and their determinants among immigrants in Italy, jointly accounting for sociodemographic factors, migration pathways and integration indicators.
MethodsData come from a national survey conducted in 2011-2012 on a sample of about 12000 households with at least one foreigner residing in Italy. The independent impact of a variety of sociodemographic, migratory and integration characteristics on obesity, smoking and daily alcohol consumption was assessed using multivariable Poisson models.
ResultsThe survey involved more than 15,000 first-generation immigrants. Unhealthy lifestyles are more common among men than among women and vary widely by ethnic group. There is a significant impact of employment status and family composition, while the educational level loses importance. Longer duration of residence and younger age at arrival are associated with an increased behavioural risk. Among women we also observed an independent impact of the integration indicators, which was less important for men.
ConclusionsThe profile of the main unhealthy lifestyles among migrants is shaped by cultural, socioeconomic and migratory characteristics, which differ by gender. Understanding these factors can help to design tailored preventive messages, necessary to interrupt the deterioration of migrants' health capital. Low levels of integration have an additional negative impact on health, so inclusion and integration policies should complement health promotion strategies. | public and global health
10.1101/2022.03.15.22272052 | Patterns of rates of mortality in the Clinical Practice Research Datalink | The Clinical Practice Research Datalink (CPRD) is a widely used data resource, representative in demographic profile, with accurate death recordings but it is unclear if mortality rates within CPRD are similar to rates in the general population. Rates may additionally be affected by selection bias caused by the requirement that a cohort have a minimum lookback window, i.e. observation time prior to start of at-risk follow-up.
Standardised Mortality Ratios (SMRs) were calculated incorporating published population reference rates from the Office for National Statistics (ONS), using Poisson regression with rates in CPRD contrasted to ONS rates, stratified by age, calendar year and sex. An overall SMR was estimated along with SMRs presented for cohorts with different lookback windows (1, 2, 5, 10 years). SMRs were stratified by calendar year, length of follow-up and age group. Mortality rates in a random sample of 1 million CPRD GOLD patients were slightly lower than in the national population [SMR=0.980, 95% confidence interval (CI) (0.973, 0.987)]. Cohorts with observational lookback had SMRs below one [1 year of lookback: SMR=0.905 (0.898, 0.912); 2 years: SMR=0.881 (0.874, 0.888); 5 years: SMR=0.849 (0.841, 0.857); 10 years: SMR=0.837 (0.827, 0.847)]. Mortality rates in the first two years after patient entry into CPRD were higher than in the general population, while SMRs dropped below one thereafter.
Mortality rates in CPRD, using simple entry requirements, are similar to rates seen in the English population. The requirement of at least a single year of lookback results in lower mortality rates compared to national estimates. | public and global health |
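As a minimal illustration of the indirect-standardisation logic behind an SMR as used above, the snippet below accumulates expected deaths by applying stratum-specific reference (ONS-style) rates to cohort person-years and divides observed by expected. The strata, rates, and counts are invented, and the paper itself estimated SMRs via Poisson regression rather than this simple ratio.

```python
strata = [
    # (person_years_in_cohort, reference_rate_per_person_year, observed_deaths)
    (120_000, 0.0009,  98),
    ( 95_000, 0.0031, 285),
    ( 40_000, 0.0120, 470),
]

observed = sum(deaths for _, _, deaths in strata)
expected = sum(py * rate for py, rate, _ in strata)   # deaths expected at reference rates
smr = observed / expected
print(f"SMR = {smr:.3f} (observed {observed}, expected {expected:.1f})")
```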
10.1101/2022.03.14.22271847 | Employer Requirements and COVID-19 Vaccination and Attitudes among Healthcare Personnel in the U.S.: Findings from National Immunization Survey Adult COVID Module, August - September 2021 | IntroductionEmployer vaccination requirements have been used to increase vaccination uptake among healthcare personnel (HCP). In summer 2021, HCP were the group most likely to have employer requirements for COVID-19 vaccinations as healthcare facilities led the implementation of such requirements. This study examined the association between employer requirements and HCPs COVID-19 vaccination status and attitudes about the vaccine.
MethodsParticipants were a national representative sample of United States (US) adults who completed the National Immunization Survey Adult COVID Module (NIS-ACM) during August-September 2021. Respondents were asked about COVID-19 vaccination and intent, requirements for vaccination, place of work, attitudes surrounding vaccinations, and sociodemographic variables. This analysis focused on HCP respondents. We first calculated the weighted proportion reporting COVID-19 vaccination for HCP by sociodemographic variables. Then we computed unadjusted and adjusted prevalence ratios for vaccination coverage and key indicators on vaccine attitudes, comparing HCP based on individual self-report of vaccination requirements.
ResultsOf 12,875 HCP respondents, 41.5% reported COVID-19 vaccination employer requirements. Among HCP with vaccination requirements, 90.5% had been vaccinated against COVID-19, as compared to 73.3% of HCP without vaccination requirements--a pattern consistent across sociodemographic groups. Notably, the greatest differences in uptake between HCP with and without employer requirements were seen in sociodemographic subgroups with the lowest vaccination uptake, e.g., HCP aged 18-29 years, HCP with high school or less education, HCP living below poverty, and uninsured HCP. In every sociodemographic subgroup examined, vaccine uptake was more equitable among HCP with vaccination requirements than in HCP without. Finally, HCP with vaccination requirements were also more likely to express confidence in the vaccine's safety (68.3% vs. 60.1%) and importance (89.6% vs. 79.6%).
ConclusionIn a large national US sample, employer requirements were associated with higher and more equitable HCP vaccination uptake across all sociodemographic groups examined. Our findings suggest that employer requirements can contribute to improving COVID-19 vaccination coverage, similar to patterns seen for other vaccines. | public and global health |
10.1101/2022.03.13.22272319 | Short-chain fatty acid concentrations in the incidence and risk-stratification of colorectal cancer: a systematic review and meta-analysis | The beneficial role of gut microbiota and bacterial metabolites, including short-chain fatty acids (SCFAs), is well recognized; although the available literature around their role in colorectal cancer (CRC) has been inconsistent.
We performed a systematic review and meta-analysis to examine associations of fecal SCFA concentrations to the incidence and risk of CRC.
Data extraction through Medline, Embase, and Web of Science was carried out from database inception to May 21, 2021. Predefined criteria included human clinical observational studies, while excluding cell/animal model studies, conference proceedings, and reviews. Quality assessment of the 16 selected case-control and six cross-sectional studies is reported using PRISMA 2020 guidelines. Studies were categorized for CRC risk or incidence, and RevMan 5.4 was used to perform the meta-analyses. Standardized mean differences (SMD) with 95% confidence intervals (CI) were calculated using a random-effects model.
Combined analysis of acetic-, propionic-, and butyric-acid revealed significantly lower concentrations of these SCFAs in individuals with high-risk of CRC (SMD = 2.02, 95% CI 0.31 to 3.74, P = 0.02). Further, CRC incidence increased in individuals with lower levels of SCFAs (SMD = 0.45, 95% CI 0.19 to 0.72, P = 0.0009), compared to healthy individuals.
Overall, lower fecal concentrations of the three major SCFAs are associated with a higher risk and incidence of CRC. | oncology
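A compact sketch of the random-effects pooling step that underlies meta-analytic SMD estimates like those reported above is given below, using the DerSimonian-Laird between-study variance estimator. The per-study SMDs and variances are invented placeholders, not the included studies' data, and RevMan's exact computations may differ in detail.

```python
import math

smd = [1.10, 0.45, 2.30, 0.80]     # per-study standardised mean differences (invented)
var = [0.20, 0.05, 0.60, 0.15]     # their within-study variances (invented)

# Fixed-effect quantities needed for the between-study variance (tau^2)
w_fixed = [1 / v for v in var]
d_fixed = sum(w * d for w, d in zip(w_fixed, smd)) / sum(w_fixed)
q = sum(w * (d - d_fixed) ** 2 for w, d in zip(w_fixed, smd))
c = sum(w_fixed) - sum(w ** 2 for w in w_fixed) / sum(w_fixed)
tau2 = max(0.0, (q - (len(smd) - 1)) / c)

# Random-effects pooled estimate and 95% confidence interval
w_rand = [1 / (v + tau2) for v in var]
pooled = sum(w * d for w, d in zip(w_rand, smd)) / sum(w_rand)
se = math.sqrt(1 / sum(w_rand))
print(f"pooled SMD = {pooled:.2f} "
      f"(95% CI {pooled - 1.96 * se:.2f} to {pooled + 1.96 * se:.2f})")
```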
10.1101/2022.03.14.22272054 | Identification of probiotic responders in cross-over trials using the Bayesian statistical model considering lags of effect period | Recent advances in microbiome research have led to the further development of microbial interventions, such as probiotics and prebiotics, which are potential treatments for constipation. However, the effects of probiotics vary from person to person; therefore, the effectiveness of probiotics needs to be verified for each individual. Individuals showing significant effects of the target probiotic are called responders. A statistical model for the evaluation of responders was proposed in a previous study. However, the previous model does not consider the lag in the effect of the probiotic. It is expected that there are lags between the period of time when probiotics are administered and when they are effective. In this study, we propose a Bayesian statistical model to estimate the probability that a subject is a responder, by considering the lag of the effect period. In synthetic dataset experiments, the proposed model was found to outperform the base model, which did not factor in the lag. Further, we found that the proposed model could distinguish responders showing large uncertainty in terms of the lag of the effect period relative to the intake period. | health informatics
10.1101/2022.03.14.22272325 | Safety and immunogenicity of SARS-CoV-2 vaccine MVC-COV1901 in adolescents in Taiwan: A double-blind, randomized, placebo-controlled phase 2 trial | BackgroundMVC-COV1901 is a subunit SARS-CoV-2 vaccine based on the prefusion spike protein S-2P and adjuvanted with CpG 1018 and aluminum hydroxide. Although MVC-COV1901 has been licensed for emergency use for adults in Taiwan, the safety and immunogenicity of MVC-COV1901 in adolescents remained unknown. As young people play an important role in SARS-CoV-2 transmission and epidemiology, a vaccine approved for adolescents and eventually, children, will be important in mitigating the COVID-19 pandemic.
MethodsThis study is a prospective, double-blind, multi-center phase 2 trial evaluating the safety, tolerability and immunogenicity of two doses of the SARS-CoV-2 vaccine MVC-COV1901 in adolescents. Healthy adolescents aged 12 to 17 years were recruited and randomly assigned (6:1) to receive two intramuscular doses of either MVC-COV1901 or placebo 28 days apart. The primary outcomes were safety and immunogenicity from the day of first vaccination (Day 1) to 28 days after the second vaccination (Day 57), and immunogenicity of MVC-COV1901 in adolescents as compared to young adult vaccinees in terms of neutralizing antibody titers and seroconversion rate. The secondary outcomes were safety and immunogenicity of MVC-COV1901 as compared to placebo in adolescents in terms of immunoglobulin titers and neutralizing antibody titers over the study period.
ResultsBetween July 21, 2021 and December 22, 2021, a total of 399 adolescent participants were included for safety evaluation after enrollment to receive at least one dose of either MVC-COV1901 (N=341) or placebo (N=58). Of these, 334 and 46 participants went on to receive two doses of either MVC-COV1901 or placebo, respectively, and were included in the per protocol set (PPS) for immunogenicity analysis. Adverse events were mostly mild and were similar in MVC-COV1901 and placebo groups. The most commonly reported adverse events were pain/tenderness and malaise/fatigue. All immunogenicity endpoints in the adolescent group were non-inferior to the endpoints seen in the young adult and placebo groups.
ConclusionsThe safety and immunogenicity data presented here showed that MVC-COV1901 has a similar safety profile and non-inferior immunogenicity in adolescents compared to young adults.
ClinicalTrials.gov registrationNCT04951388. | infectious diseases |
10.1101/2022.03.15.22272384 | Deep kinetoplast genome analyses result in a novel molecular assay for detecting Trypanosoma brucei gambiense-specific minicircles | The World Health Organization targeted Trypanosoma brucei gambiense (Tbg) human African trypanosomiasis for elimination of transmission by 2030. Sensitive molecular markers that specifically detect Tbg type 1 (Tbg1) parasites will be important tools to assist in reaching this goal. Here, we aim at improving molecular diagnosis of Tbg1 infections by targeting the abundant mitochondrial minicircles within the kinetoplast of Trypanosoma brucei parasites. Using Next-Generation Sequencing of total cellular DNA extracts, we assembled and annotated the kinetoplast genome and investigated minicircle sequence diversity in 38 animal- and human-infective trypanosome strains. Computational analyses recognized a total of 241 Minicircle Sequence Classes as Tbg1-specific, of which three were shared by the 18 studied Tbg1 strains. We then developed a novel multiplex quantitative PCR assay (g-qPCR3) targeting one Tbg1-specific minicircle and three Tbg1-specific or Trypanozoon-specific markers. Molecular analyses revealed that the minicircle-based assay is applicable on animals and is as specific as the TgsGP-based assay, the current golden standard for molecular detection of Tbg1. The median copy number of the targeted minicircle was equal to eight, suggesting that our minicircle-based assay may be used for the sensitive detection of Tbg1 parasites. Finally, annotation of the targeted minicircle sequence indicated that it encodes genes essential for the survival of the parasite, and will thus likely be preserved in natural Tbg1 populations. These results demonstrated that our minicircle-based assay is a promising new tool for reliable and sensitive detection of Tbg1 infections in humans and animals. | infectious diseases |
10.1101/2022.03.15.22272371 | Neutralizing responses in fully vaccinated with BNT162b2, CoronaVac, ChAdOx1, and Ad26.COV2.S against SARS-CoV-2 lineages in Colombia, 2020-2021 | BackgroundBy March 2022, around 34 million people in Colombia had received a complete scheme of vaccines against the severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) including, mRNA-based vaccines, viral vectored coronavirus vaccines, or the inactivated whole virus vaccine. However, as several SARS-CoV-2 variants of concern (VOC) and interest (VOI) co-circulate in the country, determining the resistance level to vaccine-elicited neutralizing antibodies (nAbs) is useful to improve the efficacy of COVID-19 vaccination programs.
MethodsMicroneutralization assays with the most prevalent SARS-CoV-2 lineages in Colombia during 2020-2021 were performed using serum samples from immunologically naive individuals between 9 and 13 weeks after receiving complete regimens of CoronaVac, BNT162b2, ChAdOx1, or Ad26.COV2.S. The mean neutralization titer (MN50) was calculated by the Reed-Muench method and used to determine differences in vaccine-elicited nAbs against the SARS-CoV-2 lineages B.1.111, P.1 (Gamma), B.1.621 (Mu), and AY.25.1 (Delta).
ResultsThe most administered vaccines in the country, BNT162b2 and CoronaVac, elicited significantly different nAb responses against Mu, as the GMTs were 75.7- and 5.9-fold lower relative to the control lineage (B.1.111), while for Delta they were 15.8- and 1.1-fold lower, respectively. In contrast, nAb responses against Mu and Delta were comparable between ChAdOx1-S and Ad26.COV2.S, as the GMTs remained around 5- to 7-fold lower relative to B.1.111.
ConclusionsThe emergence of SARS-CoV-2 variants in Colombia with a significant capacity to escape from vaccine-elicited nAbs indicates that a booster dose is highly recommended. Furthermore, other non-pharmacological measures should be retained in the vaccinated population. | infectious diseases |
10.1101/2022.03.14.22272391 | Integrated Bile Acid Profile Analysis for The Non-invasive Staging Diagnosis of Ulcerative Colitis | Clinical staging diagnosis and progression tracking for ulcerative colitis (UC) are challenging owing to poor patient compliance with endoscopic biopsy, so we aimed to explore a non-invasive integrative biochemical index to quantitatively track and monitor pathological activity. Here we perform a study that integrates bile acid metabolomic profiling, metagenomic sequencing and clinical monitoring of serum and fecal samples from 24 active-state UC patients, 25 remission-state UC patients and 20 healthy volunteers from China. Besides known associations of Fusobacterium nucleatum and Peptostreptococcus stomatis with UC, we found that several bile acid-transforming species, including 7α-dehydroxygenase- and 7α/β-dehydrogenase-expressing microbiota, were significantly correlated with UC pathological activity. We identified 7 microbial gene markers that differentiated active-stage UC, remission-stage UC and healthy control microbiomes. Relevantly, a decreased serum deoxycholic acid/cholic acid ratio and an increased fecal ursodeoxycholic acid/chenodeoxycholic acid ratio were associated with pathological activity of UC. Moreover, receiver operating characteristic analysis based on serum/fecal bile acid ratios was highly accurate in predicting active and remission stage outcomes. This species-specific temporal change and bile acid dysregulation pattern linked to disease severity indicate that an integrated microbiome-bile acid profile may be applied for disease activity prediction, and that targeting the microbiome to restore gut flora and bile acid homeostasis may be indicative of therapy efficacy. Collectively, these insights will help improve clinical diagnosis and optimize existing medical treatments. | gastroenterology
10.1101/2022.03.14.22271975 | METHYLATION BIOMARKERS ASSOCIATED WITH DRUG-RESISTANT EPILEPSY | Epilepsy is a disabling neurological disease that affects 2% of the population. Drug-resistant epilepsy (DRE) affects 25-30% of epilepsy patients. Understanding its underlying mechanisms is key to adequately manage this condition. To analyze the main epigenetic marks of DRE, an epigenome-wide association study was carried out including samples from different regions of DRE patients' brains and peripheral blood. An Illumina Infinium MethylationEPIC BeadChip array including cortex, hippocampus, amygdala, and peripheral blood from DRE patients subjected to neurosurgical resection of the epileptogenic zone was used. Overall, 32, 59, 3210, and 6 differentially methylated probes (DMPs) associated with DRE were found in the hippocampus, amygdala, cortex, and peripheral blood, respectively. These DMPs harbored 19, 28, 1574, and 7 genes, respectively, which play different roles in processes such as neurotrophic or calcium signaling. Three of the top DMPs observed in cortex were validated with methylation-specific qPCR. Moreover, 163 DMPs associated with neurosurgery response at 6 months were found in the hippocampus. Genes located on these DMPs were involved in diverse processes such as synaptic signaling and central nervous system development. In addition, 3 DMPs in blood samples were associated with response to neurosurgery at 12 months. In conclusion, the present study reports genome-wide DNA methylation changes across different regions of the DRE brain. These changes could be useful for further studies to disentangle the bases of DRE to search for therapeutic alternatives for this disease. Furthermore, they could also help identify patients likely to respond to neurosurgery. | neurology
10.1101/2022.03.14.22270915 | Combined administration of inhaled DNase, baricitinib and tocilizumab as rescue treatment in COVID-19 patients with severe respiratory failure | COVID-19-related severe respiratory failure (SRF) leads to mechanical ventilation, substantially increasing in-hospital mortality. Abundance of lung fibroblasts (LFs) in injured lung tissue has been associated with the progression of respiratory failure in COVID-19. Aiming to reduce mortality in patients with SRF (PaO2/FiO2<100 mmHg) and considering the multi-mechanistic nature of severe COVID-19 pathogenesis, we applied a combined rescue treatment (COMBI) on top of standard-of-care (SOC: dexamethasone and heparin) comprising inhaled DNase to dissolve thrombogenic neutrophil extracellular traps, plus agents against cytokine-mediated hyperinflammation, such as the anti-IL-6 receptor tocilizumab and the selective JAK1/2 inhibitor baricitinib. COMBI (n=22) was compared with SOC (n=26), and with two previously and consecutively used therapeutic approaches, including either the IL-1 receptor antagonist anakinra (ANA, n=19), or tocilizumab (TOCI, n=11), on top of SOC. In parallel, immunothrombosis was assessed in vitro in human LFs treated with the applied therapeutic agents upon stimulation with COVID-19 plasma. COMBI was associated with lower in-hospital mortality (p=0.014) and intubation rate (p=0.013), shorter duration of hospitalization (p=0.019), and prolonged overall survival after a median follow-up of 110 ± 4 days (p=0.003). In vitro, COVID-19 plasma markedly induced the tissue factor/thrombin pathway in LFs, while this effect was inhibited by the immunomodulatory agents of COMBI, providing a mechanistic explanation for the clinical observations. These results suggest the design of randomized trials using combined immunomodulatory therapies in COVID-19-associated SRF targeting multiple interconnected pathways of immunothrombosis. | infectious diseases
10.1101/2022.03.14.22272363 | Optical Genome Mapping: Clinical Validation and Diagnostic Utility for Enhanced Cytogenomic Analysis of Hematological Neoplasms | Hematological neoplasms are predominantly defined by chromosomal aberrations that include structural variations (SVs) and copy number variations (CNVs). The current standard-of-care (SOC) genetic testing for the detection of SVs and CNVs relies on a combination of traditional cytogenetic techniques that include karyotyping, fluorescence in situ hybridization (FISH), and chromosomal microarrays (CMA). These techniques are labor-intensive, time and cost-prohibitive, and often do not reveal the genetic complexity of the tumor. Optical genome mapping (OGM) is an emerging technology that can detect all classes of SVs in a single assay. We report the results from our clinical validation (in a CLIA setting) of the OGM technique for hematological neoplasms. The study included 92 sample runs (including replicates) using 69 well-characterized unique samples (59 hematological neoplasms and 10 controls). The technical (QC metrics and first-pass rate) and analytical performance [sensitivity, specificity, accuracy, positive predictive value (PPV), and negative predictive value (NPV)] were evaluated using the clinical samples. The reproducibility was evaluated by performing inter-run, intra-run, and inter-instrument comparisons using six samples run in triplicates. The limit of detection (LoD) for aneuploidy, translocation, interstitial deletion, and duplication was assessed. To confirm the LoD, samples at 12.5%, 10%, and 5% allele fractions (theoretical LoD range) were run in triplicates. The technical performance resulted in a 100% first-pass rate with all samples meeting the minimum QC metrics. The analytical performance showed a sensitivity of 98.7%, specificity of 100%, accuracy of 99.2%, PPV of 100%, and NPV of 98%, which included the detection of 61 aneuploidies, 34 deletions, 28 translocations, 11 duplications/amplifications, 15 insertions/additional material not identified with karyotyping, 12 marker chromosomes, and one each of ring chromosome, inversion and isochromosome. OGM demonstrated robust technical and analytical inter-run, intra-run, and inter-instrument reproducibility. The LoD was determined to be at 5% allele fraction for all the variant classes evaluated in the study. In addition, OGM demonstrated higher resolution to refine breakpoints, identify the additional genomic material, marker, and ring chromosomes. OGM identified several additional SVs, revealing the genomic architecture in these neoplasms that provides an opportunity for better tumor classification, prognostication, risk stratification, and therapy selection. This study is the first CLIA validation report for OGM for genome-wide structural variation detection in hematological neoplasms. Considering the technical and analytical advantages of OGM compared to the current SOC methods used for chromosomal characterization, we highly recommend OGM as a potential first-tier cytogenetic test for the evaluation of hematological neoplasms. | genetic and genomic medicine |
10.1101/2022.03.14.22272330 | Estimating the burden of malaria and soil-transmitted helminth co-infection in sub-Saharan Africa: a geospatial study | BackgroundLimited understanding exists about the interactions between malaria and soil-transmitted helminths (STH), their potential geographical overlap and the factors driving it. This study characterised the geographical and co-clustered distribution patterns of malaria and STH infections among vulnerable populations in sub-Saharan Africa (SSA).
Methodology/Principal findingsWe obtained continuous estimates of malaria prevalence from the Malaria Atlas Project and STH prevalence surveys from the WHO-driven Expanded Special Project for the Elimination of NTDs (ESPEN) covering 2000-2018 and used spatial autocorrelation methods to identify statistically significant clusters for both diseases across SSA. We used the inverse distance weighted kriging (interpolation) methods to estimate STH prevalence. We calculated the population-weighted prevalence of malaria and STH co-infection, and used the bivariate local indicator of spatial association (LISA) analysis to explore potential co-clustering of both diseases at the implementation unit level (a short code sketch of this clustering step follows this record).
Our analysis shows spatial variations in the estimates of the prevalence of Plasmodium falciparum-STH co-infections and identifies hotspots across many countries in SSA, with inter- and intra-country variations. High P. falciparum and high hookworm co-infections were more prevalent in West and Central Africa, whereas co-infections of high P. falciparum with high Ascaris lumbricoides and of high P. falciparum with high Trichuris trichiura were more predominant in Central Africa, compared to other sub-regions in SSA.
Conclusions/SignificanceWide spatial heterogeneity exists in the prevalence of malaria and STH co-infection within the regions and within countries in SSA. The geographical overlap and spatial co-existence of malaria and STH could be exploited to achieve effective control and elimination agendas through the integration of the vertical programmes designed for malaria and STH into a more comprehensive and sustainable community-based paradigm.
Author SummaryMalaria and worms frequently co-exist together among children living in the poorest countries of the world, but little is known about the specific locations of the combined infections involving the two major parasitic diseases and how they interact and change over the years.
We used open access data collected by two public registries, that is, the Malaria Atlas Project and the Expanded Special Project for the Elimination of NTDs, to understand the overlap of the two diseases in different parts of Africa, where their burden is greatest.
We found significant differences in the distributions of the combined diseases across different parts of Africa, with large concentrations identified in Central and West Africa. For example, double infections with malaria and hookworm were more common in West and Central Africa, whereas malaria and roundworm, and malaria and whipworm, were predominantly found in Central Africa. Large clusters of the dual infections were also found in some localities within countries that appeared to have a low burden of the two diseases.
These findings provide useful insight into the areas that could be serving as reservoirs propagating the transmission of the two diseases. The results of this study could also be used to develop and implement integrated control programmes for malaria and worms, and this could help to achieve the WHO NTD roadmap goal of ending the neglect to attain the Sustainable Development Goals by 2030. | public and global health
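The bivariate LISA step described in the record above can be illustrated with a short, hedged sketch. This is a generic illustration rather than the authors' pipeline: the shapefile path, the prevalence column names ('pf_prev', 'hookworm_prev') and the queen-contiguity weights are assumptions made for the example.

```python
# Illustrative only: bivariate local Moran (LISA) co-clustering of two prevalence
# surfaces at the implementation-unit level. The shapefile path and the column
# names 'pf_prev' and 'hookworm_prev' are invented placeholders.
import geopandas as gpd
from libpysal.weights import Queen
from esda.moran import Moran_Local_BV

units = gpd.read_file("implementation_units.shp")   # polygons with prevalence columns
w = Queen.from_dataframe(units)                      # contiguity-based spatial weights
w.transform = "r"                                    # row-standardise the weights

lisa = Moran_Local_BV(units["pf_prev"].values,       # P. falciparum prevalence
                      units["hookworm_prev"].values, # hookworm prevalence
                      w, permutations=999)

# Quadrant 1 = high-high; keep units that are significant at alpha = 0.05
units["high_high"] = (lisa.q == 1) & (lisa.p_sim < 0.05)
print(units["high_high"].sum(), "units fall in significant high-high co-clusters")
```

Units flagged as significant high-high pairs correspond to the kind of malaria-STH co-clusters the study maps.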
10.1101/2022.03.14.22271768 | Withdrawal of Assisted Ventilation at the Patient's Request in MND: A Retrospective Exploration of the Ethical and Legal Issues Concerning Relatives, Nurses and Allied Health Care Professionals | BackgroundThere is little literature focusing on the issues relatives and health professionals encounter when withdrawing assisted ventilation at the request of a patient with MND/ALS.
AimTo explore with relatives, nurses and allied health professionals the ethical and legal issues that they had encountered in the withdrawal of ventilation at the request of a patient with MND/ALS.
MethodA retrospective qualitative interview study with 16 family members and 26 professionals. Data was analysed thematically and compared with results from a previous study with doctors.
ResultsThe events surrounding ventilation withdrawal were extraordinarily memorable for both HCPs and family members, with clear recall of explicit details, even from years previously. The events had had a profound and lasting effect due to the emotional intensity of the experiences. Withdrawal of ventilation posed legal, ethical and moral challenges for relatives and health care professionals. Relatives looked to health care professionals for knowledge, guidance and reassurance on these issues, worried about how the withdrawal would be perceived by others, and found professional ignorance and disagreement distressing. Many health care professionals lacked theoretical knowledge and confidence on the legal and ethical considerations of withdrawal and struggled morally knowing the outcome of the withdrawal would be death. Health care professionals also worried about the perception of others of their involvement, which in turn influenced their practice. There was a lack of consistency in understanding across professions, and professionals often felt uncomfortable and anxious.
ConclusionsLegal, ethical and practical guidance is needed and open discussion of the ethical challenges as well as education and support for health care professionals and relatives would improve the experience of all involved. | palliative medicine |
10.1101/2022.03.15.22272428 | Predicting refugee flows from Ukraine with an approach to Big (Crisis) Data: a new opportunity for refugee and humanitarian studies | BackgroundThis paper shows that Big Data and the so-called tools of digital demography, such as Google Trends (GT) and insights from social networks such as Instagram, Twitter and Facebook, can be useful for determining, estimating, and predicting the forced migration flows to the EU caused by the war in Ukraine.
ObjectiveThe objective of this study was to test the usefulness of Google Trends indexes to predict further forced migration from Ukraine to the EU (mainly to Germany) and gain demographic insights from social networks into the age and gender structure of refugees.
MethodsThe primary methodological concept of our approach is to monitor the digital trace of Internet searches in Ukrainian, Russian and English with the Google Trends analytical tool (trends.google.com). Initially, keywords were chosen that are most predictive, specific, and common enough to predict the forced migration from Ukraine. We requested the data before and during the war outbreak and divided the keyword frequency for each migration-related query to standardise the data. We compared this search frequency index with official statistics from UNHCR to assess the significance of the results and correlations and to test the model's predictive potential. Since UNHCR does not yet have complete data on the demographic structure of refugees, to fill this gap, we used three other alternative Big Data sources: Facebook, Twitter and Instagram.
ResultsAll tested migration-related search queries about emigration planning from Ukraine show a positive linear association between the Google index and data from official UNHCR statistics; R2 = 0.1211 for searches in Russian and R2 = 0.1831 for searches in Ukrainian. We noticed that Ukrainians use the Russian language more often than Ukrainian to search for these terms. Increases in migration-related search activity in Ukraine, such as "граница" (Rus. border), "кордону" (Ukr. border); "Польща" (Poland); "Германия" (Rus. Germany), "Німеччина" (Ukr. Germany) and "Угорщина" and "Венгрия" (Hungary), correlate strongly with official UNHCR data for externally displaced persons from Ukraine. All three languages show that interest in Poland is the highest. When refugees arrive in nearby countries, the search for terms related to Germany, such as "crossing the border + Germany", etc., proliferates. This result confirms our hypothesis that one-third of all refugees will cross into Germany. According to Big Data insights, the estimated total number of expected refugees is 5.4 million. The age group most represented is between 24 and 45 years (data for children are unavailable), and over 65% are women. (A toy regression sketch relating a search index to refugee counts follows this record.)
ConclusionThe increase in migration-related search queries is correlated with the rise in the number of refugees from Ukraine in the EU. Thus this method allows reliable forecasts. Understanding the consequences of forced migration from Ukraine is crucial to enabling UNHCR and governments to develop optimal humanitarian strategies and prepare for refugee reception and possible integration. The benefit of this method is reliable estimates and forecasting that can allow governments and UNHCR to prepare and better respond to the recent humanitarian crisis. | occupational and environmental health |
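As a rough illustration of the kind of linear association reported above between a Google Trends index and UNHCR refugee counts, the sketch below fits a simple regression and reports R². The weekly values are invented placeholders, not the study's data, and the real analysis standardises query frequencies before comparison.

```python
# Illustrative only: a simple linear fit between a standardised Google Trends
# index for one migration-related query and UNHCR refugee counts. The weekly
# values below are invented placeholders, not the study's data.
import numpy as np
from scipy import stats

gt_index = np.array([12, 35, 60, 88, 100, 74, 55])                # search index (0-100)
refugees = np.array([0.1, 0.4, 0.9, 1.6, 2.3, 2.9, 3.4]) * 1e6    # cumulative UNHCR counts

fit = stats.linregress(gt_index, refugees)
print(f"R^2 = {fit.rvalue**2:.3f}, slope = {fit.slope:.0f} refugees per index point")
```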
10.1101/2022.03.14.22272396 | Pan-cancer Analyses reveal functional similarities of three lncRNAs across multiple tumors | Long non-coding RNAs (lncRNAs) are emerging as key regulators in many biological processes. The dysregulation of lncRNA expression has been associated with many diseases, including cancer. Mounting evidence suggests that lncRNAs are involved in cancer initiation, progression, and metastasis. Thus, understanding the functional implications of lncRNAs in tumorigenesis can aid in developing novel biomarkers and therapeutic targets. Rich cancer datasets documenting genomic and transcriptomic alterations, together with advances in bioinformatics tools, have presented an opportunity to perform pan-cancer analyses across different cancer types. This study is aimed at conducting a pan-cancer analysis of lncRNAs by performing differential expression and functional analyses between tumor and normal adjacent samples across eight cancer types. Among dysregulated lncRNAs, seven were shared across all cancer types. We focused on three lncRNAs, found to be consistently dysregulated among tumors. We observed that these three lncRNAs of interest interact with a wide range of genes across different tissues, yet enrich substantially similar biological processes implicated in cancer progression and proliferation. | genetic and genomic medicine
10.1101/2022.03.15.22272404 | HDMAX2: A framework for High Dimensional Mediation Analysis with application to maternal smoking, placental DNA methylation and birth outcomes | Most high dimensional mediation epigenetic studies evaluate each mediator's indirect effect independently. These approaches are underpowered to detect multiple mediators and the overall indirect effect remains poorly quantified. We developed HDMAX2, an efficient algorithm for high dimensional mediation analysis using max-squared tests, considering CpGs and aggregated mediator regions. HDMAX2 outperformed existing approaches in simulated and real data, detected true mediators with increased power, and assessed the overall indirect effect of several mediators from a high-dimensional matrix. We showed a polygenic architecture for placenta CpGs and regions mediating the effects of maternal smoking during pregnancy on baby's birth weight. | genetic and genomic medicine
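If we read the abstract above correctly, the max-squared test combines, for each candidate mediator, the p-value of the exposure-to-mediator association with that of the mediator-to-outcome association by squaring their maximum. The sketch below illustrates that combination under this assumption, with invented p-values.

```python
# Illustrative only: the max-squared combination for mediation screening.
# For each candidate mediator, combine the p-value for exposure -> mediator
# with the p-value for mediator -> outcome as max(p1, p2)**2.
# The p-values below are invented placeholders.
import numpy as np

p_exposure_to_cpg = np.array([1e-6, 0.20, 3e-4, 0.60])   # e.g. smoking -> methylation
p_cpg_to_outcome  = np.array([2e-5, 1e-4, 5e-3, 0.01])   # e.g. methylation -> birth weight

p_max2 = np.maximum(p_exposure_to_cpg, p_cpg_to_outcome) ** 2
print(p_max2)   # small only when BOTH marginal association tests are small
```

The squared maximum is small only when both marginal tests are small, which is the intuition behind screening for true mediators.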
10.1101/2022.03.15.22272407 | Sub-cellular level resolution of common genetic variation in the photoreceptor layer identifies continuum between rare disease and common variation | Photoreceptor cells (PRCs) are the light-detecting cells of the retina. Such cells can be non-invasively imaged using optical coherence tomography (OCT), which is used in clinical settings to diagnose and monitor ocular diseases. Here we present the largest genome-wide association study of PRC morphology to date utilising quantitative phenotypes extracted from OCT images within the UK Biobank. We discovered 111 loci associated with the thickness of one or more of the PRC layers, many of which had prior associations to ocular phenotypes and pathologies. We further identified 10 genes associated with PRC thickness through gene burden testing using exome data. In both cases there was a significant enrichment for genes involved in rare eye pathologies, in particular retinitis pigmentosa. There was evidence for an interaction effect between common genetic variants in VSX2, involved in eye development, and PRPH2, known to be involved in retinal dystrophies. We further identified a number of genetic variants with a differential effect across the macular spatial field. Our results suggest a continuum between normal and rare variation which impacts retinal structure, sometimes leading to disease. | genetic and genomic medicine
10.1101/2022.03.15.22271876 | Contribution of Mendelian disorders in a population-based pediatric neurodegeneration cohort | ObjectivesTo evaluate Mendelian causes of neurodegenerative disorders in a cohort of pediatric patients as pediatric neurodegenerative disorders are a rare, diverse group of diseases. As molecular testing has advanced, many children can be diagnosed, but the relative contribution of various disorders is unclear.
Study DesignPatients enrolled in the Center for Applied Genomics (CAG) Biobank at the Children's Hospital of Philadelphia with neurodegenerative symptoms were identified using an algorithm based on selected ICD9 and ICD10 inclusion and exclusion codes. A manual chart review was then performed to abstract detailed clinical information.
ResultsOut of approximately 100,000 patients enrolled in the CAG Biobank, 76 had a neurodegenerative phenotype. Following chart review, 7 patients were excluded. Of the remaining 69 patients, 42 had a genetic diagnosis (60.9%) and 27 were undiagnosed (39.1%). There were 32 unique disorders. Common diagnoses included Rett syndrome, mitochondrial disorders and neuronal ceroid lipofuscinoses.
ConclusionsThe disorders encountered in our cohort demonstrate the diverse diseases and pathophysiology that contribute to pediatric neurodegeneration. Establishing a diagnosis often informed clinical management, although curative treatment options are lacking. Many patients who underwent genetic evaluation remained undiagnosed, highlighting the importance of continued research efforts in this field. | genetic and genomic medicine |
10.1101/2022.03.15.22272399 | Plasma alkaline phosphatase is associated to mortality risk in aortic valve stenosis patients. | BackgroundAortic valve stenosis is an important clinical condition, with a significant mortality rate in the elderly. Plasma values of alkaline phosphatase (ALP) have been shown to act as a marker of prognosis in different clinical conditions and in the general population.
MethodsPlasma levels of alkaline phosphatase were studied in a cohort of patients with aortic valve stenosis, and a five-year survival evaluation was carried out.
ResultsTwenty-four patients were studied, of whom 12 had died at 5-year follow-up. The median age at baseline evaluation was 79 years (interquartile range, 72-85 years), and 11 patients were female (13 were male). The median ALP value of 83 IU/L was used to separate patients into two groups; 2 patients died in the low-ALP group versus 10 in the high-ALP group. Using the same ALP cut-off, Kaplan-Meier analysis with a log-rank test showed a significance level <0.01. Cox regression analysis showed an overall significant result, with a significant effect of plasma ALP (significance level 0.03) but not of age, sex or transvalvular gradient (assessed by echocardiography). (A toy sketch of this survival comparison follows this record.)
ConclusionsElevated plasma ALP is associated with increased mortality risk in aortic valve stenosis patients. This finding may merit evaluation in studies with a larger number of patients. | cardiovascular medicine
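A toy sketch of the survival comparison reported above, splitting patients at the median ALP and applying a log-rank test with the lifelines package. The eight patients below are fabricated for illustration and do not reproduce the study's data or its full Cox model.

```python
# Illustrative only: Kaplan-Meier estimation and a log-rank test for patients
# split at the median ALP (83 IU/L). The eight patients below are fabricated.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

df = pd.DataFrame({
    "months":   [60, 12, 58, 30, 55, 8, 45, 22],   # follow-up time
    "died":     [0, 1, 0, 1, 1, 1, 0, 0],          # event indicator
    "high_alp": [0, 1, 0, 1, 0, 1, 0, 1],          # ALP above the 83 IU/L median
})
low, high = df[df.high_alp == 0], df[df.high_alp == 1]

km = KaplanMeierFitter().fit(high.months, high.died, label="ALP > 83 IU/L")
print(km.survival_function_)

result = logrank_test(low.months, high.months,
                      event_observed_A=low.died, event_observed_B=high.died)
print("log-rank p-value:", round(result.p_value, 3))
```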
10.1101/2022.03.15.22272417 | IgG N-glycans are associated with prevalent and incident complications of type 2 diabetes | Aims/hypothesisInflammation is important in development of type 2 diabetes complications. The N-glycosylation of IgG influences its role in inflammation. Until now, the association of IgG N-glycosylation with type 2 diabetes complications has not been extensively investigated. We hypothesized that N-glycosylation of IgG may be related to development of complications of type 2 diabetes.
MethodsIn three independent type 2 diabetes cohorts, IgG N-glycosylation was measured by UPLC (DiaGene n=1815, GenodiabMar n=640) and mass spectrometry (DCS n=1266). We investigated the associations of IgG N-glycosylation (fucosylation, galactosylation, sialylation and bisection) with incident and prevalent nephropathy, retinopathy and macrovascular disease using Cox- and logistic regression, followed by meta-analyses. The models were adjusted for age, sex and additionally for clinical risk factors.
ResultsIgG galactosylation was negatively associated with prevalent and incident nephropathy after adjustment for clinical risk factors. Sialylation was negatively associated with incident diabetic nephropathy. For retinopathy, similar associations were found for galactosylation in the basic model. For macrovascular complications, negative associations with galactosylation and sialylation were confined to the cross-sectional analyses.
ConclusionsWe showed that IgG N-glycosylation traits are associated with higher prevalence and future development of nephropathy, after correction for clinical risk factors. For other complications, IgG N-glycosylation was associated with their prevalence only, possibly reflecting ongoing vascular inflammation. These findings indicate the predictive potential of IgG N-glycosylation in nephropathy. | cardiovascular medicine |
10.1101/2022.03.15.22272426 | SARS-CoV-2 infection during the Omicron surge among patients receiving dialysis: the role of circulating receptor-binding domain antibodies and vaccine doses | BackgroundIt is unclear whether a third dose of mRNA platform vaccines, or antibody response to prior infection or vaccination confer protection from the Omicron variant among patients receiving dialysis.
MethodsMonthly since February 2021, we tested plasma from 4,697 patients receiving dialysis for antibodies to the receptor-binding domain (RBD) of SARS-CoV-2. We assessed semiquantitative median IgG index values over time among patients vaccinated with at least one dose of the two mRNA vaccines. We ascertained documented COVID-19 diagnoses after December 25, 2021 and up to January 31, 2022. We estimated the relative risk for documented SARS-CoV-2 infection by vaccination status using a log-binomial model accounting for age, sex, and prior clinical COVID-19. Among patients with RBD IgG index value available during December 1-December 24, 2021, we also evaluated the association between the circulating RBD IgG titer and risk for Omicron variant SARS-CoV-2 infection.
ResultsOf the 4,697 patients we followed with monthly RBD assays, 3576 are included in the main analysis cohort; among these, 852 (24%) were unvaccinated. Antibody response to third doses was robust (median peak index IgG value at assay limit of 150, equivalent to 3270 binding antibody units/mL). Between December 25-January 31, 2022, SARS-CoV-2 infection was documented in 340 patients (7%), 115 (36%) of whom were hospitalized. The final doses of vaccines were given a median of 272 (25th, 75th percentile, 245-303) days and 58 (25th, 75th percentile, 51-95) days prior to infection for the 1-2 dose and 3 dose vaccine groups, respectively. Relative risks for infection were higher among patients without vaccination (RR 2.1 [95%CI 1.6, 2.8]), and patients with 1-2 doses (RR 1.3 [95%CI 1.0, 1.8]), compared with patients with three doses of the mRNA vaccines. Relative risks for infection were higher among patients with RBD index values < 23 (506 BAU/mL), compared with RBD index values ≥ 23 (RR 2.4 [95%CI 1.9, 3.0]). The higher risk for infection among patients with RBD index values < 23 was present among patients who received three doses (RR 2.1 [95%CI 1.3, 3.4]). (An illustrative relative-risk model sketch follows this record.)
ConclusionsAmong patients receiving hemodialysis, those who are unvaccinated, have not received a third mRNA vaccine dose, or lack a robust circulating antibody response are at higher risk for Omicron variant infection. Low circulating antibodies could identify the subgroup needing intensified surveillance, prophylaxis or treatment in this patient population. | epidemiology
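The relative risks above come from a log-binomial model. The sketch below shows a minimal version of such a model on simulated data using statsmodels; the variable names, simulated risks, and covariate set are assumptions made for illustration (the study additionally adjusted for age, sex and prior clinical COVID-19).

```python
# Illustrative only: a log-binomial GLM (Binomial family, log link), whose
# exponentiated coefficients approximate adjusted relative risks.
# All data are simulated placeholders.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 3000
df = pd.DataFrame({
    "unvaccinated": rng.integers(0, 2, n),   # hypothetical exposure indicator
    "prior_covid":  rng.integers(0, 2, n),   # hypothetical prior-infection indicator
})
risk = (0.04 * np.where(df["unvaccinated"] == 1, 2.0, 1.0)
             * np.where(df["prior_covid"] == 1, 0.7, 1.0))
df["infected"] = rng.binomial(1, risk)

# Older statsmodels releases spell the link class `links.log` rather than `links.Log`.
model = smf.glm("infected ~ unvaccinated + prior_covid", data=df,
                family=sm.families.Binomial(link=sm.families.links.Log())).fit()
print(np.exp(model.params))   # intercept -> baseline risk; others -> relative risks
```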
10.1101/2022.03.15.22272362 | Community-level characteristics of COVID-19 vaccine hesitancy in England: A nationwide cross-sectional study | One year after the start of the COVID-19 vaccination programme in England, more than 43 million people older than 12 years had received at least a first dose. Nevertheless, geographical differences persist, and vaccine hesitancy is still a major public health concern; understanding its determinants is crucial to managing the COVID-19 pandemic and preparing for future ones. In this cross-sectional population-based study we used cumulative data on the first dose of vaccine received by 01-01-2022 at Middle Super Output Area level in England. We used Bayesian hierarchical spatial models and investigated if the geographical differences in vaccination uptake can be explained by a range of community-level characteristics covering socio-demographics, political view, COVID-19 health risk awareness and targeting of high risk groups and accessibility. Deprivation is the covariate most strongly associated with vaccine uptake (Odds Ratio 0.55, 95%CI 0.54-0.57; most versus least deprived areas). The most ethnically diverse areas have 38% (95%CI 36-40%) lower odds of vaccine uptake compared with the least diverse. Areas with the highest proportion of population between 12 and 24 years old had lower odds of vaccination (0.87, 95%CI 0.85-0.89). Finally, an increase in vaccine accessibility is associated with higher COVID-19 vaccine uptake (OR 1.07, 95%CI 1.03-1.12). Our results suggest that one year after the start of the vaccination programme, there is still evidence of inequalities in uptake, affecting particularly minorities and marginalised groups. Strategies including prioritising active outreach across communities and removing practical barriers and factors that make vaccines less accessible are needed to level up the differences. | epidemiology
10.1101/2022.03.14.22271392 | Childhood temperamental, emotional, and behavioral predictors of clinical mood and anxiety disorders in adolescence | BackgroundMood and anxiety disorders, often emerging during adolescence, account for a large share of the global burden of disability. Prospectively assessed premorbid early signs and trajectories can provide useful insights for early detection and development of these disorders.
MethodsUsing the health registry linked Norwegian Mother, Father and Child Cohort Study (MoBa) of 110,367 children, we here examine cross-sectional and longitudinal association between temperamental traits, emotional and behavioral problems in childhood (0.5-8 years) and diagnosis of mood or anxiety (emotional) disorders in adolescence (10-18 years). We included birth year and sex, retrieved from the Medical Birth Registry of Norway, as covariates in all analyses.
ResultsLogistic regression analyses showed consistent and increasing associations between childhood negative emotionality, behavioral and emotional problems and adolescent diagnosis of emotional disorders, present from 6 months of age (negative emotionality) and with similar magnitude of association for the associated traits. Latent profile analysis incorporating latent growth models identified five developmental profiles of emotional and behavioral problems. A profile of early increasing behavioral and emotional problems with combined symptoms at 8 years (1.3% of sample) was the profile most strongly associated with emotional disorders in adolescence (OR vs. reference: 5.00, 95% CI: 3.73-6.30).
ConclusionsWe found a consistent and increasing association between negative emotionality, behavioral and emotional problems in early to middle childhood and mood and anxiety disorders in adolescence. A developmental profile coherent with early and increasing disruptive mood dysregulation across childhood was most predictive of adolescent emotional disorders. Our results highlight the importance of early emotional dysregulation and childhood as a formative period in the development of adolescent mood and anxiety disorders, supporting a potential for prevention and early intervention initiatives. | psychiatry and clinical psychology |
10.1101/2022.03.15.22272402 | Network-Based Methods for Psychometric Data of Eating Disorders: A Systematic Review | Network science represents a powerful and increasingly promising method for studying complex real-world problems. In the last decade, it has been applied to psychometric data in the attempt to explain psychopathologies as complex systems of causally interconnected symptoms. With this work, we aimed to review a large sample of network-based studies that exploit psychometric data related to eating disorders (ED) trying to highlight important aspects such as core symptoms, influences of external factors, comorbidities, and changes in network structure and connectivity across both time and subpopulations. A particular focus is here given to the potentialities and limitations of the available methodologies used in the field. At the same time, we also give a review of the statistical software packages currently used to carry out each phase of the network estimation and analysis workflow. Although many theoretical results, especially those concerning the ED core symptoms, have already been confirmed by multiple studies, their supporting function in clinical treatment still needs to be thoroughly assessed. | psychiatry and clinical psychology |
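Much of the literature this review covers estimates symptom networks as regularized partial-correlation (Gaussian graphical) models, typically via EBICglasso in R packages such as qgraph or bootnet. As a rough stand-in for that workflow, the sketch below uses scikit-learn's cross-validated graphical lasso on simulated item scores; the data and item count are invented.

```python
# Illustrative only: estimating a symptom network as a regularized Gaussian
# graphical model. Published ED studies typically use EBICglasso in R (qgraph,
# bootnet); scikit-learn's GraphicalLassoCV is used here as a rough stand-in,
# and the item scores are simulated placeholders.
import numpy as np
from sklearn.covariance import GraphicalLassoCV

rng = np.random.default_rng(1)
n_participants, n_items = 400, 8
X = rng.normal(size=(n_participants, n_items))   # placeholder questionnaire items
X[:, 1] += 0.8 * X[:, 0]                         # induce one strong pairwise edge

model = GraphicalLassoCV().fit(X)
prec = model.precision_
d = np.sqrt(np.diag(prec))
partial_corr = -prec / np.outer(d, d)            # off-diagonal entries = edge weights
np.fill_diagonal(partial_corr, 0.0)
print(np.round(partial_corr, 2))
```

Off-diagonal entries of the rescaled precision matrix correspond to the edge weights one would draw in a symptom network.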
10.1101/2022.03.15.22271536 | High Brain-Derived Neurotrophic Factor (BDNF) and Negative Psychological Flexibility associate with Early Fatigue Syndrome in healthy females | BackgroundRecent studies showed that enhancing psychological flexibility could improve fatigue interference. Brain-Derived Neurotrophic Factor (BDNF), Heart Rate Variability (HRV), and cortisol have been proposed as biomarkers involved in psychological flexibility. Our study aims to explore the association of fatigue with psychological flexibility and related biomarkers.
MethodA cross-sectional study gathered baseline characteristic data from mindful volunteers. Each participant completed self-report questionnaires on fatigue and psychological flexibility. Participants were evaluated for potential biomarkers related to psychological flexibility, including HRV, serum cortisol, and BDNF, within one week of responding to the questionnaires.
ResultsThe 47 healthy females included 22 nurses and 25 occupational therapy students, with a mean age of 29.70±12.55 years. The prevalence of fatigue was 38.30%. Multivariate analysis showed that the independent factors associated with fatigue were negative psychological flexibility (OR 1.31, p=0.03) and high BDNF (OR 1.33, p=0.05).
ConclusionOur study found that psychological flexibility and high BDNF were independent factors associated with fatigue. This result provides insight that interventions that increase psychological flexibility may prevent fatigue symptoms. High BDNF may reflect an adaptive response in fatigued persons and may be a potential biomarker for detecting early fatigue conditions. | psychiatry and clinical psychology
10.1101/2022.03.14.22272395 | Clinical and Demographic Characteristics of Treatment Requiring Retinopathy of Prematurity (ROP) in Big Premature Infants in Turkey - Report No: 1 (BIG-ROP STUDY) | AimTo demonstrate the clinical and demographic features of infants with gestational age (GA) of 32-37 weeks (wk) and birth weight (BW) of >1500 g who developed treatment requiring retinopathy of prematurity (ROP).
MethodsRetrospective, observational, descriptive, multicentre study was conducted by the Turkish Ophthalmological Association ROP commission. Data on the infants with a GA of 32-37 wk and BW >1500 g who developed treatment-requiring ROP were collected from the 33 ROP centres in Turkey. GA, BW, type of hospital, neonatal intensive care units (NICU) level, length of stay in NICU, duration of oxygen therapy, comorbidities, type of ROP and time for treatment-requiring ROP (TR-ROP) development were analysed.
ResultsIn total, 366 infants were included in the study. The mean GA and BW were 33±1 wk and 1896±316 g, respectively. Duration of hospitalization was 3-4 wk in 46.8% of them. The first ROP examination was performed at postnatal 4-5 wk in 80.3% of infants, and was significantly later in lower-level NICUs and non-university clinics. ROP was detected in 90.9% of infants at the first ROP examination, especially in clinics without an ophthalmologist. In 15.3% of the infants, treatment was required in the fourth postnatal week, and the mean time to development of TR-ROP was 6.16±2.04 wk.
ConclusionRoutine ROP screening thresholds need to be expanded in hospitals with suboptimal NICU conditions, considering the development of TR-ROP in more mature and heavier preterm infants, and the first ROP examination should be performed no later than the fourth postnatal week.
What is already known on this topicTreatment-requiring retinopathy of prematurity (TR-ROP) may develop in bigger and more mature infants with a gestational age >32 weeks and birth weight >1500 g, especially in low/middle-income countries where proper neonatal intensive care conditions cannot be provided.
What this study addsThis is the first nationwide study in Turkey analysing the regional differences and the effect of the presence of an ophthalmologist and a neonatologist in the same hospital as the NICU on the development of TR-ROP. This study emphasizes the high rate of ROP at the first examination in these bigger babies and their progression to TR-ROP in a short time. ROP in these infants may be more aggressive, like A-ROP, and may progress rapidly. The results of our study suggest the need for timely (even earlier) screening of bigger infants and a revision to expand the limits of the ROP screening program to bigger infants in at least the underdeveloped parts of Turkey. This may be generalized to other underdeveloped countries.
How this study might affect research, practice or policyScreening criteria for ROP need to be revised to cover bigger infants in Turkey, depending on the NICU conditions of the hospitals. Infants with a gestational age of >32 weeks and a birth weight of >1500 g may need to be screened for ROP earlier than four postnatal weeks. Increasing the number of well-educated neonatologists and ophthalmologists, as well as improving other NICU conditions, will bring neonatal care for ROP to the standards of developed countries. | ophthalmology
10.1101/2022.03.14.22272393 | Estimating the burden of foodborne gastroenteritis due to nontyphoidal Salmonella enterica, Shigella and Vibrio parahaemolyticus in China | To estimate the incidence of foodborne gastroenteritis caused by nontyphoidal Salmonella enterica, Shigella, and Vibrio parahaemolyticus in China, population surveys and sentinel hospital surveillance were implemented in six provinces from July 2010 to July 2011, and a multiplier calculation model for the burden of disease was constructed. The multiplier for salmonellosis and V. parahaemolyticus gastroenteritis was estimated at 4,137 [95% confidence interval (CI) 2,320-5,663], and for shigellosis at 4,356 (95% CI 2,443-5,963). Annual incidence per 100,000 population was estimated as 245 (95% CI 138-336), 67 (95% CI 38-92), and 806 (95% CI 452-1,103) for foodborne salmonellosis, shigellosis, and V. parahaemolyticus gastroenteritis, respectively, indicating that foodborne infection caused by these three pathogens constitutes an important burden to the Chinese healthcare system. Continuous implementation of active surveillance of foodborne diseases, combined with multiplier models to estimate disease burden, makes it possible for us to better understand food safety status in China. | epidemiology |
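The multiplier approach described above reconstructs community incidence from lab-confirmed surveillance counts. A minimal sketch follows, in which only the salmonellosis multiplier comes from the abstract; the catchment population and confirmed-case count are invented and chosen merely so the output lands near the reported 245 per 100,000.

```python
# Illustrative only: the multiplier approach scales lab-confirmed surveillance
# counts up to community incidence; the multiplier reflects under-ascertainment
# (care seeking, specimen submission, laboratory testing, test sensitivity).
def burden_per_100k(confirmed_cases: int, multiplier: float, population: int) -> float:
    """Estimated community incidence per 100,000 population."""
    return confirmed_cases * multiplier / population * 1e5

# Invented example: 6 lab-confirmed Salmonella cases in a 10-million-person
# catchment, scaled by the salmonellosis multiplier of 4,137 from the abstract.
print(round(burden_per_100k(6, 4137, 10_000_000), 1))   # ~248 per 100,000
```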
10.1101/2022.03.16.22272465 | No evidence for environmental transmission risk of SARS-CoV-2 in the UK's largest urban river system: London as a case study | The presence of SARS-CoV-2 in untreated sewage has been confirmed in many countries but its incidence and infection risk in contaminated freshwaters is still poorly understood. The River Thames in the UK receives untreated sewage from 57 Combined Sewer Overflows (CSOs), with many discharging dozens of times per year. We investigated if such discharges provide a pathway for environmental transmission of SARS-CoV-2. Samples of wastewater, surface water, and sediment collected close to six CSOs on the River Thames were assayed over 8 months for SARS-CoV-2 RNA and infectious virus. Bivalves were sampled as sentinel species of viral bioaccumulation. Sediment and water samples from the Danube and Sava rivers in Serbia, where raw sewage is also discharged in high volumes, were assayed as a positive control. We found no evidence of SARS-CoV-2 RNA or infectious virus in UK samples, in contrast to RNA positive water and sediment samples from Serbia. Furthermore, we show that infectious SARS-CoV-2 inoculum is stable in Thames water and sediment for < 3 days, while RNA remained detectable for at least seven days. This indicates that dilution of wastewater likely limits environmental transmission, and that infectivity should be embedded in future risk assessments of pathogen spillover. | epidemiology |
10.1101/2022.03.15.22272438 | COVID-19 Vaccine Hesitancy among Marginalized Populations in the U.S. and Canada: Protocol for a Scoping Review | IntroductionDespite the development of safe and highly efficacious COVID-19 vaccines, extensive barriers to vaccine deployment and uptake threaten the effectiveness of vaccines in controlling the pandemic. Notably, marginalization produces structural and social inequalities that render certain populations disproportionately vulnerable to COVID-19 incidence, morbidity, and mortality, and less likely to be vaccinated. The purpose of this scoping review is to provide a comprehensive overview of definitions/conceptualizations, elements, and determinants of COVID-19 vaccine hesitancy among marginalized populations in the U.S. and Canada.
Materials and MethodsThe proposed scoping review follows the framework outlined by Arksey and O'Malley, and further developed by the Joanna Briggs Institute. It will comply with reporting guidelines from the Preferred Reporting Items for Systematic reviews and Meta-Analyses extension for Scoping Reviews (PRISMA-ScR). The overall research question is: What are the definitions/conceptualizations and factors associated with vaccine hesitancy in the context of COVID-19 vaccines among adults from marginalized populations in the U.S. and Canada? Search strategies will be developed using controlled vocabulary and selected keywords, and customized for relevant databases, in collaboration with a research librarian. The results will be analyzed and synthesized quantitatively (i.e., frequencies) and qualitatively (i.e., thematic analysis) in relation to the research questions, guided by a revised WHO Vaccine Hesitancy Matrix.
DiscussionThis scoping review will contribute to honing and advancing the conceptualization of COVID-19 vaccine hesitancy and broader elements and determinants of underutilization of COVID-19 vaccination among marginalized populations, identify evidence gaps, and support recommendations for research and practice moving forward. | public and global health |
10.1101/2022.03.16.22272461 | Parapneumonic effusions related to Streptococcus pneumoniae: serotype and disease severity trends | IntroductionStreptococcus pneumoniae epidemiology is changing in response to vaccination and some data suggest empyema incidence is increasing. However, differences exist between UK and USA studies. We describe trends in the clinical phenotype of adult pneumococcal pleural infection, including simple parapneumonic effusions (SPE) in the pneumococcal conjugate vaccination (PCV) era.
MethodsRetrospective cohort study of all adults ≥16 years admitted to three large UK hospitals with pneumococcal disease, 2006-2018. Medical records were reviewed for each clinical episode. Serotype data were obtained from the UKHSA national reference laboratory.
Results2477 invasive pneumococcal cases were identified: 459 SPE and 100 pleural infection cases. Incidence increased over time, including non-PCV-serotype disease. PCV7-serotype disease declined following paediatric PCV7 introduction, but the effect of PCV13 was less apparent, as disease caused by the additional six serotypes plateaued, with serotypes 1 and 3 causing such parapneumonic effusions from 2011 onwards.
Mortality was lower in pleural infection patients than SPE patients. Patients with frank pus had lower mortality than those without. Ninety-day mortality could be predicted by an increased baseline RAPID score.
ConclusionPneumococcal infection continues to cause severe disease despite the introduction of PCVs. The predominance of serotypes 1 and 3 in this adult UK cohort is in keeping with previous paediatric and non-UK studies. Rising non-PCV serotype disease and limited impact of PCV13 on cases caused by serotypes 1 and 3 offset the reductions in adult pneumococcal parapneumonic effusion disease burden observed following introduction of the childhood PCV7 programme.
Take home messagePneumococcal parapneumonic effusions show increasing incidence despite PCV introduction, with serotypes 1 and 3 persisting and non-PCV13 disease increasing. Pleural infection showed lower mortality than SPE, and baseline RAPID score predicted 90-day mortality. | respiratory medicine
10.1101/2022.03.15.22272453 | Low Vision Rehabilitation Training and Referral Patterns Among Ophthalmologists | ObjectiveVision impairment represents a growing burden to society and training protocols related to low vision rehabilitation (vision rehab) vary across ophthalmology residency programs. We surveyed practicing ophthalmologists regarding their vision rehab knowledge, confidence levels, and referral thresholds. We categorized subjects and compared response patterns between groups.
DesignProspective observational
Subjects185 practicing ophthalmologists
MethodsWe created an Ophthalmology Low Vision Questionnaire and administered it to all enrolled subjects via email. We categorized subjects based on duration of practice, subspecialty area, and exposure to vision rehab during residency training. We drew conclusions by comparing responses between various subject categories.
Main Outcome MeasuresThe primary outcome measure was comparison of confidence levels and thresholds for vision rehab referral across groups. We used statistical tests to look for associations between practice duration, subspecialty area, vision rehab exposure during residency, and referral patterns.
ResultsOphthalmologists practicing for 6 or less years were more likely to have had a formal vision rehab rotation during their residency training compared to ophthalmologists practicing for 7 or more years (P = 0.03). Ophthalmologists who completed a formal vision rehab rotation during residency reported greater confidence in their ability to appropriately identify patients who could benefit from vision rehab referral (P = 0.04) and referred patients at earlier visual acuity thresholds (P = 0.04). Clinical subspecialty did not have a significant effect on vision rehab referral confidence or thresholds.
ConclusionsVision rehab rotations during residency lead to improved referral confidence and earlier referral for these vital services. Standardization of vision rehab exposure across ophthalmology residency programs can help to improve outcomes for visually impaired patients. | ophthalmology |
10.1101/2022.03.16.22271983 | Minimal SARS-CoV-2 classroom transmission at a large urban university experiencing repeated into campus introduction | SARS-CoV-2, the causative agent of COVID-19, has displayed person-to-person transmission in a variety of indoor situations. This potential for robust transmission has posed significant challenges to day-to-day activities of colleges and universities where indoor learning is a focus. Transmission in the classroom setting has been a particular concern for students, faculty and staff. With the simultaneous implementation of both non-pharmaceutical and pharmaceutical control measures meant to curb the spread of the disease, defining whether in-class instruction without any physical distancing is a risk for driving transmission is important. We examined the evidence for SARS-CoV-2 transmission on a large urban university campus that mandated vaccination and masking but was otherwise fully open without physical distancing during a time of ongoing transmission of SARS-CoV-2 both at the university and in the surrounding counties. Using weekly surveillance testing of all on-campus individuals and rapid contact tracing of individuals testing positive for the virus, we found little evidence of in-class transmission. Of more than 140,000 in-person class events, only nine instances of potential in-class transmission were identified. When each of these events was further interrogated by whole-genome sequencing of all positive cases, significant genetic distance was identified between all potential in-class transmission pairings, providing evidence that all individuals were infected outside of the classroom. These data suggest that under robust transmission abatement strategies, in-class instruction is not an appreciable source of disease transmission. | infectious diseases
10.1101/2022.03.15.22272432 | Comorbidities Diminish the Likelihood of Seropositivity After SARS-CoV-2 Vaccination | BackgroundThe impact of chronic health conditions (CHC) on serostatus post-severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) vaccination is unknown.
MethodsWe assessed serostatus post-SARS-CoV-2 vaccination among fully vaccinated residents of Jefferson County, Kentucky, USA, aged 18 years and older, recruited between April 2021 and August 2021. Serostatus was determined by measuring SARS-CoV-2 Spike protein-specific immunoglobulin (Ig) G (Spike IgG) antibodies via enzyme-linked immunoassay (ELISA) in peripheral blood samples.
ResultsOf the 5,178 fully vaccinated participants, 51 were seronegative and 5,127 were seropositive. Chronic kidney disease (CKD) (OR=13.49; 95% CI: 4.88-37.3; P<0.0001) and autoimmune disease (OR=11.34; 95% CI: 5.21-24.69; P<0.0001) showed the highest associations with negative serostatus in fully vaccinated participants. The absence of any CHC was strongly associated with positive serostatus (OR=0.37; 95% CI: 0.19-0.73; P=0.003). The risk of negative serostatus increased from the presence of two CHCs (OR=2.82; 95% CI: 1.14-7) to three or more CHCs (OR=4.52; 95% CI: 1.68-12.14). Similarly, use of 2 or more CHC-related medications was significantly associated with seronegative status (OR=6.08; 95% CI: 2.01-18.35).
ConclusionsPresence of any CHC, especially CKD or autoimmune disease, increased the likelihood of seronegative status among individuals who were fully vaccinated against SARS-CoV-2. This risk increased with a concurrent increase in the number of comorbidities, especially with multiple medications. Absence of any CHC was protective and increased the likelihood of a positive serological response post-vaccination. These results will help develop appropriate guidelines for booster doses and targeted vaccination programs. | allergy and immunology