id | title | abstract | category |
---|---|---|---|
10.1101/2022.04.13.22273562 | Quantitative longitudinal natural history of eight gangliosidoses - conceptual framework and baseline data of the German 8-in-1 disease registry. A cross-sectional analysis. | Purpose: Gangliosidoses are a group of inherited neurogenetic autosomal recessive lysosomal storage disorders usually presenting with progressive macrocephaly, developmental delay and regression, leading to significant morbidity and premature death. A quantitative definition of the natural history would support and enable clinical development of specific therapies.
Methods: Single-disease registry of eight gangliosidoses (NCT04624789).
Cross-sectional analysis of baseline data in N = 26 patients.
Primary endpoint: disease severity assessed by the 8-in-1 score.
Secondary endpoints: first neurological sign or symptom observed (a) by parents and (b) by physicians, diagnostic delay, as well as phenotypical characterization.
Tertiary endpoints: Neurological outcomes (development, ataxia, dexterity) and disability.
Results: The 8-in-1 score quantitatively captured severity of disease. Parents recognized initial manifestations (startle reactions) earlier than physicians (motor developmental delay and hypotonia). Median diagnostic delay was 3.16 years [IQR 0.69-6.25]. Eight patients presented with late-infantile phenotypes.
Conclusion: Data in this registry raise awareness of these rare and fatal conditions in order to accelerate diagnosis, inform counselling of afflicted families, and define quantitative endpoints for clinical trials; they can also serve as historical controls for future therapeutic studies. The characterization of a late-infantile phenotype is novel. Longitudinal follow-up is planned. | genetic and genomic medicine |
10.1101/2022.04.13.22273691 | The Co-Production of the Roots Framework: A Reflective Framework for Mapping the Implementation Journey of Trauma-Informed Care | Background: The trauma-informed care programme at the Tees, Esk and Wear Valleys NHS Foundation Trust identified a need to evaluate the ongoing service-wide trauma-informed care implementation effort. An absence of staff, service user and system-related outcomes specific to trauma-informed care presented barriers to monitoring the adoption of trauma-informed approaches and progress over time across the Trust. This paper describes the co-production of a new self-assessment tool, Roots, a discussion-based framework that facilitates learning and improvement by reflecting on positive or negative examples of trauma-informed services.
Methods: Using secondary data obtained from an affiliated national trauma summit and instruments found in the literature, domains and items were co-produced with the help of trauma-informed care leads, NHS staff and service users. The research design consisted of community-based co-production methods such as surveys, focus groups, and expert consultations.
Results: Adopting trauma-informed care requires enthusiasm and commitment from all members of the organisation. Services must adapt to meet the dynamic needs of staff and service users to ensure they remain trauma-informed; this must be done as a community.
Conclusions: Following an extensive co-production process, the Roots framework was published open-access and accompanied by a user manual. Roots can provide both qualitative and quantitative insights on trauma-informed care implementation by provoking the sharing of experience across services. | health systems and quality improvement |
10.1101/2022.04.14.22273759 | Understanding community level influences on COVID-19 prevalence in England: New insights from comparison over time and space | The COVID-19 pandemic has impacted communities far and wide and put tremendous pressure on the healthcare systems of countries across the globe. Understanding and monitoring the major influences on COVID-19 prevalence is essential to inform policy making and devise appropriate packages of non-pharmaceutical interventions (NPIs).
This study evaluates community level influences on COVID-19 incidence in England and their variations over time, with a specific focus on understanding the impact of working in so-called high-risk industries such as care homes and warehouses. Analysis at the community level allows accounting for interrelations between socioeconomic and demographic profile, land use, and mobility patterns, including residents' self-selection and spatial sorting (where residents choose their residential locations based on their travel attitudes and preferences or on social structure and inequality); this also helps in understanding the impact of policy interventions on distinct communities and areas, given potential variations in their mobility, vaccination rates, behavioural responses, and health inequalities. Moreover, community level analysis can feed into more detailed epidemiological and individual models by tailoring and directing policy questions for further investigation.
We have assembled a large set of static (socioeconomic and demographic profile and land use characteristics) and dynamic (mobility indicators, COVID-19 cases and COVID-19 vaccination uptake in real time) data for small area statistical geographies (Lower Layer Super Output Areas, LSOA) in England, making the dataset arguably the most comprehensive assembled in the UK for community level analysis of COVID-19 infection. The data are integrated from a wide range of sources including telecommunications companies, test and trace data, the national travel survey, the Census and Mid-Year estimates.
To tackle methodological challenges, specifically accounting for highly interrelated influences, we have combined different statistical and machine learning techniques. We have adopted a two-stage modelling framework: a) Latent Cluster Analysis (LCA) to classify the country into distinct land use and travel patterns, and b) multivariate linear regression to evaluate influences within each distinct travel cluster. We have also segmented our data into different time periods based on changes in policies and the evolution of the pandemic (such as the emergence of a new variant of the virus). By segmenting and comparing influences across space and time, we examine more homogeneous behaviour and a more uniform distribution of infection risks, which in turn increases the potential to make causal inferences and helps in understanding variations across communities and over time. Our findings suggest that there are significant spatial variations in risk influences, with some being more consistent and persistent over time. Specifically, the analysis of industrial sectors shows that communities of workers in care homes and warehouses, and to a lesser extent the textile and ready meal industries, tend to carry a higher risk of infection across all spatial clusters and over the whole period modelled in this study. This demonstrates the key role that workplace risk plays in the risk of COVID-19 outbreaks after accounting for the characteristics of workers' residential areas (including socioeconomic and demographic profile and land use features), vaccination rate, and mobility patterns. | health informatics |
10.1101/2022.04.08.22272501 | EMS responses to hospice patients: a qualitative study | In the palliative care community, multiple emergency department visits and hospitalizations are considered an indicator of poor-quality end-of-life care. There is substantial research on the events surrounding ED visits, and subsequent hospitalizations, by hospice patients. It is important, however, to understand that these hospital arrivals often begin with an ambulance response to the home. Emergency medical services (EMS) responses to hospice patients have been less well studied. We conducted a retrospective study of 170 consecutive electronic patient care reports produced contemporaneously by EMS providers in several US counties during the normal course of their responses to hospice patients. We summarized the descriptive epidemiology of the patients and of the incidents, and we explored the provider narratives for recurring themes. To our knowledge this is the first study to explore the contemporaneously documented on-scene circumstances of pre-hospital care for patients enrolled in a hospice program.
The median patient age was 76. Cancer, chronic lung disease, and heart failure were the most common hospice-qualifying diagnoses. Transportation of the patient from the scene occurred in 111 cases (65.3%). Consistent with previous studies based on interviews of EMS providers, the most common chief complaints included pain, dyspnea, and altered mental status. EMS was frequently called for confirmation of death, a finding not previously reported.
Transitions of care, such as from curative medical care to hospice care, and from hospital to home, were a recurring theme. Gaps in care around these transitions were often evident, with EMS being called upon to fill them. Care transitions around weekends were prominent. Also emerging from the narratives was the concept that hospice enrollment is not a single event but rather a step-wise process; we developed a preliminary coding taxonomy to classify those enrollment stages.
The hospice-EMS interface can be complex. Although all care by EMS providers is with the literal or implied consent of the patient, the arrival of an ambulance to a scene where emotions are running high carries a certain momentum and could lead to a cascade of interventions that, in retrospect and given other options, the patient may not have wanted. A better understanding of the hospice-EMS interface might illuminate changes that could be made to improve it, with a goal of ensuring that hospice patients and their families receive emergency medical services when they need and can benefit from them but receive other, non-EMS, services when those are more suitable to their needs. It might also improve the efficiency of an already overburdened EMS system. | emergency medicine |
10.1101/2022.04.05.22273394 | Ambient Temperature and Dengue Hospitalisation in Brazil over a 10-year period, 2010-2019: a time series analysis | Background: Climate factors are known to influence seasonal patterns of dengue transmission. However, little is known about the effect of extreme heat on the severity of dengue infection, such as hospital admission. We aimed to quantify the effect of ambient temperature on dengue hospitalisation risk in Brazil.
Methods: We retrieved daily dengue hospitalisation counts for each of the 5,570 municipalities across the 27 states of Brazil from 1st of January 2010 to 31st of December 2019, from the Brazilian Public Hospital Admission System ("SIH"). We obtained average daily ambient temperature for each municipality from the ERA5-Land reanalysis product. We combined distributed lag non-linear models with a time-stratified design framework to pool estimates of the dose-response and lag-response structures for the association between dengue hospitalisation relative risk (RR) and temperature. We estimated the overall dengue hospitalisation RR for the whole country as well as for each of the five macro-regions by meta-analysing state-level estimates.
Findings: 579,703 hospital admissions due to dengue occurred over the 10-year period of 2010 to 2019. We observed a positive association between high temperatures and a high risk of hospitalisation for Brazil and each of the five macro-regions. For Brazil, the overall RR for dengue hospitalisation at the 50th percentile of the temperature distribution was 1·40 (95% CI 1·27-1·54) and at the 95th percentile was 1·50 (1·38-1·66), relative to the minimum temperature, which carried the lowest risk. We also found lagged effects of heat on hospitalisation, particularly observed on the same day (lag 0) at both the 50th and 95th percentiles.
Interpretation: High temperatures are associated with an increased risk of hospitalisation due to dengue infection. These findings may guide preparedness and mitigation policies during dengue season outbreaks, particularly regarding health-care demand.
Funding: Conselho Nacional de Pesquisa, Coordenacao Nacional de Aperfeicoamento de Pessoal, Instituto de Salud Carlos III.
Research in context. Evidence before this study: Temperature may aggravate a dengue infection, leading to progression to severe disease requiring hospitalisation. On February 12, 2022, we searched PubMed for the terms "dengue", "severe or hospital" and "temperature or heat". Although there is extensive literature on the association between temperature and the number of dengue cases, we did not find any study quantifying the association between ambient temperature and hospitalisation due to dengue. Quantitative evidence on this association, and on how extreme high temperatures augment the risk of hospitalisation due to dengue, is needed to guide mitigation and preparedness plans.
Added value of this study: Our findings show that heat can aggravate an ongoing Dengue virus infection, leading to hospitalisation. Average temperatures above 23.96 °C, the 50th percentile of the temperature distribution over Brazil, represent a risk of over 1.3x relative to the minimum hospitalisation temperature. An average temperature of 28.68 °C, the 95th percentile of the temperature distribution, represents a hospitalisation risk of over 1.4x relative to the minimum hospitalisation temperature. The South region, which has a lower incidence of dengue cases and a colder climate compared with other regions, has a temperature-related risk of hospitalisation of greater magnitude than the other regions.
Implications of all the available evidence: Heat exposure plays an important role in public health policy for mitigating the burden of dengue virus on the population. Hospitalisation is costly to any health system and can have a greater impact on fragile or unprepared health systems, as are frequently found in countries affected by dengue. Owing to its hyperendemicity and seasonality, dengue can be predicted over the mid to long term, allowing better-designed early-warning systems and mitigation measures. | epidemiology |
10.1101/2022.04.12.22273277 | Estimation of the morbidity and mortality of congenital Chagas disease: a systematic review and meta-analysis | Chagas disease is caused by the parasite Trypanosoma cruzi, which can be transmitted from mother to baby during pregnancy. There is no consensus on the proportion of infected infants that become symptomatic for congenital Chagas disease (cCD). The objective of this systematic review is to determine the burden of cCD.
Articles from journal inception to 2020 reporting morbidity and mortality associated with cCD were retrieved from academic search databases. Observational studies, randomized-control trials, and studies of babies diagnosed with cCD were included. Studies were excluded if they were case reports or series, lacked original data, were case-control studies without cCD incidence estimates, and/or did not report the number of participants. Two reviewers screened articles for inclusion. To determine the pooled proportion of symptomatic infants with cCD, the frequency of individual symptoms, and case fatality, random effects meta-analysis was performed.
We identified 4,531 records and reviewed 4,301, including 47 articles in the narrative summary and analysis. 28.3% (95% confidence interval (CI) 19.0%-38.5%) of cCD infants were symptomatic and 2.2% of infants died (95% CI 1.3%-3.5%). The proportion of infected infants with hepatosplenomegaly was 12.5%, preterm birth 6.0%, low birth weight 5.8%, anemia 4.9%, and jaundice 4.7%. Although most studies did not include a comparison group of non-infected infants, the proportion of infants with cCD symptomatic at birth is comparable to that for congenital toxoplasmosis (10.0%-30.0%) and congenital cytomegalovirus (10.0%-15.0%).
We conclude that the burden of cCD appears significant, but more studies comparing infected mother-infant dyads to non-infected ones are needed to determine the association of this burden with cCD infection.
Author summary: Chagas disease is caused by the parasite Trypanosoma cruzi, which can be passed from mother to infant. It is estimated that one million women of reproductive age are infected with T. cruzi. Prior to our work, the proportion of infants infected with T. cruzi congenitally presenting with clinical symptoms was unknown. After systematically searching for and identifying studies that collected information on infants with congenital Chagas disease, we summarized and analyzed 47 studies. Our pooled analysis of these studies estimated that 28.3% of infants with congenital Chagas disease were symptomatic and 2.2% died. Prior work has shown that transmission of T. cruzi from mother to child occurs in 5% of cases. Other studies have shown that this transmission is preventable through treatment of women prior to conception, and infants can be cured if shown to be infected at birth. Our estimated proportion of 28.3% of infants diagnosed with cCD at birth presenting with clinical symptoms is comparable to the proportions for congenital toxoplasmosis (10.0%-30.0%) and congenital cytomegalovirus (10.0%-15.0%). More studies comparing infected mother-infant dyads to non-infected mother-infant dyads are needed to determine the association of this burden with cCD infection. | epidemiology |
10.1101/2022.04.06.22273471 | Alpha-1 adrenoreceptor antagonism mitigates extracellular mitochondrial DNA accumulation in lung fibrosis models and in patients with IPF | Idiopathic Pulmonary Fibrosis is increasingly associated with adrenergic innervation and endogenous innate immune ligands such as mitochondrial DNA (mtDNA). Interestingly, a connection between these entities has not been explored. Here we report that noradrenaline (NA) derived from the lung's adrenergic nerve supply drives the accumulation of αSMA-expressing fibroblasts via a mechanism involving α1 adrenoreceptors and mtDNA. Using the bleomycin model of lung fibrosis, we compared the effect of lung-specific adrenergic denervation, achieved via inhalational administration of the sympathetic neurotoxin 6-hydroxydopamine, with surgically mediated adrenal ablation, and found that NA derived from local but not adrenal sources drives lung fibrosis. Bleomycin induced the appearance of an αSMA+ fibroblast population co-expressing the adrenoreceptor alpha-1D (ADRA1D). Therapeutic delivery of the α1 adrenoreceptor antagonist terazosin reversed these changes and suppressed the accumulation of extracellular mtDNA. Normal human lung fibroblasts stimulated with TGFβ1 and noradrenaline expressed ADRA1D and developed reduced αSMA expression and extracellular mtDNA concentrations when treated with terazosin. IPF patients prescribed α1 adrenoreceptor antagonists for non-pulmonary indications showed improved survival and reduced concentrations of plasma mtDNA. These findings link nerve-derived NA and α1 adrenoreceptor antagonism with mtDNA accumulation and lung fibrogenesis in mouse models, cultured cells, and humans with IPF. Further study of this neuro-innate connection may yield new avenues for investigation in the clinical and basic science realms. | respiratory medicine |
10.1101/2022.04.12.22273331 | Beyond Common Outcomes: Clients' Perspectives on the Benefits of Prenatal Care Coordination | Prenatal Care Coordination (PNCC) is a fee-for-service Medicaid benefit available in Wisconsin and several other states. It provides home visiting, health education, care coordination and other supportive services to high-risk mothers. However, PNCC is not supported by an evidence-based model, its impact is not systematically evaluated, and the benefit is not currently reaching all eligible mothers. The purpose of this qualitative descriptive study was to describe PNCC clients' perspectives on the experience of receiving the PNCC benefit and the value and impact of PNCC in the context of their own lives. We interviewed recent clients of PNCC programs in two PNCC sites that varied by racial/ethnic community makeup, rural/urban geography, and health department size. PNCC clients identified two major benefits of PNCC: 1) social and emotional support from the PNCC nurse; and 2) assistance with obtaining or getting connected to other resources they needed. These two program benefits were especially meaningful to PNCC clients in the context of difficult life events and circumstances. Findings from this study highlight the impact of PNCC services on social and emotional health through trusting and supportive nurse-client relationships. Our findings also suggest that a longer program period and the development of standards to assess program effectiveness would improve PNCC client outcomes and reduce disparities in maternal health. | nursing |
10.1101/2022.04.12.22273506 | The Influence of Pregnancy on PrEP uptake and Adherence Amongst HIV-Negative High-Risk Young Women in Kampala, Uganda: A Qualitative Assessment | Background: Pregnant young women who engage in high-risk sexual activity are at elevated biological and social risk for HIV acquisition. PrEP serves as an effective means of HIV prevention, including during pregnancy. This study aimed to explore attitudes, experiences and challenges with PrEP to understand what motivates or limits PrEP uptake and adherence during pregnancy among this population of young women.
Methods: Semi-structured interviews were conducted with 23 participants, recruited from the Prevention on PrEP (POPPi) study in the Good Health for Women Project clinic in Kampala, Uganda. POPPi's inclusion criteria comprised HIV-uninfected women, aged 15-24, who engaged in high-risk sexual activity. Interviews focused on experience with PrEP and pregnancy. Data were analyzed utilizing a framework analysis approach.
Findings: Key themes comprised participants' barriers to and facilitators of PrEP uptake and adherence. Reasons for PrEP initiation included a desire for autonomy and agency, mistrust of partners, and social support. Participants expressed challenges with initiating or sustaining their use of PrEP, including PrEP access and perceived or felt stigma. During pregnancy, participants' primary motivators for altering PrEP use were either an understanding of PrEP safety for their baby or changes in perceptions of their HIV risk. Many of these factors were similar across participants who had experience with pregnancy and those who did not.
Interpretation: This study highlights the importance of addressing barriers to and facilitators of PrEP adherence with a multi-level approach, especially during pregnancy, when risk is elevated. Community-oriented education and stigma-reduction activities, alongside access to PrEP, can support adherence. The development of robust guidelines for supporting PrEP adherence during pregnancy among high-risk women, and strategies for their implementation, are of utmost importance for the control of HIV in key populations and the elimination of mother-to-child transmission of HIV. | hiv aids |
10.1101/2022.04.14.22272888 | Anti-inflammatory therapy with nebulised dornase alfa in patients with severe COVID-19 pneumonia | Background: SARS-CoV-2 infection causes severe, life-threatening pneumonia. Hyper-inflammation, coagulopathy and lymphopenia are associated with pathology and poor outcomes in these patients. Cell-free (cf) DNA is prominent in COVID-19 patients, amplifies inflammation and promotes coagulopathy and immune dysfunction. We hypothesized that cf-DNA clearance by nebulised dornase alfa may reduce inflammation and improve disease outcomes. Here, we evaluated the efficacy of nebulised dornase alfa in patients hospitalised with severe COVID-19 pneumonia.
Methods: In this randomised controlled single-centre phase 2 proof-of-concept trial, we recruited adult patients admitted to hospital who exhibited stable oxygen saturation (≥94%) on supplementary oxygen and a C-reactive protein (CRP) level ≥30 mg/L post dexamethasone treatment. Participants were randomised at a 3:1 ratio to receive twice-daily nebulised dornase alfa in addition to best available care (BAC) or BAC alone for seven days or until hospital discharge. Historical controls (HC) were included at a 2:1 ratio of controls to treated individuals as the primary endpoint comparators. The primary outcome was a reduction in systemic inflammation measured by blood CRP levels over 7 days post-randomisation, or to discharge if sooner. Secondary and exploratory outcomes included time to discharge, time on oxygen, D-dimer levels, lymphocyte counts and levels of circulating cf-DNA.
Results: We screened 75 patients and enrolled 39 participants: 30 in the dornase alfa arm and 9 in the BAC group. We also matched the recruited patients in the treated group (N=30) to historical controls in the BAC group (N=60). For the primary outcome, 30 patients in the dornase alfa group were compared to 69 patients in the BAC group. Dornase alfa treatment reduced CRP by 33% compared to the BAC group at 7 days (P=0.01). The dornase alfa group least-squares mean CRP was 23.23 mg/L (95% CI 17.71 to 30.46) and the BAC group 34.82 mg/L (95% CI 28.55 to 42.47). A significant difference was also observed when only randomised participants were compared. Furthermore, compared to the BAC group, the chance of live discharge was increased by 63% in the dornase alfa group (HR 1.63, 95% CI 1.01 to 2.61, P=0.03), lymphocyte counts were improved (least-squares mean: 1.08 vs 0.87, P=0.02) and markers of coagulopathy such as D-dimer were diminished (least-squares mean: 570.78 vs 1656.96 ng/mL, P=0.004). Moreover, the dornase alfa group exhibited lower circulating cf-DNA levels that correlated with CRP changes over the course of treatment. No differences were recorded in the rates and length of stay in the ICU or in the time on oxygen between the groups. Dornase alfa was well tolerated, with no serious adverse events reported.
Conclusions: In this proof-of-concept study in patients with severe COVID-19 pneumonia, treatment with nebulised dornase alfa resulted in a significant reduction in inflammation, markers of immune pathology and time to discharge. The effectiveness of dornase alfa in patients with acute respiratory infection and inflammation should be investigated further in larger trials. | infectious diseases |
10.1101/2022.04.14.22272888 | Anti-inflammatory therapy with nebulised dornase alfa in patients with severe COVID-19 pneumonia. A Randomised Clinical Trial. | BackgroundSARS-CoV2 infection causes severe, life-threatening pneumonia. Hyper-inflammation, coagulopathy and lymphopenia are associated with pathology and poor outcomes in these patients. Cell-free (cf) DNA is prominent in COVID-19 patients, amplifies inflammation and promotes coagulopathy and immune dysfunction. We hypothesized that cf-DNA clearance by nebulised dornase alfa may reduce inflammation and improve disease outcomes. Here, we evaluated the efficacy of nebulized dornase alfa in patients hospitalised with severe COVID-19 pneumonia.
MethodsIn this randomised controlled single-centre phase 2 proof-of-concept trial, we recruited adult patients admitted to hospital that exhibited stable oxygen saturation ([≥]94%) on supplementary oxygen and a C-reactive protein (CRP) level [≥]30mg/L post dexamethasone treatment. Participants were randomized at a 3:1 ratio to receive twice-daily nebulised dornase alfa in addition to best available care (BAC) or BAC alone for seven days or until hospital discharge. A 2:1 ratio of historical controls to treated individuals (HC, 2:1) were included as the primary endpoint comparators. The primary outcome was a reduction in systemic inflammation measured by blood CRP levels over 7 days post-randomisation, or to discharge if sooner. Secondary and exploratory outcomes included time to discharge, time on oxygen, D-dimer levels, lymphocyte counts and levels of circulating cf-DNA.
ResultsWe screened 75 patients and enrolled 39 participants, of whom 30 were in the dornase alfa arm and 9 in the BAC group. We also matched the recruited patients in the treated group (N=30) to historical controls in the BAC group (N=60). For the primary outcome, 30 patients in the dornase alfa arm were compared to 69 patients in the BAC group. Dornase alfa treatment reduced CRP by 33% compared to the BAC group at 7 days (P=0.01). The dornase alfa group least squares mean CRP was 23.23 mg/L (95% CI 17.71 to 30.46) and the BAC group 34.82 mg/L (95% CI 28.55 to 42.47). A significant difference was also observed when only randomised participants were compared. Furthermore, compared to the BAC group, the chance of live discharge was increased by 63% in the dornase alfa group (HR 1.63, 95% CI 1.01 to 2.61, P=0.03), lymphocyte counts were improved (least-squares mean: 1.08 vs 0.87, P=0.02) and markers of coagulopathy such as D-dimer were diminished (least-squares mean: 570.78 vs 1656.96g/mL, P=0.004). Moreover, the dornase alfa group exhibited lower circulating cf-DNA levels that correlated with CRP changes over the course of treatment. No differences were recorded in the rates and length of stay in the ICU or the time on oxygen between the groups. Dornase alfa was well-tolerated with no serious adverse events reported.
ConclusionsIn this proof-of-concept study in patients with severe COVID-19 pneumonia, treatment with nebulised dornase alfa resulted in a significant reduction in inflammation, markers of immune pathology and time to discharge. The effectiveness of dornase alfa in patients with acute respiratory infection and inflammation should be investigated further in larger trials. | infectious diseases |
10.1101/2022.04.13.22273851 | A Sepsis Treatment Algorithm to Improve Early Antibiotic De-escalation While Maintaining Adequacy of Coverage (Early-IDEAS): A Prospective Observational Study | BackgroundEmpiric antibiotic treatment selection should provide adequate coverage for potential pathogens while minimizing unnecessary broad-spectrum antibiotic use. We sought to pilot a rule- and model-based early sepsis treatment algorithm (Early-IDEAS) to make optimal individualized antibiotic recommendations.
MethodsThe Early-IDEAS decision support algorithm was derived from previous Gram-negative and Gram-positive prediction rules and models. The Gram-negative prediction consists of multiple parametric regression models which predict the likelihood of susceptibility for each commonly used antibiotic for Gram-negative pathogens, based on epidemiologic predictors and prior culture results and recommends the narrowest spectrum agent that exceeds a predefined threshold of adequate coverage. The Gram-positive rules direct the addition or cessation of vancomycin based on prior culture results. We applied the algorithm to prospectively identified consecutive adults within 24-hours of suspected sepsis. The primary outcome was the proportion of patients for whom the algorithm recommended de-escalation of the primary antibiotic regimen. Secondary outcomes included: (1) the proportion of patients for whom escalation was recommended; (2) the number of recommended de-escalation steps along a pre-specified antibiotic cascade; and (3) the adequacy of therapy in the subset of patients with culture-confirmed infection.
ResultsWe screened 578 patients, of whom 107 eligible patients with sepsis were included. The Early-IDEAS treatment recommendation was informed by Gram-negative models in 76 (71%) of patients, Gram-positive rules in 66 (61.7%), and local guidelines in 27 (25%). Antibiotic de-escalation was recommended by the algorithm in almost half of all patients (n=50, 47%), no treatment change was recommended in 48 patients (45%), and escalation was recommended in 9 patients (8%). Amongst the patients where de-escalation was recommended, the median number of steps down the a priori antibiotic treatment cascade was 2. Among the 17 patients with relevant culture-positive bloodstream infection, the clinician-prescribed regimen provided adequate coverage in 14 (82%) and the algorithm recommendation would have provided adequate coverage in 13 (76%), p=1. Among the 25 patients with positive relevant (non-blood) cultures, the clinician-prescribed regimen provided adequate coverage in 22 (88%) and the algorithm recommendation would have provided adequate coverage in 21 (84%), p=1.
ConclusionsAn individualized decision support algorithm in early sepsis could lead to substantial antibiotic de-escalation without compromising adequate antibiotic coverage. | infectious diseases |
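The coverage rule described in the Early-IDEAS abstract above — recommend the narrowest-spectrum agent whose predicted likelihood of adequate Gram-negative coverage exceeds a predefined threshold — can be sketched as follows. The drug cascade, probabilities, and 0.90 threshold are illustrative assumptions, not the study's actual models:

```python
# Hypothetical sketch of a narrowest-adequate-agent rule; drug names,
# predicted probabilities, and the 0.90 threshold are illustrative only.

ANTIBIOTIC_CASCADE = ["ampicillin", "cefazolin", "ceftriaxone",
                      "piperacillin-tazobactam", "meropenem"]  # narrow -> broad

def recommend(predicted_coverage, threshold=0.90):
    """Return the narrowest agent whose predicted probability of adequate
    coverage meets the threshold; fall back to the broadest agent."""
    for drug in ANTIBIOTIC_CASCADE:
        if predicted_coverage.get(drug, 0.0) >= threshold:
            return drug
    return ANTIBIOTIC_CASCADE[-1]

choice = recommend({"ampicillin": 0.62, "cefazolin": 0.88,
                    "ceftriaxone": 0.93, "piperacillin-tazobactam": 0.96,
                    "meropenem": 0.99})  # -> "ceftriaxone"
```

Walking the cascade from narrow to broad and stopping at the first agent above threshold is what produces de-escalation recommendations when narrow agents are predicted adequate.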
10.1101/2022.04.13.22273835 | Evaluation of machine learning for predicting COVID-19 outcomes from a national electronic medical records database | ObjectiveWhen novel diseases such as COVID-19 emerge, predictors of clinical outcomes might be unknown. Using data from electronic medical records (EMR) allows evaluation of potential predictors without selecting specific features a priori for a model. We evaluated different machine learning models for predicting outcomes among COVID-19 inpatients using raw EMR data.
Materials and MethodsIn Premier Healthcare Data Special Release: COVID-19 Edition (PHD-SR COVID-19, release date March 24, 2021), we included patients admitted with COVID-19 during February 2020 through April 2021 and built time-ordered medical histories. Setting the prediction horizon at 24 hours into the first COVID-19 inpatient visit, we aimed to predict intensive care unit (ICU) admission, hyperinflammatory syndrome (HS), and death. We evaluated the following models: L2-penalized logistic regression, random forest, gradient boosting classifier, deep averaging network, and recurrent neural network with a long short-term memory cell.
ResultsThere were 57,355 COVID-19 patients identified in PHD-SR COVID-19. ICU admission was the easiest outcome to predict (best AUC=79%), and HS was the hardest to predict (best AUC=70%). Models performed similarly within each outcome.
DiscussionAlthough the models learned to attend to meaningful clinical information, they performed similarly, suggesting performance limitations are inherent to the data.
ConclusionPredictive models using raw EMR data are promising because they can use many observations and encompass a large feature space; however, traditional and deep learning models may perform similarly when few features are available at the individual patient level. | infectious diseases |
10.1101/2022.04.12.22273466 | Level and duration of IgG and neutralizing antibodies to SARS-CoV-2 in children with symptomatic or asymptomatic SARS-CoV-2 infection | BackgroundThere are presently conflicting data about level and duration of antibodies to SARS-CoV-2 in children after symptomatic or asymptomatic infection.
MethodsWe enrolled adults and children in a prospective 6-month study in the following categories: 1) symptomatic, SARS-CoV-2 PCR+ (SP+; children, n=8; adults, n=16), 2) symptomatic, PCR- or untested (children, n=27), 3) asymptomatic exposed (children, n=13) and 4) asymptomatic, no known exposure (children, n=19). Neutralizing and IgG antibodies to SARS-CoV-2 antigens and Spike protein variants were measured by multiplex serological assays.
ResultsAll SP+ children developed nAb, whereas 81% of SP+ adults developed nAb. Decline in the presence of nAb over 6 months was not significant in symptomatic children (100% to 87.5%, p=0.32) in contrast to adults (81.3 to 50.0%, p=0.03). Among all children with nAb (n=22), nAb titers and change in titers over 6 months were similar in symptomatic and asymptomatic children. Levels of IgG antibodies in children to the SARS-CoV-2 Spike, RBD-1 and -2, nucleocapsid and N-terminal domain antigens and to Spike protein variants were similar to those in adults. IgG levels to primary antigens decreased over time in both children and adults, but levels to three of six Spike variants decreased only in children.
ConclusionsChildren with asymptomatic or symptomatic SARS-CoV-2 infection develop robust neutralizing antibodies that remain present longer than in adults but wane in titer over time, and broad IgG antibodies that also wane in level over time.
Key PointsChildren have robust neutralizing and IgG antibody responses to SARS-CoV-2 infection after symptomatic or asymptomatic disease that are at least as strong as in adults. Neutralizing antibodies in children last longer than in adults but wane over time. | infectious diseases |
10.1101/2022.04.07.22273168 | Development of highly specific singleplex and multiplex real-time reverse transcription PCR assays for the identification of SARS-CoV-2 Omicron BA.1, BA.2 and Delta variants | The Omicron variant of SARS-CoV-2 (B.1.1.529), first identified during November 2021, is rapidly spreading throughout the world, replacing the previously dominant Delta variant. Omicron has a high number of mutations in the spike gene, some of which are associated with greatly increased transmissibility and immune evasion. The BA.1 sublineage has been most prevalent but there is recent evidence that the BA.2 sublineage is increasing in proportion in many countries. Genome sequencing is the gold standard for Omicron identification but is relatively slow, resource intensive, of limited capacity and often unavailable. We therefore developed a simple, rapid reverse transcription PCR (RT-PCR) method for sensitive and specific detection of the Omicron variant, including both the BA.1 and BA.2 sublineages. The assay targets a total of 5 nucleotide mutations in the receptor binding domain of the spike gene that give rise to 4 amino acid substitutions at G339D, S371L, S373P and S375F. The forward primer was designed as a double-mismatch allele-specific primer (DMAS) with an additional artificial mismatch located four nucleotides from the 3′ end to enhance binding specificity. Assay specificity was confirmed by testing a wide range of previously-sequenced culture-derived viral isolates and clinical samples including the Alpha, Beta and Delta variants and wild type SARS-CoV-2. Respiratory syncytial virus and influenza A were also tested. The assay can be run in singleplex format, or alternatively as a multiplex RT-PCR to enable Omicron and Delta variants to be detected and distinguished within the same reaction by means of probes labelled with different fluorescent dyes. Sublineages BA.1 and BA.2 can be differentiated if required.
The methods presented here can readily be established in any PCR laboratory and should provide valuable support for epidemiologic surveillance of Omicron infections, particularly in those regions that lack extensive sequencing facilities. | infectious diseases |
10.1101/2022.04.07.22273595 | Epidemiology of infections with SARS-CoV-2 Omicron BA.2 variant in Hong Kong, January-March 2022 | Hong Kong reported 12,631 confirmed COVID-19 cases and 213 deaths in the first two years of the pandemic but experienced a major wave predominantly of Omicron BA.2.2 in early 2022 with over 1.1 million reported SARS-CoV-2 infections and more than 7900 deaths. Our data indicated a shorter incubation period, serial interval, and generation time of infections with Omicron than other SARS-CoV-2 variants. Omicron BA.2.2 cases without a complete primary vaccination series appeared to face a similar fatality risk to those infected in earlier waves with the ancestral strain. | epidemiology |
10.1101/2022.04.11.22273693 | Pan-cancer analysis of pre-diagnostic blood metabolite concentrations in the European Prospective Investigation into Cancer and Nutrition | BackgroundEpidemiological studies of associations between metabolites and cancer risk have typically focused on specific cancer types separately. Here, we designed a multivariate pan-cancer analysis to identify metabolites potentially associated with multiple cancer types, while also allowing the investigation of cancer type-specific associations.
MethodsWe analyzed targeted metabolomics data available for 5,828 matched case-control pairs from cancer-specific case-control studies on breast, colorectal, endometrial, gallbladder, kidney, localized and advanced prostate cancer, and hepatocellular carcinoma nested within the European Prospective Investigation into Cancer and Nutrition (EPIC) cohort. From pre-diagnostic blood levels of an initial set of 117 metabolites, 33 cluster representatives of strongly correlated metabolites, and 17 single metabolites were derived by hierarchical clustering. The mutually adjusted associations of the resulting 50 metabolites with cancer risk were examined in penalized conditional logistic regression models adjusted for body mass index, using the data shared lasso penalty.
ResultsOut of the 50 studied metabolites, (i) six were inversely associated with risk of most cancer types: glutamine, butyrylcarnitine, lysophosphatidylcholine a C18:2 and three clusters of phosphatidylcholines (PCs); (ii) three were positively associated with most cancer types: proline, decanoylcarnitine and one cluster of PCs; and (iii) 10 were specifically associated with particular cancer types, including histidine that was inversely associated with colorectal cancer risk, and one cluster of sphingomyelins that was inversely associated with risk of hepatocellular carcinoma and positively with endometrial cancer risk.
ConclusionsThese results could provide novel insights for the identification of pathways for cancer development, in particular those shared across different cancer types. | epidemiology |
10.1101/2022.04.11.22273658 | Unique Metabolic Profiles in South Asian Women Characterizes Gestational Diabetes in Low and High-Risk Women | BackgroundGestational Diabetes Mellitus (GDM) is the most common pregnancy complication to occur worldwide; however, prevalence varies substantially between ethnicities, with South Asians experiencing up to 3-times the risk of the disease compared to white Europeans (WEs). Factors driving this discrepancy in prevalence and the pathogenesis of GDM are unclear, although the metabolome is of great importance due to the metabolic dysregulation that characterises GDM.
ObjectiveTo characterise and distinguish the metabolic profiles of GDM in two distinct ethnic populations at < 28 weeks gestation.
Design146 fasting serum metabolite values, quantified by nuclear magnetic resonance, from 2668 pregnant WE and 2671 pregnant Pakistani (PK) women from the Born in Bradford (BIB) cohort, were analysed using partial least squares discriminatory analyses (PLSDA). The presence of linear relationships between metabolite values and post-oral glucose tolerance test measures of dysglycemia (fasting glucose and 2-hour post glucose) was also examined.
ResultsSeven metabolites were associated with GDM status in both ethnicities, while 6 additional metabolites were associated with GDM only in WE women. Unique metabolic profiles were observed in healthy weight women who later developed GDM, with distinct metabolite patterns identified by ethnicity and BMI status. Of the 146 metabolite values analysed in relation to dysglycemia, quantities of lactate, histidine, apolipoprotein A1, HDL cholesterol, HDL2 cholesterol, and DHA, as well as the diameter of very low density lipoprotein particles (nm), were associated with dysglycemia in WE women; in PK women, albumin alone was associated with dysglycemia.
ConclusionThis is the first study to show that the metabolic risk profile for GDM differs between ethnicities and highlights a need for ethnically appropriate GDM prevention strategies. | epidemiology |
10.1101/2022.04.11.22273726 | Racial differences in white matter hyperintensity burden in aging, MCI, and AD | White matter hyperintensities (WMHs) may be one of the earliest pathological changes in aging and may potentially accelerate cognitive decline. Evidence on whether race influences WMH burden has been conflicting. The goal of this study was to examine if race differences exist in WMH burden and whether these differences are influenced by vascular factors [i.e., diabetes, hypertension, body mass index (BMI)]. Participants from the Alzheimer's Disease Neuroimaging Initiative were included if they had a baseline MRI, diagnosis, and WMH measurements. Ninety-one Black and 1937 White individuals were included. Using bootstrap re-sampling, 91 Whites were randomly sampled and matched to Black participants based on age, sex, education, and diagnosis 1000 times. Linear regression models examined the influence of race on baseline WMHs with and without vascular factors: WMH [~] Race + Age + Sex + Education + BMI + Hypertension + Diabetes and WMH [~] Race + Age + Sex + Education. The 95% confidence limits of the t-statistics distributions for the 1000 samples were examined to determine statistical significance. All vascular risk factors had significantly higher prevalence in Black than White individuals. When not including vascular risk factors, Black individuals had greater WMH volume overall as well as in frontal and parietal regions, compared to White individuals. After controlling for vascular risk factors, no WMH group differences remained significant. These findings suggest that vascular risk factors are a major contributor to racial group differences observed in WMHs. | geriatric medicine |
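The matched-bootstrap scheme in the WMH abstract above (repeatedly drawing equal-sized random subsamples of the larger group and examining the 95% limits of the resulting statistic distribution) can be sketched as follows. The toy data and the use of a simple mean as the statistic are hypothetical stand-ins for the study's matched regressions:

```python
import random

def bootstrap_limits(large_group, n_match, stat_fn, n_iter=1000, seed=0):
    """Draw n_iter random subsamples of size n_match (without replacement)
    and return the 2.5th and 97.5th percentiles of the statistic values."""
    rng = random.Random(seed)
    stats = sorted(stat_fn(rng.sample(large_group, n_match))
                   for _ in range(n_iter))
    return stats[int(0.025 * n_iter)], stats[int(0.975 * n_iter) - 1]

# Toy usage: means of 1000 subsamples of size 91 from 1937 simulated values,
# mirroring the 91-of-1937 matching ratio described above.
values = [v % 7 for v in range(1937)]
low, high = bootstrap_limits(values, 91, lambda xs: sum(xs) / len(xs))
```

A statistic whose 95% bootstrap limits exclude zero (or the null value) across the resampled matches is then treated as significant.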
10.1101/2022.04.11.22273719 | Pre-Pandemic COVID-19 in New York City: A descriptive analysis of COVID-19 illness prior to February 29, 2020 | BackgroundOn January 30, 2020 the COVID-19 pandemic was declared a Public Health Emergency of International Concern (PHEIC) by the World Health Organization. Almost a month later on February 29, 2020, the first case in New York City (NYC) was diagnosed.
MethodsThree hundred sixty persons with COVID-like illness were reported to the NYC Department of Health and Mental Hygiene (DOHMH) before February 29, but 37 of these tested negative and 237 were never tested for SARS-COV-2. Records of 86 persons with confirmed COVID-19 and symptom onset prior to February 29, 2020 were reviewed by four physician-epidemiologists. Case-patients were classified as likely early-onset COVID-19 or as having insufficient evidence to determine onset. Clinical and epidemiological factors collected by DOHMH and supplemented with emergency department records were analyzed.
ResultsThirty-nine likely early-onset COVID-19 cases were identified. The majority had severe disease, with 69% presenting to an ED within 2 weeks of symptom onset. The first likely COVID-19 case on record had symptom onset on January 28, 2020. Only 7 of the 39 cases (18%) had traveled internationally within 14 days of onset (none to China).
ConclusionsSARS-CoV-2 and COVID-19 were present in NYC before being classified as a PHEIC, and eluded surveillance for another month. The delay in recognition limited mitigation efforts, and by the time that city and state-wide mandates were enacted 16 and 22 days later, there was already community transmission.
Key PointsRecords of 86 persons with confirmed COVID-19 and symptom onset prior to February 29, 2020 were reviewed for likelihood of early onset COVID-19. Thirty-nine likely early onset COVID-19 cases were identified, suggesting that early COVID-19 transmission in NYC went undetected. | infectious diseases |
10.1101/2022.04.15.22273711 | Beta infection combined with Pfizer BNT162b2 vaccination leads to broadened neutralizing immunity against Omicron | Omicron (B.1.1.529) shows extensive escape from vaccine immunity, although vaccination reduces severe disease and death1. Boosting with vaccines incorporating variant spike sequences could possibly broaden immunity2. One approach to choose the variant may be to measure immunity elicited by vaccination combined with variant infection. Here we investigated Omicron neutralization in people infected with the Beta (B.1.351) variant and subsequently vaccinated with Pfizer BNT162b2. We observed that Beta infection alone elicited poor Omicron cross-neutralization, similar to what we previously found3 with BNT162b2 vaccination alone or in combination with ancestral or Delta virus infection. In contrast, Beta infection combined with BNT162b2 vaccination elicited neutralization with substantially lower Omicron escape. | infectious diseases |
10.1101/2022.04.06.22273304 | Immediate effect of osteopathic techniques on human resting muscle tone in healthy subjects using myotonometry: A factorial randomized trial | BackgroundMusculoskeletal disorders (MSDs) are highly prevalent, burdensome, and putatively associated with an altered human resting muscle tone (HRMT). Osteopathic manipulative treatment (OMT) is commonly and effectively applied to treat MSDs and reputedly influences the HRMT. Arguably, OMT may modulate alterations in HRMT underlying MSDs. However, there is sparse evidence even for the effect of OMT on HRMT in healthy subjects.
MethodsA 3x3 factorial randomised trial was performed to investigate the immediate-term effect of myofascial release (MRT), muscle energy (MET), and soft tissue techniques (STT) on the HRMT of the corrugator supercilii (CS), superficial masseter (SM), and upper trapezius muscles (UT) in healthy subjects in Hamburg, Germany. Participants were randomised into three groups (1:1:1 allocation ratio) receiving treatment, according to different muscle-technique pairings, over the course of three sessions with one-week washout periods. Primarily, we assessed the effect of osteopathic techniques on muscle tone (F), biomechanical (S, D), and viscoelastic properties (R, C) from baseline to follow-up (main effect) and tested if specific muscle-technique pairs modulate the effect pre- to post-intervention (interaction effect) using the MyotonPRO (at rest). Data were analysed using descriptive (mean, standard deviation, quantiles, and simple effect) and inductive statistics (Bayesian ANOVA).
Results59 healthy participants were randomised into three groups (n=20, n=20, and n=19), and two subjects dropped out from one group (reducing it to n=17). The CS produced frequent measurement errors and was excluded from analysis. The main effect changed significantly for F (-0.163 [0.060]; p=0.008), S (-3.060 [1.563]; p=0.048), R (0.594 [0.141]; p<0.001), and C (0.038 [0.017]; p=0.028) but not for D (0.011 [0.017]; p=0.527). The interaction effect did not change significantly (p>0.05). No adverse events were reported.
ConclusionOMT modified the HRMT in healthy subjects, which may inform future research on MSDs. In detail, MRT, MET, and STT reduced muscle tone (F), decreased biomechanical properties (S but not D), and increased viscoelastic properties (R and C) of the SM and UT (the CS was not measurable) at immediate term. However, the effect on HRMT was not modulated by muscle-technique interaction.
Trial registrationGerman Clinical Trial Register (DRKS00020393). | primary care research |
10.1101/2022.04.11.22273180 | Investigating direct and indirect genetic effects in attention deficit hyperactivity disorder (ADHD) using parent-offspring trios | BackgroundAttention deficit hyperactivity disorder (ADHD) is highly heritable, but little is known about the relative effects of transmitted (i.e. direct) and non-transmitted (i.e. indirect) common variant risks. Using parent-offspring trios, we tested whether polygenic liability for neurodevelopmental and psychiatric disorders and lower cognitive ability is over-transmitted to ADHD probands. We also tested for indirect or genetic nurture effects, by examining whether non-transmitted ADHD polygenic liability is elevated. Finally, we examined whether complete trios are representative.
MethodsPolygenic risk scores (PRS) for ADHD, anxiety, autism, bipolar disorder, depression, obsessive-compulsive disorder (OCD), schizophrenia, Tourettes syndrome, and cognitive ability were calculated in UK controls (N=5,081), UK probands with ADHD (N=857), and, where available, their biological parents (N=328 trios), and also a replication sample of 844 ADHD trios.
ResultsADHD PRS were over-transmitted and cognitive ability and OCD PRS were under-transmitted to probands. These results were independently replicated. Over-transmission of polygenic liability was not observed for other disorders. Non-transmitted alleles were not enriched for ADHD liability compared to controls. Probands from incomplete trios had more hyperactive-impulsive and conduct disorder symptoms, lower IQ, and lower socioeconomic status than probands from complete trios. PRS did not vary by trio status.
ConclusionsThe results support direct transmission of polygenic liability for ADHD and cognitive ability from parents to offspring, but not for other neurodevelopmental/psychiatric disorders. They also suggest that non-transmitted neurodevelopmental/psychiatric parental alleles do not contribute indirectly to ADHD via genetic nurture. Furthermore, ascertainment of complete ADHD trios may be non-random, in terms of demographic and clinical factors. | psychiatry and clinical psychology |
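The over-/under-transmission test in the ADHD trio abstract above rests on comparing each proband's polygenic score with the average of the parents' scores. A minimal sketch of that mid-parent logic, with hypothetical numbers (this illustrates the general idea, not the authors' exact statistic):

```python
# Hypothetical PRS values illustrating the mid-parent transmission logic;
# systematically positive deviations across trios indicate over-transmission
# of polygenic liability from parents to probands.

def transmission_deviation(child_prs, mother_prs, father_prs):
    """Child PRS minus the mid-parent PRS."""
    return child_prs - (mother_prs + father_prs) / 2.0

trios = [(1.2, 0.5, 0.7), (0.9, 0.4, 0.6), (1.1, 0.8, 0.2)]  # (child, mother, father)
deviations = [transmission_deviation(c, m, f) for c, m, f in trios]
mean_deviation = sum(deviations) / len(deviations)  # positive: over-transmission
```

Under random transmission the expected deviation is zero, so a mean deviation significantly above zero (as for ADHD PRS) indicates over-transmission, and one below zero (as for cognitive ability and OCD PRS) indicates under-transmission.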
10.1101/2022.04.11.22273708 | Prioritisation of Informed Health Choices (IHC) Key Concepts to be included in lower-secondary school resources: a consensus study | BackgroundThe Informed Health Choices Key Concepts are principles for thinking critically about healthcare claims and deciding what to do. The Key Concepts provide a framework for designing curricula, learning resources, and evaluation tools.
ObjectivesTo prioritise which of the 49 Key Concepts to include in resources for lower-secondary schools in East Africa.
MethodsTwelve judges used an iterative process to reach a consensus. The judges were curriculum specialists, teachers, and researchers from Kenya, Uganda, and Rwanda. After familiarising themselves with the concepts, they pilot tested draft criteria for selecting and ordering the concepts. After agreeing on the criteria, nine judges independently assessed all 49 concepts and reached an initial consensus. We sought feedback on the draft consensus from teachers and other stakeholders. After considering the feedback, nine judges independently reassessed the prioritised concepts and reached a consensus. The final set of concepts was determined after user-testing prototypes and pilot-testing the resources.
ResultsThe first panel prioritised 29 concepts. Based on feedback from teachers, students, curriculum developers, and other members of the research team, two concepts were dropped. A second panel of nine judges prioritised 17 of the 27 concepts. Due to the Covid-19 pandemic and school closures, we have only been able to develop one set of resources instead of two, as originally planned. Based on feedback on prototypes of lessons and pilot-testing a set of 10 lessons, we determined that it was possible to introduce nine concepts in 10 single-period (40 minute) lessons. We included eight of the 17 prioritised concepts and one additional concept.
ConclusionUsing an iterative process with explicit criteria, we prioritised nine concepts as a starting point for students to learn to think critically about healthcare claims and choices. | public and global health |
10.1101/2022.04.11.22273635 | Epidural stimulation of the cervical spinal cord improves voluntary motor control in post-stroke upper limb paresis | A large proportion of cerebral strokes disrupt descending commands from motor cortical areas to the spinal cord, which can result in permanent motor deficits of the arm and hand1,2. However, below the lesion, the spinal circuits that control movement5 remain intact and could be targeted by neurotechnologies to restore movement6-9. Here we demonstrate that by engaging spinal circuits with targeted electrical stimulation we immediately improved voluntary motor control in two participants with chronic post-stroke hemiparesis. We implanted a pair of 8-contact percutaneous epidural leads on the lateral aspect of the cervical spinal cord to selectively target the dorsal roots that provide excitatory inputs to motoneurons controlling the arm and hand10,11. With this strategy, we obtained independent activation of shoulder, elbow and hand muscles. Continuous stimulation through selected contacts at specific frequencies enabled participants to perform movements that they had been unable to perform for many years. Overall, stimulation improved strength, kinematics, and functional performance. Unexpectedly, both participants retained some of these improvements even without stimulation, suggesting that spinal cord stimulation could be a restorative as well as an assistive approach for upper limb recovery after stroke. | rehabilitation medicine and physical therapy |
10.1101/2022.04.11.22273710 | The "Trauma Pitch": How stigma emerges for Iraq and Afghanistan veterans seeking disability compensation | Posttraumatic Stress Disorder continues to be a highly stigmatized disease for the veteran population, and stigma continues to be identified as the main deterrent in treatment seeking. Little attention has been paid to how the process of obtaining service-connected disability status can amplify veterans' perceptions of being stigmatized. The following ethnographic study identified how combat veterans experienced stigma in processing through Veterans Affairs care and the effects of linking a Posttraumatic Stress Disorder diagnosis with disability compensation to perceived stigmas. Stigma was identified in two inter-related areas: 1) the structural level in the Veterans Affairs disability claims process and 2) the individual level in interactions with Veterans Affairs service providers. Results based on veterans' narratives suggest that the disability claims process, requiring multiple repetitions of personal trauma, coupled with perceptions of institutional stigmas of malingering, created bureaugenic effects: a worsening of symptoms caused by bureaucratic protocols intended to help veterans. This process influenced first-time treatment users of the Veterans Affairs by deterring treatment-seeking behavior but was not found to affect veterans who had already initiated treatment. Despite the experience of stigma and commodification of their suffering through disability and diagnostic screening, veterans still sought disability compensation. Veterans viewed this compensation as acknowledgment of their loss and validation of their sacrifice. | medical ethics |
10.1101/2022.04.12.22273791 | Significance and clinical suggestions for the Somatosensory Evoked Potentials increased in amplitude revealed by a large sample of neurological patients | ObjectivesTo investigate the relationship between N20-P25 peak-to-peak amplitude (N20p-P25p) of Somatosensory Evoked Potentials (SEPs) and the occurrence of abnormalities of the peripheral and/or central sensory pathways and of myoclonus/epilepsy, in 308 patients with increased SEPs amplitude from upper limbs stimulation
MethodsWe compared cortical response (N20p-P25p) in different groups of patients identified by demographic, clinical and neurophysiological factors and performed a cluster analysis for classifying the natural occurrence of subgroups of patients.
ResultsNo significant differences of N20p-P25p were found among different age-dependent groups, and in patients with or without PNS/CNS abnormalities of sensory pathways, while myoclonic/epileptic patients showed higher N20p-P25p than other groups. Cluster analysis identified four clusters including patients with myoclonus/epilepsy, patients with central sensory abnormalities, patients with peripheral sensory abnormalities, patients with neither myoclonus nor sensory abnormalities.
ConclusionsIncreased N20p-P25p correlated with different pathophysiological conditions: strong cortical hyperexcitability in patients with cortical myoclonus and/or epilepsy and enlarged N20p-P25p, while a milder increase of N20p-P25p could be underpinned by plastic cortical changes following abnormalities of sensory pathways, or by degenerative processes involving the cortex. SEPs increased in amplitude cannot be considered a specific correlate of myoclonus/epilepsy; rather, in several neurological disorders they may represent a sign of adaptive, plastic and/or degenerative cortical changes. | neurology |
10.1101/2022.04.10.22273671 | Elevated plasma phosphorylated tau in amyotrophic lateral sclerosis relates to lower motor neuron dysfunction | ObjectivePlasma phosphorylated tau (p-tau181) is reliably elevated in Alzheimer's disease (AD), but less explored is its specificity relative to other neurodegenerative conditions. Here we find novel evidence that plasma p-tau181 is elevated in amyotrophic lateral sclerosis (ALS), a neurodegenerative condition typically lacking tau pathology. We performed a detailed clinical evaluation to unravel the potential source of this unexpected observation.
MethodsPatients were clinically or pathologically diagnosed with ALS (n=130) or AD (n=82), or were healthy non-impaired controls (n=33). Receiver operating characteristic (ROC) curves were analyzed and area under the curve (AUC) was used to discriminate AD from ALS. Within ALS, Mann-Whitney-Wilcoxon tests compared analytes by presence/absence of upper motor neuron (UMN) and lower motor neuron (LMN) signs. Spearman correlations tested associations between plasma p-tau181 and postmortem neuron loss.
ResultsA Wilcoxon test showed plasma p-tau181 was higher in ALS than in controls (W=3297, p=0.0000020), and ROC analyses showed plasma p-tau181 poorly discriminated AD and ALS (AUC=0.60). In ALS, elevated plasma p-tau181 was associated with LMN signs in cervical (W=827, p=0.0072), thoracic (W=469, p=0.00025), and lumbosacral regions (W=851, p=0.0000029). In support of the LMN findings, plasma p-tau181 was associated with neuron loss in the spinal cord (rho=0.46, p=0.017), but not in the motor cortex (p=0.41). CSF p-tau181 and plasma neurofilament light chain (NfL) were included as reference analytes and demonstrated the specificity of these findings.
InterpretationWe found strong evidence that plasma p-tau181 is elevated in ALS and may be a novel marker specific to LMN dysfunction. | neurology |
10.1101/2022.04.07.22273430 | Characteristics of Patients with Hematologic Malignancies Without Seroconversion Post-COVID19 Third Vaccine Dosing | Patients with hematologic malignancies (HM) are at greater risk of severe morbidity and mortality caused by COVID-19 and show a lower response to a two-dose COVID-19 mRNA vaccine series. The primary objective of this retrospective cohort study is to explore the characteristics of the subset of patients with HM who had little to no change in SARS-CoV-2 spike antibody titer levels after a 3rd vaccine dose (3V) (-/-). As a secondary objective, we seek to compare the cohorts of patients who did and did not seroconvert post-3V to get a better understanding of the demographics and potential drivers of serostatus. A total of 625 patients with HM had two titer results at least 21 days apart pre- and post- the 3V dose. Among the participants who were seronegative prior to 3V (268), 149 (55.6%) seroconverted after the 3V dose and 119 (44.4%) did not. HM diagnosis was significantly associated with seroconversion status (P = .0003), with non-Hodgkin lymphoma patients having 6 times the odds of not seroconverting compared to multiple myeloma patients (P = .0010). Among the cohort of patients who remained seronegative post-3V, 107 (90.0%) patients showed no reaction to 3V as indicated by pre- and post-3V index values. This study focuses on an important subset of patients with HM who are not seroconverting after the COVID-19 mRNA 3V, providing much needed data for clinicians to target and counsel this subset of patients. | oncology
10.1101/2022.04.11.22273720 | The contribution of genetic risk to the comorbidity of depression and anxiety: a multi-site electronic health records study | ImportanceDepression and anxiety are common and highly comorbid, posing a clinical and public health concern because such comorbidity is associated with poorer outcomes.
ObjectiveTo evaluate the association of genetic risk scores with diagnoses of depression and anxiety, either in isolation or comorbid with each other.
DesignInternational Classification of Diseases (ICD) ninth and tenth edition codes were extracted from longitudinal electronic health records (EHR) from four EHR-linked biobanks with genetic data available. Data analysis was performed between February 2021 and October 2021.
SettingEHR-linked biorepository study.
ParticipantsAcross the four biobanks, 140947 patients (80601 female [57.2%]), including 109592 of European ancestry [77.8%], 22321 of African ancestry [15.8%], and 9034 Hispanic [6.4%], were included in the study.
Main outcomes and measuresPolygenic risk scores (PRS) for depression and anxiety were computed for all participants. They were assessed for diagnosis of depression and anxiety using ICD9/10 codes. The primary outcome was a four-level depression/anxiety diagnosis group variable: neither, depression-only, anxiety-only, and comorbid. Estimated effect measures include odds ratios and the proportion of variance on the liability scale explained by the PRS.
Results95992 patients had neither diagnosis (68.1%), 14918 had depression-only (10.6%), 12682 anxiety-only (9.0%), and 17355 comorbid diagnoses (12.3%). PRS for depression and anxiety predicted their respective diagnoses within each biobank and each ancestry, with the exception of the anxiety-PRS not predicting anxiety in any ancestral group from one biobank. In the meta-analysis across participants of European ancestries, both PRSs for depression and anxiety were higher in each diagnosis group compared to controls. Notably, the depression-PRS (OR=1.20 per SD increase in PRS; 95% CI=1.18-1.23) and anxiety-PRS (OR=1.07; 95% CI=1.05-1.09) had the largest effect sizes for the comorbid group when compared to controls. The confidence intervals for the depression-PRS effect did not overlap across groups, demonstrating a gradient of genetic risk across the four diagnosis groups.
Conclusions and RelevanceThe genetic risks of depression and anxiety make distinct contributions to the risk of comorbid depression and anxiety, supporting the hypothesis that the correlated disorders represent distinct nosological entities.
Key PointsQuestionIs the genetic risk of depression and anxiety associated with comorbidity of depression and anxiety?
FindingsUsing electronic health records from four academic medical centers, this study found that genetic risk of depression and anxiety are jointly associated with clinical depression and anxiety diagnoses with better prediction occurring for a diagnosis of depression.
MeaningThe genetic risks of depression and anxiety make distinct contributions to comorbid depression and anxiety, which supports the hypothesis that the correlated disorders represent distinct nosological entities. | genetic and genomic medicine
10.1101/2022.04.08.22273231 | Altered gene expression profiles impair the nervous system development in individuals with 15q13.3 microdeletion | BackgroundThe 15q13.3 microdeletion has pleiotropic effects ranging from apparently healthy to severely affected individuals. The underlying basis of the variable phenotype remains elusive.
MethodsWe analyzed gene expression using blood from 3 individuals with the 15q13.3 microdeletion and brain cortex tissue from 10 Df[h15q13]/+ mice. We assessed differentially expressed genes (DEGs), protein-protein interaction (PPI) functional modules, and gene expression in brain developmental stages.
ResultsThe deleted genes' haploinsufficiency was not transcriptionally compensated, suggesting that a dosage effect may contribute to the pathomechanism. DEGs shared between the tested individuals and the corresponding mouse model show a significant overlap, including genes involved in monogenic neurodevelopmental disorders. Yet, network-wide dysregulatory effects suggest the phenotype is not caused by a single critical gene. A significant proportion of blood DEGs, silenced in the adult brain, have maximum expression during prenatal brain development. Based on DEGs and their PPI partners, we identified altered functional modules related to developmental processes, including nervous system development.
ConclusionsWe show that the 15q13.3 microdeletion has a ubiquitous impact on the transcriptome pattern, especially dysregulation of genes involved in brain development. The high phenotypic variability seen in 15q13.3 microdeletion could stem from an increased vulnerability during brain development, instead of a specific pathomechanism. | genetic and genomic medicine |
10.1101/2022.04.07.22273552 | Allelic expression imbalance in articular cartilage and subchondral bone refined genome-wide association signals in osteoarthritis. | Osteoarthritis (OA) is an age-related joint disease with a strong and complex genetic component. Genome-wide association studies (GWAS) discovered a large number of genomic regions associated with OA. Nevertheless, linking associated genetic variants to the expression of OA risk genes in relevant tissues remains a challenge. Here, we show an unbiased approach to identify transcript single nucleotide polymorphisms (SNPs) of OA risk loci by allelic expression imbalance (AEI). We used RNA sequencing of articular cartilage (N=65) and subchondral bone (N=24) from OA patients. AEI was determined for all genes present in the 100 regions reported by the GWAS catalog. The count fraction of the alternative allele (φ) was calculated for each heterozygous individual with the risk SNP or with a SNP in linkage disequilibrium (LD) with it. Furthermore, a meta-analysis was performed to generate a meta-φ (null hypothesis: median φ=0.49) and P-value for each SNP. We identified 30 transcript SNPs subject to AEI (28 in cartilage and 2 in subchondral bone). Notably, 10 transcript SNPs were located in genes not previously reported in the GWAS catalog, including two long intergenic non-coding RNAs (lincRNAs), MALAT1 (meta-φ=0.54, FDR=1.7x10-4) and ILF3-DT (meta-φ=0.6, FDR=1.75x10-5). Moreover, 14 drugs interact with 7 genes displaying AEI, of which 7 drugs have already been approved. By prioritizing proxy transcript SNPs that mark AEI in cartilage and/or subchondral bone at loci harboring GWAS signals, we present an unbiased approach to identify the most likely functional OA risk SNP and gene. We identified 10 new potential OA risk genes ready for further translation towards underlying biological mechanisms. | genetic and genomic medicine
10.1101/2022.04.14.22273883 | Art therapy as treatment modality for persons living with dementia- a protocol for a scoping review of systematic reviews | ObjectiveTo establish a robust understanding of the state of the evidence on the effectiveness and/or efficacy of art therapy (AT) as a non-pharmacological treatment (NPT) modality for persons living with dementia (PLWD).
BackgroundOver the past decade, AT has received increased attention from health care professionals and researchers as having a potential role to play within treatment plans for PLWD.(1-4)
Inclusion criteriaThis scoping review will include systematic reviews from health-related disciplines conducted within the last 20 years that report the effectiveness and/or efficacy of AT as an NPT modality for PLWD and Mild Cognitive Impairment (MCI). Study outcomes must include cognition, quality of life, emotional and psychological well-being, and/or neuropsychiatric symptoms (NPS).
MethodsA scoping review of systematic reviews was selected to outline different types of evidence and to identify gaps in the literature. The proposed review will be guided by the methodological framework proposed by the Joanna Briggs Institute. We will therefore specify the research question, identify relevant studies, select eligible studies, extract, collate, and summarize our results. We will not conduct a quality appraisal of the included studies as this review aims to explore the general scope of research conducted that assesses AT effectiveness and efficacy in PLWD. | geriatric medicine |
10.1101/2022.04.12.22273813 | Using Facebook groups to support families: midwives' perceptions and experiences of professional social media use | Seeking support from Facebook groups during pregnancy is now widespread and social media has been widely used by maternity services during the COVID-19 pandemic. Despite this, little is currently known about midwives' attitudes towards, and experiences of, social media in practice. Research is needed to understand barriers and solutions to meeting mothers' expectations of online support and to improve services.
This study explored midwife involvement in Facebook groups, exploring experiences and perceptions of its use to communicate with and support mothers. 719 midwives and student midwives completed an online survey during August-September 2020, and their numerical and free-text responses were analysed descriptively.
Few participants were involved in providing Facebook support, and most of these were unpaid. There was a consensus on a range of benefits for mothers, but widespread concern that engaging with mothers online was a personal and professional risk, underpinned by a lack of support. Experience of being involved in midwife moderation increased belief in its benefits and reduced fear of engaging online, despite a lack of remuneration and resources. Midwives and students felt they were discouraged from offering Facebook support and sought further training, guidance and support.
Although limited, experiences of providing Facebook group support are positive. Perceptions of risk and a lack of support are significant barriers to midwives' involvement in using Facebook groups to support mothers. Midwives seek support and training to safely and effectively engage with mothers using Facebook.
Engaging with mothers via social media is embedded in national policy and digital strategy, and progress is needed to fulfil these goals, to improve services and meet mothers' expectations. Midwives' experiences suggest extending opportunities to provide Facebook support would benefit midwives, services and families. Consultation to revise local policy to support midwives and students in line with strategic goals is recommended.
Author SummarySocial media use continues to grow and we know that use among pregnant and new parents for peer support and accessing information is widespread. Previous research suggests however that they can find it difficult to know which information to trust, and would like to engage with their midwives online. However, little was known about how many midwives are supporting families via social media, or what their experiences of this are. Nor did we know what the perceptions of developing this service are amongst the workforce. Here, we used an online survey to explore these attitudes and perceptions. We found that although few midwives are engaging with families on social media, those that do have positive experiences. Many fear that a lack of support and guidance presents risks to themselves and to families, but recognised the potential benefits to developing the service. We also found that midwives would like to receive more training to deliver services online and engage safely on social media. Our study provides new insights that can be used to improve support for midwives and to realise the potential of social media in midwifery care. | health informatics |
10.1101/2022.04.11.22273553 | Domain expert evaluation of advanced visual computing solutions for the planning of left atrial appendage occluder interventions | Advanced visual computing solutions and 3D printing are starting to move from the engineering and development stage to being integrated into clinical pipelines for training, planning and guidance of complex interventions. Commonly, clinicians make decisions based on the exploration of patient-specific medical images on 2D flat monitors using specialised software with standard multi-planar reconstruction (MPR) visualisation. The new generation of visual computing technologies, such as 3D imaging, 3D printing, 3D advanced rendering, Virtual Reality and in-silico simulations from Virtual Physiological Human models, provide complementary ways to better understand the structure and function of the organs under study and to improve and personalise clinical decisions. Cardiology is a medical field where new visual computing solutions are already having an impact on decisions such as the selection of the optimal therapy for a given patient. A good example is the role of emerging visualisation technologies in choosing the most appropriate settings of a left atrial appendage occluder (LAAO) device that needs to be implanted in some patients with atrial fibrillation having contraindications to drug therapies. Clinicians need to select the type and size of the LAAO device to implant, as well as the deployment location. Usually, interventional cardiologists make these decisions after the analysis of patient-specific medical images on 2D flat monitors with MPR visualisation before and during the procedure, and obtain manual measurements characterising the patient's cardiac anatomy to avoid adverse events after implantation.
In this paper we evaluate several advanced visual computing solutions such as web-based 3D imaging visualisation (VIDAA platform), Virtual Reality (VRIDAA platform), computational fluid simulations and 3D printing for the planning of LAAO device implantations. Six physicians, including three interventional and three imaging cardiologists with different levels of experience in LAAO, tested the different technologies on preoperative data from 5 patients to identify the usability, friendliness, limitations and requirements for clinical translation of each technology through a qualitative questionnaire. The obtained results demonstrate the potential impact of advanced visual computing solutions to improve the planning of LAAO interventions, but also the need to unify them so that they can be used in a clinical environment. | health systems and quality improvement
10.1101/2022.04.11.22273102 | Patient and Provider Perceptions of a Community-Based Accompaniment Intervention for Adolescents Transitioning to Adult HIV Care in Urban Peru: A Qualitative Analysis | IntroductionAdolescents living with HIV (ALWH) experience higher mortality rates compared to other age groups, exacerbated by suboptimal transition from pediatric to adult HIV care in which decreased adherence to antiretroviral treatment (ART) and unsuppressed viremia are frequent. Care transition--a process lasting months or years--ideally prepares ALWH for adult care and can be improved by interventions that are youth-friendly and address psychosocial issues affecting ART adherence; however, such interventions are infrequently operationalized. Community-based accompaniment (CBA), in which laypeople provide individualized support and health system navigation, can improve health outcomes among adults with HIV. Here, we describe patient and provider perceptions of a novel HIV CBA intervention called "PASEO" for ALWH in Lima, Peru.
MethodsPASEO consisted of six core elements designed to support ALWH before, during, and after transition to adult HIV care. Community-based health workers provided tailored accompaniment for ALWH aged 15-21 years over 9 months, after which adolescent participants were invited to provide feedback in a focus group or in-depth interview. HIV care personnel were also interviewed to understand their perspectives on PASEO. A semi-structured interview guide probing known acceptability constructs was used. Qualitative data were analyzed using a Framework Analysis approach and emergent themes were summarized with illustrative quotes.
ResultsWe conducted 5 focus groups and 11 in-depth interviews among N=26 ALWH and 9 key-informant interviews with HIV care personnel. ALWH participants included those with both vertically- and behaviorally acquired HIV. ALWH praised PASEO, attributing increased ART adherence to the project. Improved mental health, independence, self-acceptance, and knowledge on how to manage their HIV were frequently cited. HIV professionals similarly voiced strong support of PASEO. Both ALWH and HIV professionals expressed hope that PASEO would be scaled. HIV professionals voiced concerns regarding financing PASEO in the future.
ConclusionA multicomponent CBA intervention to increase ART adherence among ALWH in Peru was highly acceptable by ALWH and HIV program personnel. Future research should determine the efficacy and economic impact of the intervention. | hiv aids |
10.1101/2022.04.13.22273827 | qPCR in a suitcase for rapid Plasmodium falciparum and Plasmodium vivax surveillance in Ethiopia | Many Plasmodium spp. infections, both in clinical and asymptomatic patients, are below the limit of detection of light microscopy or rapid diagnostic test (RDT). Molecular diagnosis by qPCR can be valuable for surveillance, but is often hampered by the absence of laboratory capacity in endemic countries. To overcome this limitation, we optimized and tested a mobile qPCR laboratory for molecular diagnosis in Ziway, Ethiopia, where transmission intensity is low. Protocols were optimized to achieve high throughput and minimize costs and weight for easy transport. 899 samples from febrile patients and 1021 samples from asymptomatic individuals were screened by local microscopy, RDT, and qPCR within a period of six weeks. 34/52 clinical Plasmodium falciparum infections were missed by microscopy and RDT. Only 4 asymptomatic infections were detected. No hrp2 deletions were observed among 25 samples typed, but 19/24 samples carried hrp3 deletions. The majority (25/41) of Plasmodium vivax infections (1371 samples screened) were found among asymptomatic individuals. All asymptomatic P. vivax infections were negative by microscopy and RDT. In conclusion, the mobile laboratory described here can identify hidden parasite reservoirs within a short period of time, and thus inform malaria control activities. | infectious diseases
10.1101/2022.04.09.22273653 | Disease profile and plasma neutralizing activity of post-vaccination Omicron BA.1 infection in Tianjin, China: a retrospective study | BackgroundSARS-CoV-2 Omicron variant BA.1 first emerged on the Chinese mainland in January 2022 in Tianjin and caused a large wave of infections. During mass PCR testing, a total of 430 cases infected with Omicron were recorded between January 8 and February 7, 2022, with no new infections detected for the following 16 days. Most patients had been vaccinated with SARS-CoV-2 inactivated vaccines. The disease profile associated with BA.1 infection, especially after vaccination with inactivated vaccines, is unclear. Whether BA.1 breakthrough infection after receiving inactivated vaccine could create a strong enough humoral immunity barrier against Omicron has not yet been investigated.
MethodsWe collected the clinical information and vaccination history of the 430 COVID-19 patients infected with Omicron BA.1. Re-positive cases and inflammation markers were monitored during the patients' convalescence phase. An ordered multiclass logistic regression model was used to identify risk factors for COVID-19 disease severity. Authentic virus neutralization assays against SARS-CoV-2 wildtype, Beta and Omicron BA.1 were conducted to examine the plasma neutralizing titers induced after post-vaccination Omicron BA.1 infection, which were compared to those of a group of uninfected healthy individuals who were selected to have a matched vaccination profile.
FindingsAmong the 430 patients, 316 (73.5%) were adults with a median age of 47 years, and 114 (26.5%) were underage with a median age of 10 years. Female and male patients accounted for 55.6% and 44.4%, respectively. Most of the patients presented with mild (47.7%) to moderate disease (50.2%), with only 2 severe cases (0.5%) and 7 (1.6%) asymptomatic infections. No deaths were recorded. 341 (79.3%) of the 430 patients received inactivated vaccines (54.3% BBIBP-CorV vs. 45.5% CoronaVac), 49 (11.4%) received adenovirus-vectored vaccines (Ad5-nCoV), 2 (0.5%) received recombinant protein subunit vaccines (ZF2001), and 38 (8.8%) received no vaccination. No vaccination was associated with a substantially higher ICU admission rate among Omicron BA.1-infected patients (2.0% for vaccinated patients vs. 23.7% for unvaccinated patients, P<0.001). Compared with adults, child patients presented with less severe illness (82.5% mild cases for children vs. 35.1% for adults, P<0.001), no ICU admission, fewer comorbidities (3.5% vs. 53.2%, P<0.001), and less chance of turning re-positive on nucleic acid tests (12.3% vs. 22.5%, P=0.019). For adult patients, compared with no prior vaccination, receiving 3 doses of inactivated vaccine was associated with significantly lower risk of severe disease (OR 0.227 [0.065-0.787], P=0.020), less ICU admission (OR 0.023 [0.002-0.214], P=0.001), lower re-positive rate on PCR (OR 0.240 [0.098-0.587], P=0.002), and shorter duration of hospitalization and recovery (OR 0.233 [0.091-0.596], P=0.002). At the beginning of the convalescence phase, patients who had received 3 doses of inactivated vaccine had substantially lower systemic immune-inflammation index (SII) and C-reactive protein than unvaccinated patients, while the CD4+/CD8+ ratio, activated Treg cells and Th1/Th2 ratio were higher compared to their 2-dose counterparts, suggesting that receipt of 3 doses of inactivated vaccine could accelerate inflammation resolution after infection.
Plasma neutralization titers against Omicron, Beta, and wildtype significantly increased after breakthrough infection with Omicron. Moderate symptoms were associated with higher plasma neutralization titers than mild symptoms. However, vaccination profiles prior to infection, whether 2 doses versus 3 doses or types of vaccines, had no significant effect on post-infection neutralization titer. Among recipients of 3 doses of CoronaVac, infection with Omicron BA.1 largely increased neutralization titers against Omicron BA.1 (8.7x), Beta (4.5x), and wildtype (2.2x), compared with uninfected healthy individuals who have a matched vaccination profile.
InterpretationReceipt of 3-dose inactivated vaccines can substantially reduce the disease severity of Omicron BA.1 infection, with most vaccinated patients presenting with mild to moderate illness. Child patients present with less severe disease than adult patients after infection. Omicron BA.1 convalescents who had received inactivated vaccines showed significantly increased plasma neutralizing antibody titers against Omicron BA.1, Beta, and wildtype SARS-CoV-2 compared with vaccinated healthy individuals.
FundingThis research is supported by Changping Laboratory (CPL-1233) and the Emergency Key Program of Guangzhou Laboratory (EKPG21-30-3), sponsored by the Ministry of Science and Technology of the People's Republic of China.
Research in contextEvidence before this studyPrevious studies (many of which have not been peer-reviewed) have reported inconsistent findings regarding the effect of inactivated vaccines against the Omicron variant. On Mar 6, 2022, we searched PubMed with the query "(SARS-CoV-2) AND ((Neutralization) OR (Neutralisation)) AND ((Omicron) OR (BA.1)) AND (inactivated vaccine)", without date or language restrictions. This search identified 18 articles, of which 13 were directly relevant.
Notably, the participants in many of these studies have received only one or two doses of inactivated vaccine with heterologous booster vaccination; other studies have a limited number of participants receiving inactivated vaccines.
Added value of this studyTo date, this is the first study to report on the protective effect of inactivated vaccines against the severe disease caused by the Omicron variant. We examine and compare the disease profiles of adults and children. Furthermore, we estimate the effect of post-vaccination Omicron infection on plasma neutralization titers against Omicron and other SARS-CoV-2 variants. Specifically, the disease profile of Omicron convalescents who had received the two-dose primary series of inactivated vaccines with or without a booster dose prior to infection is compared with that of unvaccinated patients. We also analyzed the effect of infection on neutralizing activity by comparing vaccinated convalescents with vaccinated healthy individuals with matched vaccination profiles.
Implications of all the available evidenceCompared with adults, child patients infected with Omicron tend to present with less severe disease and are less likely to turn re-positive on nucleic acid tests. Receipt of two-dose primary series or three doses of inactivated vaccine is a protective factor against severe disease, ICU admission, re-positive PCR and longer hospitalization. The protection afforded by a booster dose is stronger than two-dose primary series alone. Besides vaccination, infection with Omicron is also a key factor for elevated neutralizing antibody titers, enabling cross-neutralization against Omicron, wildtype (WT) and the Beta variant. | infectious diseases |
10.1101/2022.04.08.22273513 | SARS-CoV-2 spike 340 and 337 mutations in Omicron variants are selected after Sotrovimab infusion in immunocompromised patients | After the implementation of the monoclonal antibody sotrovimab, Rockett et al. warned on March 9th about two resistance mutations in the spike at positions 337 and 340 occurring within the first week in four immunocompromised patients infected with a Delta variant and resulting in viable infection for up to 25 days. As sotrovimab is currently the only effective treatment against the BA.1 lineage of the Omicron variant, we investigated the presence of these mutations in our 22,908 Omicron sequences generated from December 2021 to March 2022.
Among 25 Omicron sequences with S:337 and S:340 substitutions, 9 were from six patients who had available clinical data and follow-up. All were immunocompromised and presented a rapid selection of these mutations after sotrovimab monotherapy infusion.
With these findings, we underscore that although these mutations are rare, they have been exclusively reported in immunocompromised patients treated with sotrovimab. We urge that monoclonal antibody monotherapy in immunocompromised patients be considered a risk factor for the selection of escape mutants. | infectious diseases
10.1101/2022.04.08.22273532 | Heterologous Gam-COVID-Vac (Sputnik V) / mRNA-1273 (Moderna) vaccination induces a stronger humoral response than homologous Sputnik V in a real-world data analysis | IntroductionGrowing data are demonstrating safety and immunogenicity of heterologous vaccination schemes against severe acute respiratory syndrome coronavirus-2 (SARS-CoV-2) infection. This strategy opens up the possibility of a shorter path towards the end of the pandemic.
ObjectiveTo compare the homologous prime-boost vaccination scheme of Gam-COVID-Vac (Sputnik V, SpV) to its heterologous combination with mRNA-1273 (Moderna, Mod) vaccine.
MethodsSARS-CoV-2 anti-spike (S)-receptor binding domain (RBD) IgG concentration was assessed three to seven weeks after complete vaccination. Reactogenicity was evaluated by declared side events and medical assistance required until day 7 post-boost.
ResultsOf 190 participants enrolled, 105 received homologous SpV/SpV and the remaining heterologous SpV/Mod vaccination scheme, respectively. Median (interquartile range, IQR) age was 54 (37-63) years, 132 (69.5%) were female and 46 (24.2%) individuals had a prior confirmed COVID-19. Anti-S-RBD IgG median (IQR) titers were significantly higher for SpV/Mod [2511 (1476-3992) BAU/mL] than for SpV/SpV [582 (209-1609) BAU/mL, p<0.001] vaccination scheme. In a linear model adjusted for age, gender, time to the serological assay and time between doses, SpV/Mod [4.154 (6.585-615.554), p<0.001] and prior COVID [3.732 (8.641-202.010), p<0.001] were independently associated with higher anti-S-RBD IgG values. A higher frequency of mild-moderate adverse effects was associated with the heterologous scheme, although it was well tolerated by all individuals and no medical assistance was required.
ConclusionThe heterologous SpV/Mod combination against SARS-CoV-2 is well tolerated and significantly increases humoral immune response as compared to the homologous SpV/SpV immunization. | infectious diseases |
10.1101/2022.04.07.22273591 | Monitoring real-time transmission heterogeneity from incidence data | The transmission heterogeneity of an epidemic is associated with a complex mixture of host, pathogen and environmental factors, and may indicate superspreading events, which reduce the efficiency of population-level control measures and sustain the epidemic over a larger scale and a longer duration. Methods have been proposed to identify significant transmission heterogeneity in historic epidemics based on several data sources, such as contact history, viral genomes and spatial information, which are sophisticated, may not be available, and, more importantly, ignore the temporal trend of transmission heterogeneity. Here we attempted to establish a convenient method to estimate real-time heterogeneity over an epidemic. Within the branching process framework, we introduced an instant-individual heterogeneous infectiousness model to jointly characterize the variation in infectiousness both between individuals and over time. With this model, we could simultaneously estimate the transmission heterogeneity and the reproduction number from incidence time series. We validated the model with both simulated data and five historic epidemics. Our estimates of the overall and real-time heterogeneities of the five epidemics were consistent with those presented in the literature. Additionally, our model is robust to the ubiquitous bias of under-reporting and to misspecification of the serial interval. By analyzing recent data from South Africa, we found evidence that the Omicron variant might exhibit more significant transmission heterogeneity than Delta. Our model, based on incidence data, thus proved reliable in estimating real-time transmission heterogeneity.
Author summaryThe transmission of many infectious diseases is usually heterogeneous in time and space. Such transmission heterogeneity may indicate superspreading events (where some infected individuals transmit to disproportionately more susceptible people than others), reduce the efficiency of population-level control measures, and sustain the epidemic over a larger scale and a longer duration. Classical methods of monitoring epidemic spread center on the reproduction number, which represents the average transmission potential of the epidemic at the population level but fails to reflect the systematic variation in transmission. Several recent methods have been proposed to identify significant transmission heterogeneity in epidemics such as Ebola, MERS, and COVID-19. However, these methods rely on sophisticated information about confirmed cases, such as contact history, viral genomes and spatial data, which is typically field-specific and not easy to generalize. In this study, we propose a simple and generic method for estimating transmission heterogeneity from incidence time series, which provides estimates of heterogeneity consistent with those obtained from such sophisticated data. It also helps in exploring the transmission heterogeneity of the newly emerging Omicron variant. Our model enhances current understanding of epidemic dynamics and highlights the potential importance of targeted control measures. | infectious diseases
10.1101/2022.04.10.22273476 | Prevalence and risk factors of schistosomiasis among school-aged children in Al-Fashir, North Darfur state, Sudan: a cross sectional study | ObjectiveSchistosomiasis represents a significant health problem in Sudan. School-aged children who live in areas with poor sanitation are often at risk because they tend to spend time swimming or bathing in water containing infectious cercariae. Therefore, this study aimed to investigate schistosomiasis in terms of prevalence of the infection and its risk factors among school-aged children in Al-Fashir, the capital city of North Darfur state in Sudan.
ResultsIn this study, S. haematobium was detected in 6.1% of the school-aged children at Al-Fashir. Also, hematuria was detected in 85.7% of infected patients, and there was a significant correlation between hematuria and the presence of S. haematobium eggs (P value = 0.001). Regarding risk factors, a low prevalence of S. haematobium was observed in populations who depend on faucets as water sources and live in the Nifasha and Zamzam camps. | infectious diseases
10.1101/2022.04.12.22273767 | Clustering ICU patients with sepsis based on the patterns of their circulating biomarkers: a secondary analysis of the CAPTAIN prospective multicenter cohort study. | BackgroundAlthough sepsis is a life-threatening condition, its heterogeneous presentation likely explains the negative results of most trials on adjunctive therapy. This study in patients with sepsis aimed to identify subgroups with similar immune profiles and their clinical and outcome correlates.
MethodsA secondary analysis used data of a prospective multicenter cohort that included patients with early assessment of sepsis. They were described using Predisposition, Insult, Response, Organ failure sepsis (PIRO) staging system. Thirty-eight circulating biomarkers (27 proteins, 11 mRNAs) were assessed at sepsis diagnosis, and their patterns were determined through principal component analysis (PCA). Hierarchical clustering was used to group the patients and k-means algorithm was applied to assess the internal validity of the clusters.
ResultsTwo hundred and three patients were assessed, with a median age of 64.5 [52.0-77.0] years and a median SAPS2 score of 55 [49-61] points. Five main patterns of biomarkers and six clusters of patients (comprising 42%, 21%, 17%, 9%, 5% and 5% of the patients) were identified. Clusters were distinguished according to the certainty of the causal infection, inflammation, use of organ support, pro- and anti-inflammatory activity, and adaptive profile markers.
ConclusionsIn this cohort of patients with suspected sepsis, we identified clusters that may be described with criteria used to stage sepsis. As these clusters are based on the patterns of circulating biomarkers, whether they might help to predict treatment responsiveness should be addressed in further studies.
Trial registrationThe CAPTAIN study was registered on clinicaltrials.gov on June 22, 2011, # NCT01378169. | intensive care and critical care medicine |
10.1101/2022.04.15.22273847 | Toward Evaluation of Disseminated Effects of Medications for Opioid Use Disorder within Provider-Based Clusters Using Routinely-Collected Health Data | Routinely-collected health data can be employed to emulate a target trial when randomized trial data are not available. Patients within provider-based clusters likely exert and share influence on each other's treatment preferences and subsequent health outcomes; this is known as dissemination or spillover. Extending a framework to replicate an idealized two-stage randomized trial using routinely-collected health data, an evaluation of disseminated effects within provider-based clusters is possible. In this paper, we propose a novel application of causal inference methods for dissemination to retrospective cohort studies in administrative claims data and evaluate the impact of the normality of the random effects distribution for the cluster-level propensity score on estimation of the causal parameters. An extensive simulation study was conducted to study the robustness of the methods under different distributions of the random effects. We applied these methods to evaluate baseline prescription of medications for opioid use disorder among a cohort of patients diagnosed with opioid use disorder, adjusting for baseline confounders using information obtained from an administrative claims database. We discuss future research directions in this setting to better address unmeasured confounding in the presence of disseminated effects. | addiction medicine
10.1101/2022.04.11.22273703 | Comparing the efficacy and safety of direct oral anticoagulants versus Vitamin K antagonists in patients with antiphospholipid syndrome: a systematic review and meta-analysis | BackgroundThromboprophylaxis is the cornerstone strategy for thrombotic antiphospholipid syndrome (APS). Data comparing direct oral anticoagulants (DOACs) to Vitamin K antagonists (VKAs) in the secondary prevention of thrombosis in APS patients remain contentious.
ObjectivesWe aim to review and analyze literature on the efficacy and safety of DOACs compared to VKAs in treating patients with APS. A literature search was performed from inception to March 1, 2022. Subgroups were analyzed based on the risk stratification of APS profiles and different DOAC types.
ResultsA total of 9 studies with 1131 patients were included in the meta-analysis. High-risk APS patients (triple-positive APS) who used DOACs displayed an increased risk of recurrent thrombosis (RR=3.65, 95% CI: 1.49-8.93; I2=29%, P=0.005) compared to those taking VKAs. A similar risk of recurrent thrombosis or major bleeding was noted in low-risk APS patients (single or double antibody-positive) upon administering DOACs or VKAs. The use of rivaroxaban was associated with a high risk of recurrent thromboses (RR=2.63; 95% CI, 1.56-4.42; I2=0, P=0.0003), particularly recurrent arterial thromboses (RR=4.52; 95% CI, 1.99-10.29; I2=0, P=0.18), in overall APS patients. Comparisons of the rates of recurrent thrombosis events and major bleeding events when using dabigatran or apixaban versus VKAs yielded no statistically significant differences.
ConclusionsIn the absence of contraindications, this meta-analysis suggests that VKAs remain the first-choice treatment for high-risk APS patients, with DOACs a more appropriate option for low-risk APS patients. Different DOACs may exhibit different levels of efficacy and safety for thromboprophylaxis in APS patients and require further exploration. | cardiovascular medicine |
10.1101/2022.04.13.22273676 | Clinical validation of digital biomarkers and machine learning models for remote measurement of psoriasis and psoriatic arthritis | BackgroundPsoriasis and psoriatic arthritis are common immune-mediated inflammatory conditions that primarily affect the skin, joints and entheses and can lead to significant disability and worsening quality of life. Although early recognition and treatment can prevent the development of permanent damage, psoriatic disease remains underdiagnosed and undertreated, due in part to the disparity between disease prevalence and relative lack of access to clinical specialists in dermatology and rheumatology. Remote patient self-assessment aided by smartphone sensor technology may be able to address these gaps in care; however, these innovative disease measurements require robust clinical validation.
MethodsWe developed smartphone-based assessments, collectively named the Psorcast suite, that can be self-administered to measure cutaneous and musculoskeletal signs and symptoms of psoriatic disease. The image and motion sensor data collected by these assessments were processed to generate digital biomarkers or machine learning models to detect psoriatic disease phenotypes. To evaluate these digital endpoints, a cross-sectional, in-clinic validation study was performed with 92 participants across two specialized academic sites, comprising healthy controls and participants diagnosed with psoriasis and/or psoriatic arthritis.
FindingsIn the domain of skin disease, digital patient assessment of percent body surface area (BSA) affected with psoriasis demonstrated very strong concordance (CCC = 0.94 [95% CI 0.91-0.96]) with physician-assessed BSA. Patient-captured psoriatic plaque photos were remotely assessed by physicians and compared to in-clinic Physician Global Assessment parameters for the same plaque with fair to moderate concordance (CCCerythema=0.72 [0.59-0.85]; CCCinduration=0.72 [0.62-0.82]; CCCscaling=0.60 [0.48-0.72]). Arm range of motion was measured by the Digital Jar Open assessment to classify physician-assessed upper extremity involvement with joint tenderness or enthesitis, demonstrating an AUROC = 0.68 (0.47-0.85). Patient-captured hand photos were processed with object detection and deep learning models to classify clinically-diagnosed nail psoriasis with an accuracy of 0.76, which is on par with remote physician rating of nail images (avg. accuracy = 0.63), with model performance maintained when raters were too unsure or image quality was too poor for a remote assessment.
InterpretationThe Psorcast digital assessments, performed by patient self-measurement, achieve significant clinical validity when compared to in-person physical exams. These assessments should be considered appropriately validated for self-monitoring and exploratory research applications, particularly those that require frequent, remote disease measurements. However, further validation in larger cohorts will be necessary to demonstrate robustness and generalizability across populations for use in evidence-based medicine or clinical trial settings. The smartphone software and analysis pipelines from the Psorcast suite are open source and available to the scientific community.
FundingThis work is funded by the Psorcast Digital Biomarker Consortium consisting of Sage Bionetworks, Psoriasis and Psoriatic Arthritis Centers for Multicenter Advancement Network (PPACMAN), Novartis, UCB, Pfizer, and Janssen Pharmaceuticals. J.U.S work was supported by the Snyder Family Foundation and the Riley Family Foundation.
Research in contextEvidence before this studyNo systematic literature review was performed. Patient self-measurement with smartphone sensors has been shown to be clinically valid for assessing signs and symptoms such as tremor, gait, physical activity, or range of motion across multiple disease indications. While smartphone-based applications have been developed for digitally tracking psoriatic disease, they have largely focused on questionnaire-based patient-reported outcomes.
Added value of this studyTo our knowledge, Psorcast is the first application using ubiquitous smartphone sensor technology for patients to remotely measure their psoriatic disease phenotypes, including detection of nail psoriasis and a continuous outcome measure of joint tenderness and enthesitis based on range of motion. This study not only developed a suite of novel, smartphone sensor-based assessments that can be self-administered to measure cutaneous and musculoskeletal signs and symptoms, but also provided clinical validation of these measures.
Implications of all the available evidenceThe developed Psorcast suite of measurements can serve as groundwork for patient-driven, remote measurement of psoriatic disease. The use and continued development of this technology opens up new possibilities for both clinical care and research endeavors on a large scale. Psorcast measurements are currently being validated for their ability to assess disease changes longitudinally, allowing for more frequent symptom monitoring in clinical trials, more granular insight into the time course of medication action, and possible identification of responders from non-responders to specific therapies. | dermatology |
10.1101/2022.04.13.22273853 | Association between metabolic syndrome and humoral immune response to Pfizer BioNTech vaccine in healthcare workers | PurposeThe clustering of metabolic abnormalities may weaken vaccine-induced immunity, but epidemiological data regarding SARS-CoV-2 vaccines are scarce. The present study examined the cross-sectional association between metabolic syndrome (MetS) and humoral immune response to Pfizer-BioNTech vaccine among healthcare workers.
MethodsParticipants were 946 healthcare workers, aged 21-75 years, who had completed the second dose of Pfizer-BioNTech vaccine 1-3 months before the survey. MetS was defined according to the Joint Interim Statement. SARS-CoV-2 spike immunoglobulin G (IgG) antibody was measured using quantitative assays. Multivariable linear regression was used to estimate the geometric mean titers (GMT) and geometric mean ratio (GMR) of IgG titers, relative to MetS status.
ResultsA total of 51 participants (5.4%) had MetS. Healthcare workers with MetS had a significantly lower IgG titer (GMT 3882; 95% confidence interval [CI], 3124-4824) than those without MetS (GMT 5033; 95% CI, 4395-5764); the GMR was 0.77 (95% CI 0.64-0.93). The GMR for IgG titers among those having 0 (reference group), 1, 2, 3, or [≥] 4 MetS components was 1.00, 1.00, 0.89, 0.86 and 0.61, respectively (Ptrend = 0.004).
ConclusionResults suggest that having MetS and a greater number of its components are associated with a weaker humoral immune response to the Pfizer-BioNTech vaccine. | endocrinology |
10.1101/2022.04.12.22273780 | Malaria and typhoid fever coinfection among patients presenting with febrile illnesses in Ga West Municipality, Ghana | BackgroundMalaria and typhoid fever coinfection presents major public health problems, especially in the tropics and sub-tropics where malaria and typhoid fever are co-endemic. Clinicians often treat both infections concurrently without laboratory confirmation. However, concurrent treatment has public health implications, as irrational use of antibiotics or anti-malarials may lead to the emergence of drug resistance, unnecessary cost and exposure of patients to unnecessary side effects. This study determined the proportion of febrile conditions attributable to malaria and/or typhoid fever and the susceptibility patterns of Salmonella spp. isolates to commonly used antimicrobial agents in Ghana.
MethodsOne hundred and fifty-seven (157) febrile patients attending the Ga West Municipal Hospital, Ghana, from February to May 2017 were sampled. Blood samples were collected for cultivation of pathogenic bacteria, and the susceptibility of the Salmonella isolates to antimicrobial agents was determined using the Kirby-Bauer disk diffusion method with antibiotic discs on Mueller-Hinton agar plates. For each sample, conventional Widal tests for the detection of Salmonella spp were done, as well as blood film preparation for detection of Plasmodium spp. Data on the socio-demographic and clinical characteristics of the study participants were collected by interview using the Android-based software KoboCollect. Data were analyzed using Stata version 13. Logistic regression models were run to determine odds ratios (ORs) and the direction of association between dependent and independent variables, with the level of statistical significance set at p < 0.05.
ResultsOf the total number of patients aged 2-37 years (median age = 6 years, IQR 3-11), 82 (52.2%) were females. The proportion of febrile patients with falciparum malaria was 57/157 (36.3%), while Salmonella typhi O and H antigens were detected in 23/157 (14.6%) of the samples. The detection rate of Salmonella spp in febrile patients was 10/157 (6.4%). Malaria and typhoid fever coinfection using the Widal test and blood culture was 9 (5.7%) and 3 (1.9%), respectively. The isolates were highly susceptible to cefotaxime, ceftriaxone, ciprofloxacin, and amikacin but resistant to ampicillin, tetracycline, co-trimoxazole, gentamicin, cefuroxime, chloramphenicol, and meropenem.
ConclusionPlasmodium falciparum and Salmonella spp coinfections were found in only up to 1.9% of patients, while malaria and typhoid fever individually were responsible for 36.3% and 6.4% of febrile cases, respectively. Treatment of febrile conditions must be based on laboratory findings in order not to expose patients to unnecessary side effects of antibiotics and to reduce the emergence and spread of antibiotic resistance. | epidemiology
10.1101/2022.04.11.22273744 | A meta-epidemiological assessment of transparency indicators of infectious disease models | Mathematical models have become very influential, especially during the COVID-19 pandemic. Data and code sharing are indispensable for reproducing them, protocol registration may sometimes be useful, and declarations of conflicts of interest (COIs) and of funding are quintessential for transparency. Here, we evaluated these features in publications of infectious disease-related models and assessed whether there were differences before and during the COVID-19 pandemic and between COVID-19 models and models for other diseases. We analysed all PubMed Central open access publications of infectious disease models published in 2019 and 2021 using previously validated text mining algorithms of transparency indicators. We evaluated 1338 articles: 216 from 2019 and 1122 from 2021 (of which 818 were on COVID-19); almost a six-fold increase in publications within the field. 511 (39.2%) were compartmental models, 337 (25.2%) were time series, 279 (20.9%) were spatiotemporal, 186 (13.9%) were agent-based and 25 (1.9%) contained multiple model types. 288 (21.5%) articles shared code, 332 (24.8%) shared data, 6 (0.4%) were registered, and 1197 (89.5%) and 1109 (82.9%) contained COI and funding statements, respectively. There were no major changes in transparency indicators between 2019 and 2021. COVID-19 articles were less likely to have funding statements and more likely to share code. Manual assessment of 10% of the articles identified by the text mining algorithms as fulfilling transparency indicators showed that 24/29 (82.8%) actually shared code and 29/33 (87.9%) actually shared data; all had COI and funding statements, but 95.8% disclosed no conflict and 11.7% reported no funding. On manual assessment, 5/6 articles identified as registered had indeed been registered.
Transparency in infectious disease modelling is relatively low, especially for data and code sharing. This is concerning, considering the nature of this research and the heightened influence it has acquired. | epidemiology |
10.1101/2022.04.13.22273824 | Longitudinal change in cognition in older adults in Uganda: a prospective population study | IntroductionDementia is an important and growing issue in sub-Saharan Africa, but epidemiological data are lacking. Risk factors may differ from other regions due to high stroke incidence and HIV prevalence. Understanding the epidemiology of cognition in older adults in Africa is crucial for informing public health strategies to improve the lives of people with dementia and their carers.
MethodsThe Wellbeing of Older People Study in Uganda is an open cohort of adults aged 50+ with and without HIV, established in 2009. Detailed socio-demographic and health data have been collected at four waves spanning 10 years, including cognitive assessment using internationally validated WHO-recommended tests: verbal recall, digit span, and verbal fluency. Mortality data were collected until the end of the fourth wave (2019). We examined factors associated with low baseline cognition scores and with cognitive decline over time, the care needs of people with lower cognition, and the relationship between cognition and mortality.
ResultsData were collected on 811 participants. Older age, lower educational attainment, lower socio-economic position, and extremes of BMI were associated with lower cognition scores. Cognition declined faster at older ages, but rate of decline was not associated with cardiovascular disease or HIV. People with lower cognition required more assistance with Activities of Daily Living, but mortality rates were similar across the range of cognition.
ConclusionsThe crucial next step will be investigating the determinants of low cognition scores and clinical dementia, to better understand the clinical relevance of these findings to inform public health planning in sub-Saharan Africa. | epidemiology |
10.1101/2022.04.13.22273435 | Understanding and comparing the globe-trotting cancer patient with the locally managed patient: A case control study | Medical tourism is characterized by people seeking treatment abroad for various medical conditions and for varied reasons; many of these patients benefit from specialized care for non-communicable diseases. Conversely, there are associated negative effects such as medical complications and weakened health systems. Currently, there is a paucity of scientific evidence on the factors that influence patients to seek treatment abroad. This study compared patient-related factors associated with the choice of a cancer treatment center locally or abroad, to understand the reasons for seeking treatment outside Kenya.
Materials and MethodsIn this case-control study, 254 cancer patients were randomly sampled to compare responses between those who chose to receive treatment abroad and those treated in Kenya. The cases were recruited from the Ministry of Health, and the controls from Kenyatta National Hospital and Texas Cancer Center. Data were analyzed using SPSS Software Version 21. Descriptive statistics, bivariate analysis and multiple logistic regression were carried out. The level of significance was set at 5%.
ResultsOut of 254 respondents, 174 (68.5%) were treated for cancer in Kenya and 80 (31.5%) in India. We found that cost effectiveness was a significant factor for over 73% of all respondents. The study revealed that the independent predictors of seeking treatment in India were: a monthly income higher than US$ 250; time since diagnosis (every additional month increased the likelihood by 1.16 times); physician advice (odds ratio (OR) 66; 95% confidence interval (CI) 7.9-552.9); advice from friends and family (OR 42; 95% CI 7.07-248.6); and perception of better quality of care (OR 22.5; 95% CI 2.2-230.6).
ConclusionThe reasons patients with cancer sought treatment in India are multifactorial. Several of these can be addressed to reverse outward-bound medical tourism and position Kenya as a regional hub, as per the country's development blueprint. This will require strengthening the health system accordingly and sensitizing the medical fraternity and general public on the same. | epidemiology
10.1101/2022.04.07.22273592 | Microbiological risk ranking of foodborne pathogens and food products in scarce-data settings. | In the absence of epidemiological, microbiological or outbreak data, systematic identification of the hazards and food products posing the highest risk to consumers is challenging. It is usually in low- and middle-income countries (LMICs), where the burden of foodborne disease is highest, that data tend to be particularly scarce. In this study, we propose qualitative risk-ranking methods for pathogens and food products that can be used in settings where scarcity of data on the frequency/concentration of pathogens in foodstuffs is a barrier to the use of classical risk assessment frameworks. The approach integrates existing knowledge on foodborne pathogens, manufacturing processes and intrinsic/extrinsic properties of food products with key context-specific information regarding the supply chain(s), characteristics of the Food Business Operators (FBOs) and cultural habits to identify: (i) the pathogens that should be considered a "High" food safety priority and (ii) the food products posing the highest risk of consumer exposure to microbiological hazards via the oral (ingestion) route. When applied to the dairy sector of Andhra Pradesh (India) as a case study, E. coli O157:H7, Salmonella spp., S. aureus and L. monocytogenes were identified as a "High" food safety priority across all FBOs, C. sakazakii a "High" priority for the FBOs producing infant formula/milk powder, and Shigella spp. and Cryptosporidium spp. a "High" priority when considering the FBOs operating towards the informal end of the formal-informal spectrum. The risk ranking of dairy products was informed by a preliminary cluster analysis for early identification of products that are similar with regard to intrinsic/extrinsic features known to drive the microbiological risk.
Products manufactured/retailed by FBOs in the informal market were considered to pose a "High" risk for consumers due to a widespread lack of compliance with sanitary regulations. For dairy products produced by FBOs operating in the middle and formal end of the formal-informal spectrum, the risk of consumer exposure to microbiological hazards ranged from "Medium" to "Extremely low" depending on the FBO and the intrinsic/extrinsic properties of the products. While providing risk estimates of lower resolution compared with data-driven risk assessments, the proposed method maximises the value of the information that can be easily gathered in LMICs and provides informative outputs to support food safety decision-making in contexts where resources allocated to the prevention of foodborne diseases are limited and the food system is complex. | epidemiology
10.1101/2022.04.15.22273915 | Effectiveness of COVID-19 mRNA vaccine booster dose relative to primary series during a period of Omicron circulation | During a period of Omicron variant circulation, we estimated relative VE of COVID-19 mRNA booster vaccination versus primary two-dose series in an ongoing community cohort. Relative VE was 66% (95% CI: 46%, 79%) favoring the booster dose compared to primary series vaccination. Our results support current booster recommendations. | epidemiology |
10.1101/2022.04.07.22273549 | Dose-response modelling of endemic coronavirus and SARS-CoV-2: human challenge trials reveal the individual variation in susceptibility | We propose a mathematical framework to analyze and interpret the outcomes of human challenge trials. We present plausible infection risks with HCoV-229E and SARS-CoV-2 over a wide range of infectious doses, and suggest ways to improve the design of future trials and to translate their outcomes to the general population.
One sentence summaryWe rephrase dose-response models in terms of heterogeneity in susceptibility in order to present the possible range of infection risks for endemic coronaviruses and SARS-CoV-2 | epidemiology |
10.1101/2022.04.11.22273704 | Effectiveness of a psychosocial training intervention in reducing psychological distress among caregivers of intellectually disabled children in Malawi - a randomized wait-list trial | BackgroundRates of disability are high in resource-poor settings, with eighty-five percent of disabled children living in these settings. Long-term caregiving for disabled children is associated with fatigue, financial difficulties, parenting distress, and other psychological issues. Studies have shown a link between parenting children with intellectual disabilities and psychological distress, as well as overall Health-Related Quality of Life (QoL). However, with interventions, these negative impacts may not be as severe as previously thought. This study aimed to develop and test the impact of a contextualized psychological intervention, Titukulane, in reducing psychological distress among caregivers of children with intellectual disabilities in Malawi.
MethodsWe conducted a randomized waitlist trial of a psychosocial training intervention (Titukulane) provided to caregivers of children with intellectual disabilities. Caregivers of children with intellectual disabilities aged 0 to 18 years were recruited, screened, and then enrolled in the trial through two disability organizations operating in Mzuzu (St John of God) and Lilongwe (Children of Blessings). They were then randomized in blocks to the Titukulane intervention or the waitlist, and provided with the intervention or standard care for 3 months, respectively. Assessments of socioeconomic status, age, gender, and maternal psychological distress (through the Self-Reporting Questionnaire, SRQ) were conducted at baseline and follow-up.
ResultsWe found that psychological distress on the SRQ was significantly lower among caregivers of children with intellectual disabilities in the Titukulane intervention group than in the waitlist control group, even when the confounding variables of age, gender, and socioeconomic status were taken into account (Cohen's d = 0.08; CI = 0.33-0.754; p = 0.0005).
ConclusionsPsychosocial interventions such as the Titukulane intervention provided over a few months can improve caregiver mental health and quality of life - an important factor for supporting families of children with intellectual disability. | psychiatry and clinical psychology
10.1101/2022.04.16.22273926 | Growing up unloved: the enduring consequence of childhood emotional neglect on the qualia of memory and imagination. | Childhood Adversity (CA) is one of the strongest factors associated with the onset of Major Depressive Disorder (MDD), and both CA and MDD have been linked to altered hippocampal structure/function. The current study aimed to explore the relationships between retrospectively reported childhood emotional neglect (CEN), current wellbeing and depressive symptoms, and a range of hippocampal-dependent cognitive functions, i.e., anterograde learning and memory, episodic memory recollection, and imagination (episodic future thinking and scene construction). In two-wave recruitment periods at undergraduate intake 2014-15 (Cohort 1) and 2016-17 (Cohort 2), a combined cohort of n=1485 university students completed online surveys, with n=64 further participating in an experimental testing session. As anticipated, higher CEN ratings consistently correlated with poorer current wellbeing and higher depressive symptoms. However, whilst the anticipated relationships between CEN, current wellbeing, and subjectively reported estimates of hippocampal-dependent cognitions were observed in the data reported in the online survey, an unexpectedly circumscribed pattern was observed on formal in-person examination of these cognitive functions. More specifically, higher CEN related to less vivid and less detailed imagined future/scene constructions and to an attenuated sense of presence and emotional valence during these simulations. A similar pattern was not evident when participants simulated experienced past events (i.e. episodic memories). Current depression scores did not consistently correlate with vividness, detail, or emotional valence. In addition, and contrary to expectation, no relationship between CEN, depressive symptoms, and the spatial coherence of imagined or recollected events was seen.
Moreover, neither CEN nor depressive symptoms correlated with many key measures of anterograde memory. Hence, we observed a highly specific constellation of impairment related to CEN when explored on a simulation-by-simulation basis, one that was not obviously linked to altered hippocampal function, indicating that the relationship between CEN, hippocampal function, and subsequent psychopathology may not be readily explained by either spatial or mnemonic hippocampal-related deficits. We consider whether the observed experiential differences in the qualia of imagined simulations may represent an important therapeutic target to decrease a CEN-driven latent vulnerability to MDD. | psychiatry and clinical psychology
10.1101/2022.04.14.22273886 | To mask, or not to mask, Alice and Bob's dating dilemma | Face masking in the current COVID-19 pandemic seems to be a deceptively simple decision-making problem due to its multifaceted nature. Questions arising from masking span biomedicine, epidemiology, physics, and human behaviors. While science has shown masks work generally, human behaviors (particularly under the influence of politics) complicate the problem significantly, given that science generally assumes rationality and our minds are not always rational and/or honest. Minding minds, a legitimate concern, can also make masking legitimately confusing. To disentangle the potential confusions, particularly the ramifications of irrationality and dishonesty, here we resort to evolutionary game theory. Specifically, we formulate and analyze the masking problem with a fictitious pair of young lovers, Alice and Bob, as a Sir Philip Sydney (SPS) evolutionary game, inspired by the handicap principle in evolutionary biology and cryptography figures in computer science. With the proposed ABD (Alice and Bob's dating dilemma) as an asymmetric four-by-four strategic-form game, 16 strategic interactions were identified, of which six may reach equilibria with different characteristics such as separating, pooling, and polymorphic hybrid, being Nash, evolutionarily stable or neutrally stable. The six equilibrium types seem to mirror the diverse behaviors of mask believers, skeptics, the converted, universal masking, voluntary masking, and coexisting and/or divided worlds of believers and skeptics. We suggest that the apparently simple ABD game is sufficiently general not only for studying masking policies for populations (via replicator dynamics), but also for investigating other complex decision-making problems of the COVID-19 pandemic, including lockdown vs. reopening, herd immunity vs. quarantines, and aggressive tracing vs. privacy protection. | public and global health
10.1101/2022.04.13.22273828 | Protective efficacy of holed and aging PBO-pyrethroid synergist-treated nets on malaria infection prevalence in north-western Tanzania | Two billion pyrethroid long-lasting insecticidal nets (LLINs) have been distributed since 2010 for malaria prevention in Sub-Saharan Africa. Current malaria control strategies rely on an assumed effective 3-year lifespan for LLINs. PBO synergist LLINs are a newly recommended class, but there is limited information on their lifespan and long-term protective efficacy in communities. To assess their operational survival, a cohort of 390 PBO LLINs (Olyset Plus) and 367 standard pyrethroid LLINs (Olyset net) from 396 households was followed for 36 months in Western Tanzania. To assess the association between the condition of the LLIN and malaria infection, nets from at least 480 randomly selected households were assessed during malaria prevalence cross-sectional surveys at 4, 9, 16, 21, 28, and 33 months post-distribution. Information on the presence and condition of nets, and demographic information from the household, was collected to evaluate factors influencing net durability. After 3 years, less than 17% of the nets distributed were available for sleeping. Fabric condition was not associated with malaria infection for either type of net. The difference between the net types was highest when nets were between 1-2 years old, when PBO nets appeared to be as protective as nets less than a year old, whereas standard nets were considerably less protective as they aged, regardless of fabric condition. There was no statistical difference in the estimated median functional survival time between net types, with 1.6 years (95% CI 1.38-1.87) for PBO LLINs and 1.9 years (95% CI 1.67-2.06) for standard LLINs. After 3 years, there was a loss of 55% of permethrin content for both nets, and 97% of PBO content was lost in PBO LLINs.
These results highlight that functional survival is less than the recommended 3 years for both net types. However, even as the nets aged, the PBO nets remained more protective than standard nets, regardless of their condition. | public and global health
10.1101/2022.04.14.22273838 | Cycling for climate and public health: a missed opportunity for France | In addition to its potential contribution to reaching climate targets, cycling may generate substantial population-level health benefits through the physical activity it requires. Due to the lack of nationally representative mobility data, the health impact of current levels of cycling is still unknown for France. Relying on a health impact assessment framework and using recent nationally representative data on mobility, we assessed the health and related economic benefits of cycling in 2018-2019 in France. We show that such benefits remain moderate and fall short when compared to those estimated in other countries with high cycling levels. We argue that cycling in France did not receive the attention and investment it deserves over the past decade and thus represents a missed opportunity for climate action and public health. | public and global health
10.1101/2022.04.11.22273607 | Animals in higher education settings: Do animal-assisted interventions improve mental and cognitive health outcomes of students? A systematic review and meta-analysis | BackgroundDue to the high burden of mental health issues among students at higher education institutions worldwide, animal-assisted interventions (AAIs) are being increasingly used to relieve student stress. The objective of this study was to systematically review the effects of AAIs on the mental and cognitive health outcomes of higher education students.
MethodsRandomized controlled trials using any unfamiliar animal as the sole intervention tool were included in the systematic review. Study quality was assessed using the Cochrane Risk-of-Bias tool. Where possible, effect sizes (Hedges' g) were pooled for individual outcomes using random-effects meta-analyses. Albatross plots were used to supplement the data synthesis.
ResultsOf 2,401 identified studies, 35 were included. Almost all studies used dogs as the intervention animal. The quality of most included studies was rated as moderate. Studies showed an overall reduction of acute anxiety (g= -0.57 (95%CI -1.45;0.31)) and stress. For other mental outcomes, studies showed an overall small reduction of negative affect (g= -0.47 (95%CI -1.46;0.52)), chronic stress (g= -0.23 (95%CI -0.57;0.11)) and depression, as well as small increases in arousal, happiness and positive affect (g= 0.06 (95%CI -0.78;0.90)). Studies showed no effect on heart rate and heart rate variability, a small reduction in salivary cortisol and mixed effects on blood pressure. No effect on cognitive outcomes was found.
ConclusionOverall, evidence suggests that AAIs are effective at improving mental, but not physiological or cognitive outcomes of students. Strong methodological heterogeneity between studies limited the ability to draw clear conclusions. | public and global health |
10.1101/2022.04.12.22273589 | Profile of Brazilian inpatients with COVID-19 vaccine breakthrough infection and risk factors for unfavorable outcome | ObjectiveTo characterize the epidemiological and clinical profile of individuals more likely to become infected by SARS-CoV-2 after the full vaccination schedule, in order to profile priority groups to receive a booster dose in situations of vaccine dose shortage, as well as for maintenance of personal protective care.
MethodsData from hospitalized COVID-19 patients who had been fully vaccinated and had a SARS-CoV-2 infection positive diagnosis were collected from the SIVEP-Gripe database (Influenza Epidemiological Surveillance Information System) from January 18, 2021 to September 15, 2021. Demographic data, clinical symptoms/signs and preexisting medical conditions (comorbidities) were analyzed. The primary outcome was in-hospital death.
ResultsThe majority of hospitalized patients with vaccine breakthrough infection were elderly (≥60 years old), male, with critical or severe COVID-19. The fatality rate was extremely high (50.27%) and more pronounced in elderly groups. The most prevalent symptoms were cough, dyspnoea, respiratory distress, and low blood oxygen saturation. The most frequent comorbidities were heart disease and diabetes. High fatality rates were observed among patients admitted to the intensive care units (72.88%) and those who required invasive mechanical ventilation (87.82%). The main risk factors for an unfavorable outcome were older age, respiratory compromise, inactivated virus vaccine immunization, and preexisting medical conditions.
ConclusionsWe characterize the profile of hospitalized Brazilian patients with COVID-19 vaccine breakthrough infection and the risk factors for an unfavorable outcome. These data have made it possible to identify priority groups to receive a booster dose, in addition to not neglecting personal protection. | public and global health |
10.1101/2022.04.11.22273494 | Poor Working Relationship between Doctors and Hospital Managers: A Systematic Review | BackgroundThe problem of poor relationships between doctors and hospital managers is a common feature of many healthcare systems worldwide, including the United Kingdom's NHS. Despite the significant impact that a poor working relationship between doctors and managers could have on the quality of care, there is limited research in this area.
ObjectivesTo investigate the organisational factors contributing to the poor working relationship between doctors and hospital managers, with a view to recommending potential solutions to address them.
MethodsWe performed a systematic literature review: a comprehensive search of AMED, MEDLINE, CINAHL Plus with Full Text, SPORTDiscus and EBSCO eBooks from January 2000 to July 2019, updated in March 2022; no further articles meeting the selection criteria were found in the update. Mixed-methods, qualitative and quantitative studies published in English in peer-reviewed journals between January 2000 and March 2022 were included. Study selection, data extraction and appraisal of studies were undertaken by the authors. Quality criteria were selected from the CASP Checklist.
ResultsA total of 49,340 citations were retrieved and screened for eligibility; 41 articles were assessed as full text and 15 met the inclusion criteria. These included 2 mixed-methods studies, 8 qualitative studies, and 5 quantitative studies. A thematic analysis was undertaken, and narrative summaries were used to synthesise the findings.
ConclusionThe findings of this systematic review show strong evidence of poor collaboration and lack of effective communication that contribute to poor working relationships between physicians and hospital administrators. The results from this review may guide the development of a hospital plan that involves both doctors and managers in the decision-making process regarding the quality of patient care, which could potentially enhance the relationship between the two groups as it would build trust between them.
What is already known on this topic
- The problem of poor relationships between doctors and hospital managers is a common feature of many healthcare systems worldwide.
- Despite the significant impact this poor relationship could have on the quality of care and patient satisfaction, there is limited research in this area.
What this study adds
- We conducted a systematic review on the effect of tension between doctors and hospital managers on the quality of care provided in hospitals or healthcare centres; 15 (qualitative, quantitative and mixed) primary papers were reviewed.
- This qualitative systematic study found considerable evidence of organisational factors that contribute to poor working relationships between doctors and managers. | public and global health
10.1101/2022.04.15.22273859 | Effectiveness of ChAdOx1 nCoV-19 Corona Virus Vaccine (CovishieldTM) in preventing SARS-CoV2 infection, Chennai, Tamil Nadu, India, 2021 | BackgroundIndia experienced the second wave of the COVID-19 pandemic in March 2021, driven by the delta variant. Apprehensions around the usefulness of vaccines against the delta variant posed a risk to the vaccination program. Therefore, we estimated the effectiveness of two doses of the ChAdOx1 nCoV-19 (Covishield) vaccine against COVID-19 infection among individuals ≥45 years in Chennai, India.
MethodsA community-based cohort study was conducted from May to September 2021 in a selected geographic area in Chennai, Tamil Nadu. The estimated sample size was 10,232. We enumerated individuals from all eligible households and periodically updated vaccination and COVID-19 infection data. We computed vaccine effectiveness with its 95% confidence interval for two doses of the Covishield vaccine against any COVID-19 infection.
ResultsWe enrolled 69,435 individuals, of which 21,793 were above 45 years. Two-dose coverage of Covishield in the 18+ and 45+ age groups was 18% and 31%, respectively. The overall incidence of COVID-19 infection was 1099 per 100,000 population. The vaccine effectiveness against COVID-19 disease in the ≥45 age group was 61.3% (95% CI: 43.6 - 73.4) at least two weeks after receiving the second dose of Covishield. Genomic analysis of 74 (28 with two doses, 15 with one dose, and 31 with zero dose) of the 90 aliquots collected from the 303 COVID-19 positive individuals in the 45+ age group showed delta variants and their sub-lineages.
ConclusionWe demonstrated the effectiveness of two doses of the ChAdOx1 vaccine against the delta variant in the general population of Chennai. We recommend similar future studies considering emerging variants and newer vaccines. Two-dose vaccine coverage could be ensured to protect against COVID-19 infection. | public and global health |
10.1101/2022.04.14.22273891 | Medical Exemptions to Vaccination - Riverside County, California, 2016-2019 | ObjectivesTo review the nature and clinical reasoning of medical exemptions to vaccination received in Riverside County, CA after California Senate Bill 277 eliminated personal belief exemptions statewide.
Methods614 deduplicated medical exemptions to vaccination from 156 providers were reviewed from August 2016 to August 2019. Exemptions covering all vaccines were additionally coded for number and category of medical justification.
Results81.3% of reviewed exemptions were for all vaccines, 91.0% were permanent or indefinite, and 74.9% were for all vaccines and permanent or indefinite. Of the 490 evaluated all-vaccine exemptions, a median of two and maximum of ten justifications were cited per exemption, most often a family history of autoimmune disease other than allergy. Three providers wrote more than 70 exemptions each.
ConclusionsThe number and nature of the exemptions reviewed here raise concerns over their impact on immunization rates, and future policies may need to ensure additional oversight. | public and global health |
10.1101/2022.04.07.22273575 | Monogenetic Rare Diseases in Biomedical Databases and Text Mining | The testing of pharmacological hypotheses is becoming faster and more accurate, but at the same time more difficult, than even two decades ago. It takes more time to collect and analyse disease mechanisms and experimental facts across various specialized resources. We discuss a new approach to aggregating individual pieces of information about a single disease using Elsevier's automated text-mining technology. The developed algorithm allows for the collection of published facts in a unified format, starting only with the name of the disease. A special template combining research and clinical descriptions of diseases was developed. The approach was tested, and information was collected for 55 rare monogenic diseases. Clinical, molecular, and pharmacological characteristics of the diseases, with supporting references from the literature, are available in the form of tables and files. Manually curated templates for 10 rare diseases, including the top-ranked Cystic Fibrosis and Huntington's disease, were published to demonstrate the results of the described approach. | medical education
10.1101/2022.04.16.22273927 | Memory and Concentration Skills In A Sample of First Grade Medical Students at University of Baghdad/College of Medicine | ObjectivesThe purpose of this study is to assess the level of memory skills and concentration skills among first year medical students in College of Medicine/University of Baghdad depending on global scale (Study Skills Inventory SSI).
Subjects and MethodA cross-sectional study was conducted to assess memory and concentration skills among first-year medical students at the College of Medicine/University of Baghdad, using an online survey in September 2020. A sample of 103 students participated in the study by completing an online questionnaire modified from the Study Skills Inventory (SSI).
Regarding memory skills, a score of less than 30 was considered not adequate, while regarding concentration skills, a score of less than 35 was considered not adequate.
ResultsThe percentage of males was 68% and the percentage of females was 32%. Regarding studying hours, we found that 59.2% of students studied less than 3 hours, 25.2% studied between 3-6 hours and 15.5% studied more than 6 hours. The mean score of the students for concentration skills was 36.45 and for memory skills was 32.40. Regarding concentration skills, 35% of students had a non-adequate score and 65% had an adequate score; regarding memory skills, 28.2% of students had a non-adequate score and 71.8% had an adequate score. There was a statistically significant association between concentration skills and studying hours, and a statistically significant association between memory skills and studying hours. There was a statistically significant moderate positive correlation between students' concentration skill scores and memory skill scores (r = 0.511, p < 0.01).
ConclusionAbout 75% of 1st-year medical students have adequate concentration and memory skills. The students who studied for 3-6 hours daily had the lowest mean scores on both skills, with 42.3% of them having an adequate concentration skills score and 57.7% having an adequate memory skills score. Further studies with larger sample sizes are needed to correlate concentration and memory skills with students' end-of-year average total scores. | medical education
10.1101/2022.04.07.22273579 | Addition of formaldehyde releaser imidazolidinyl urea and MOPS buffer to urine samples enables delayed processing for flow cytometric analysis of urinary cells | Kidney diseases are a major health concern worldwide. Currently there is a large unmet need for novel biomarkers to non-invasively diagnose and monitor kidney diseases. Urinary cells are promising biomarkers, and their analysis by flow cytometry has demonstrated its utility in diverse clinical settings. However, to date this methodology has depended on fresh samples, as cellular event counts and the signal-to-noise ratio deteriorate over time.
Here we developed an easy-to-use two-step preservation method for the conservation of urine samples for subsequent flow cytometry. The protocol utilizes a combination of the formaldehyde-releasing agent imidazolidinyl urea (IU) and MOPS buffer, leading to gentle fixation of urinary cells. The preservation method extends the acceptable storage time of urine samples from several hours to up to 6 days. Cellular event counts and staining properties of cells remain comparable to fresh untreated samples.
The preservation method presented here facilitates future investigations of flow cytometry of urinary cells as potential biomarkers and may enable broad implementation in clinical practice. | nephrology
10.1101/2022.04.10.22273666 | Multiple Cost Optimisation for Alzheimer's Disease Diagnosis | Current machine learning techniques for dementia diagnosis often do not take into account real-world practical constraints, which may include, for example, the cost of diagnostic assessment time and financial budgets. In this work, we built on previous cost-sensitive feature selection approaches by generalising to multiple cost types, while taking into consideration that stakeholders attempting to optimise the dementia care pathway might face multiple non-fungible budget constraints. Our new optimisation algorithm involved searching cost-weighting hyperparameters while constrained by total budgets. We then provided a proof of concept using both assessment time cost and financial budget cost. We showed that budget constraints could control the feature selection process in an intuitive and practical manner, while adjusting the hyperparameter increased the range of solutions selected by feature selection. We further showed that our budget-constrained cost optimisation framework could be implemented in a user-friendly graphical user interface sandbox tool to encourage non-technical users and stakeholders to adopt, further explore and audit the model - a human-in-the-loop approach. Overall, we suggest that setting budget constraints initially and then fine-tuning the cost-weighting hyperparameters can be an effective way to perform feature selection where multiple cost constraints exist, which will in turn lead to more realistic optimising and redesigning of dementia diagnostic assessments.
Clinical RelevanceBy optimising diagnostic accuracy against various costs (e.g. assessment administration time and financial budget), predictive yet practical dementia diagnostic assessments can be redesigned to suit clinical use. | neurology |
10.1101/2022.04.09.22273637 | Intellectual enrichment and genetic modifiers of cognitive function in Huntington's disease | An important step towards the development of treatments for cognitive impairment in ageing and neurodegenerative diseases is to identify genetic and environmental modifiers of cognitive function and understand the mechanism by which they exert an effect. In Huntington's disease, the most common autosomal dominant dementia, a small number of studies have identified intellectual enrichment, i.e. a cognitively stimulating lifestyle, and genetic polymorphisms as potential modifiers of cognitive function. The aim of our study was to further investigate the relationship and interaction between genetic factors and intellectual enrichment on cognitive function and brain atrophy in Huntington's disease. For this purpose, we analysed data from Track-HD, a multi-centre longitudinal study in Huntington's disease gene-carriers, and focused on the role of intellectual enrichment (estimated at baseline) and the genes FAN1, MSH3, BDNF, COMT and MAPT in predicting cognitive decline and brain atrophy. We found that carrying the 3a allele in the MSH3 gene had a positive effect on global cognitive function and brain atrophy in multiple cortical regions, such that 3a allele carriers had a slower rate of cognitive decline and atrophy compared to non-carriers, in agreement with its role in somatic expansion instability. No other genetic predictor had a significant effect on cognitive function, and the effect of MSH3 was independent of intellectual enrichment. Intellectual enrichment also had a positive effect on cognitive function; participants with higher intellectual enrichment, i.e. those who were better educated, had higher verbal IQ and performed an occupation that was intellectually engaging, had better cognitive function overall, in agreement with previous studies in Huntington's disease and other dementias.
We also found that intellectual enrichment interacted with the BDNF gene, such that the positive effect of intellectual enrichment was greater in Met66 allele carriers than non-carriers. A similar relationship was also identified for changes in whole brain and caudate volume; the positive effect of intellectual enrichment was greater for Met66 allele carriers, rather than non-carriers. In summary, our study provides additional evidence for the beneficial role of intellectual enrichment and of carrying the 3a allele in MSH3 in cognitive function in Huntington's disease, and for their mechanism of action. | neurology
10.1101/2022.04.08.22273614 | Effect of fasting therapy on vitamin D, vitality and quality of life. A randomized control trial | BackgroundThe aim of the present study was to determine the effects of prolonged fasting (10 days) on vitamin D and B12 levels, body mass index (BMI), weight, hemoglobin, vitality and quality of life (QoL) compared to a normal diet.
MethodsThis randomized control trial included 52 participants (aged 19-74 years) randomized into a fasting group (FG) or a normal diet group (NDG), with 26 participants in each group. The study was conducted in an in-patient setting where the FG followed a fasting diet (500 kcal/day) which included holy basil herbal tea, lemon honey juice and water (3 L). The NDG (1500 kcal/day) consumed a routine diet that included Indian breads, pulses, steamed rice, vegetable salads and beverages.
ResultsThe FG showed a significant increase in vitamin D levels (p=0.003, d=0.475), vitality (p=0.006, d=0.425), physical QoL (p<0.001, d=0.549), psychological QoL (p=0.002, d=0.488) and environmental QoL (p=0.004, d=0.457) compared to the NDG. No significant changes were observed in vitamin B12, weight, BMI, hemoglobin or social QoL. A weak to moderate (ρ = 0.330-0.483) positive correlation was observed between vitality scores and QoL domains, whereas BMI scores showed an inverse correlation (ρ = -0.280) with vitamin D levels.
DiscussionThe results suggest that prolonged fasting can improve vitamin D levels and vitality and promote quality of life compared to a normal diet. Unlike in previous studies, the FG did not differ from the NDG with respect to weight and BMI. Nevertheless, fasting may be utilized as an effective tool to tackle vitamin D deficiency and associated health insufficiencies.
Trial RegistryClinical Trial Registry of India CTRI/2022/02/040446. | nutrition |
10.1101/2022.04.06.22273454 | Impact of a Multidisciplinary Nutritional Support Team (MNST) on Quality Improvement in Home Parenteral Nutrition (HPN) | IntroductionHPN is essential for patients requiring long term nutritional support. This Quality Improvement Project for HPN Patients (QIP-PN) studied the effect of a Physician Nutrition Expert (PNE)- led multidisciplinary nutritional support team (MNST) on HPN care.
ObjectiveTo test the effect of an MNST on adherence to protocols, outcomes and QOL in HPN.
MethodsData review was conducted in phases 1a and 1b, observation in phase 2 and intervention in phase 3. In phase 3 the MNST made recommendations to treating physicians. All long-term HPN patients were offered study participation; 75 signed consent forms. Treating physician study participation agreements were signed for 42 patients (the study group). A random sample of 30 long-term HPN patients comprised the case-matched control group (control). Data were collected on demographics, treating physicians' PNE status, HPN care variables, recommended interventions, quality-of-life assessment, adverse outcomes and hospitalizations. An independent-samples t-test was used for continuous data between the study group and control group. A paired t-test was used for phase 2 and phase 3 patient data. Comparison between the study and control groups utilized a negative binomial regression model. Statistical analysis utilized R (https://www.r-project.org/).
Results34 patients were reviewed retrospectively in phase 1a and 197 prospectively in phase 1b. 40 study patients completed phase 2 and progressed into phase 3, of whom 30 completed ≥60 therapy days. Improvements in weight, BMI and QOL were seen in the study patients during the intervention. Recommendations made and accepted by treating physicians differed based on PNE status. Study patients had fewer adverse outcomes and related hospitalizations than case-matched controls.
ConclusionMNST recommendations improved clinical and biochemical parameters and patients' self-reported overall health. They reduced adverse outcomes, hospitalizations and hospital length of stay. MNST input could have a significant impact on the quality and cost of HPN care. | nutrition
10.1101/2022.04.08.22273610 | Protocol for the pilot study of group video yogic breathing app in breast cancer survivors | IntroductionBreast cancer remains a leading cause of cancer deaths; however, recent improvements in treatment have improved survivorship. As a result of this improvement, more individuals are living with the long-term side effects of cancer treatment. Therefore, methods that incorporate lifestyle and mind-body approaches are becoming increasingly used in the patient treatment pathway.
MethodsIn this study, PranaScience Institute will develop and test a group video mobile application for Yogic Breathing (YB). YB has been shown to reduce symptoms associated with several conditions, including breast cancer. For this initial feasibility study, PranaScience will collaborate with the Medical University of South Carolina to implement the study app-based program in breast cancer survivors. This research aims to determine whether YB can be delivered via an app, whether participants are able to practice it satisfactorily, and whether the YB practice provides any symptom relief. In the control group, participants will be directed to the Attention Control (AC) feature of the app, which guides users to focus on a mindfulness activity not involving YB. Participants will be randomly assigned to the YB or AC study plan (N=20 per group). Breast cancer survivors who have completed radiation therapy within the last 6 months will be recruited for this study and provided access to the app for a 12-week program. The study app will record total practice times. Virtual visits by a study yoga instructor during group video sessions will measure participant compliance with proper technique. Feasibility will be examined by evaluating intervention delivery factors and resource needs. Acceptability of using the mobile study app to support symptom management will be evaluated using a satisfaction and system usability scale. Behavioral survey measures will help guide effect sizes and power calculations for the next larger-scale study. Biomarkers in the saliva (tumor suppressors, cytokines) and fingernails (cortisol, differential proteomics) will be measured at baseline and at the end of the study at 12 weeks.
DiscussionAll findings from this pilot study will be synthesized to refine the mobile study app in preparation for large-scale evaluation in Phase II involving all-study site participants with cancer. ClinicalTrials.gov Identifier NCT05161260. | oncology |
10.1101/2022.04.13.22273856 | Comparison of the accuracy of 9 intraocular lens power calculation formulas using partial coherence interferometry | PurposeTo compare the accuracy of 9 intraocular lens (IOL) power calculation formulas (SRK/T, Hoffer Q, Holladay 1, Haigis, Barrett Universal II, Kane, EVO 2.0, Ladas Super formula and Hill-RBF 3.0) using partial coherence interferometry (PCI).
MethodsData from patients having uncomplicated cataract surgery with the insertion of 1 of 3 IOL types were included. All preoperative biometric measurements were performed using PCI. Prediction errors (PE) were deduced from refractive outcomes evaluated 3 months after surgery. The mean prediction error (ME), mean absolute prediction error (MAE), median absolute prediction error (MedAE), and standard deviation of prediction error (SD) were calculated, as well as the percentage of eyes with a PE within ±0.25, ±0.50, ±0.75 and ±1.00D for each formula.
ResultsIncluded in the study were 126 eyes of 126 patients. Kane achieved the lowest MAE and SD across the entire sample as well as the highest percentage of PE within ±0.50D, and was proven to be more accurate than Haigis and Hoffer Q (P <.001). For an axial length of more than 26.0 mm, EVO 2.0 and Barrett obtained the lowest MAEs, with EVO 2.0 and Kane showing a higher percentage of prediction at ±0.50D compared to old generation formulas except for SRK/T (P =.04).
ConclusionAll investigated formulas achieved good results; there was a tendency towards better outcomes with new generation formulas, especially in atypical eyes. | ophthalmology |
10.1101/2022.04.11.22273701 | SYSTEMATIC REVIEW EXPLORING THE IMPACT OF SOCIO-CULTURAL FACTORS ON PAIN MANAGEMENT APPROACHES IN SUB-SAHARAN AFRICA | AimThe experience and expression of pain are influenced by numerous factors, of which culture and society play a major role, especially in SSA. However, few studies have focused on the impact of cultural influences on pain assessment and management in SSA. This systematic review examines pain prevalence and its intensity/severity, the socio-cultural factors that affect pain management, and the extent to which socio-cultural practices influence pain assessment and management in SSA.
MethodsApplying the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines, a systematic literature search was conducted. Seven electronic databases were searched, and strict inclusion and exclusion criteria were applied to the retrieved articles, along with robust filtering, to identify eligible peer-reviewed literature. The review process yielded 24 eligible articles; the Grades of Recommendation, Assessment, Development and Evaluation (GRADE) approach was applied to assess the quality of the included literature, and thematic narrative analysis was conducted.
ResultsThe analysis identified sociocultural barriers to effective pain management from the perspective of different subcultures in SSA. The evidence suggests that religious/spiritual and inherited beliefs, along with limited knowledge and health literacy, influence the experience of pain and the pain management approaches applied in SSA. In addition, the results indicate that resource constraints and cultural and societal norms affect access to and use of pain management among the population in SSA.
ConclusionHealthcare professionals should be aware of how the society, culture and beliefs of their patients influence their expression of pain and subsequent pain management. Under-treatment or over-treatment might occur if health workers are unaware of, or do not consider, the cultural norms associated with pain and pain expression, given the subjective and individual nature of pain. | pain medicine
10.1101/2022.04.12.22273763 | Exploring the genetic overlap between 12 psychiatric disorders | The widespread comorbidity among psychiatric disorders (PDs) demonstrated in epidemiological studies1-5 is mirrored by non-zero, positive genetic correlations from large scale genetic studies6-10. We employed several strategies to uncover pleiotropic SNPs, genes and biological pathways7,8 underlying this genetic covariance. First, we conducted cross-trait meta-analysis on 12 PDs to identify pleiotropic SNPs. However, the majority of meta-analytic signal was driven by only one or a few PDs, hampering interpretation and joint biological characterization of the meta-analytic signal. Next, we performed pairwise comparisons of PDs on the SNP, gene, genomic region, gene-set, tissue-type, and cell-type level. Substantial overlap was observed, but mainly among pairs of PDs, and mainly at less stringent p-value thresholds. Only heritability enrichment for "conserved genomic regions" and "nucleotide diversity" was significant for multiple (9 out of 12) PDs. Overall, identification of shared biological mechanisms remains challenging due to variation in power and genetic architecture between PDs. | genetic and genomic medicine |
10.1101/2022.04.05.22273459 | The Contributions of Rare Inherited and Polygenic Risk to ASD in Multiplex Families | Autism Spectrum Disorder (ASD) has a complex genetic architecture involving contributions from de novo and inherited variation. Few studies have been designed to address the role of rare inherited variation, or its interaction with polygenic risk, in ASD. Here, we performed whole genome sequencing of the largest cohort of multiplex families to date, consisting of 4,551 individuals in 1,004 families having 2 or more children affected with ASD. Using this study design, we identify seven novel risk genes supported primarily by rare inherited variation, finding support for a total of 74 genes in our cohort and a total of 152 genes after combining with other studies. Probands demonstrated an increased burden of mutations in 2 or more known risk genes (KARGs); in three families, both probands inherited protein truncating variants in two KARGs. We also find that polygenic risk is over-transmitted from unaffected parents to affected children with rare inherited variants, consistent with combinatorial effects in the offspring, which may explain the reduced penetrance of these rare variants in parents. We also observe that, in addition to social dysfunction, language delay is associated with ASD polygenic risk over-transmission. These results are consistent with an additive complex genetic risk architecture of ASD involving rare and common variation and further suggest that language delay is a core biological feature of ASD. | genetic and genomic medicine
10.1101/2022.04.15.22273925 | COVID-19 relevant genetic variants confirmed in an admixed population | The dissection of factors that contribute to COVID-19 infection and severity has overwhelmed the scientific community for almost 2 years. Current reports highlight the role of genetic variation in disease incidence, progression, and severity. Here, we aimed to confirm the presence of previously reported genetic variants in an admixed population. Allele frequencies were assessed and compared between the general population (N=3079), of whom at least 30% had not been infected with SARS-CoV-2 as of July 2021, versus COVID-19 patients (N=106).
Genotyping data from the Illumina GSA array were used to impute genetic variation in 14 COVID-relevant genes, using the 1000G phase 3 panel as reference, based on human genome assembly hg19, following current standard protocols and recommendations for genetic imputation. Bioinformatic and statistical analyses were performed using MACH v1.0, R, and PLINK.
A total of 7953 variants were imputed in ABO, CCR2, CCR9, CXCR6, DPP9, FYCO1, IL10RB/IFNAR2, LZTFL1, OAS1, OAS2, OAS3, SLC6A20, TYK2, and XCR1. Statistically significant allele differences were reported for 10 and 7 previously identified and confirmed variants, respectively: ABO rs657152, DPP9 rs2109069, LZTFL1 rs11385942, OAS1 rs10774671, OAS1 rs2660, OAS2 rs1293767, and OAS3 rs1859330 (p<0.03). In addition, we identified 842 variants in these COVID-related genes with significant allele frequency differences between COVID patients and the general population (p-values <1E-2 to 1E-179).
Our observations confirm the presence of genetic differences in COVID-19 patients in an admixed population and prompt the investigation of the statistical relevance of additional variants in these and other genes, which could identify local and geographical patterns of COVID-19. | genetic and genomic medicine
10.1101/2022.04.11.22273725 | The genetic background of hydrocephalus in a population-based cohort: implication of ciliary involvement | BackgroundHydrocephalus is one of the most common congenital disorders of the central nervous system and often displays psychiatric co-morbidities, in particular autism spectrum disorder. The disease mechanisms behind hydrocephalus are complex and not well understood, but some association with dysfunctional cilia in the brain ventricles and subarachnoid space has been indicated. A better understanding of the genetic aetiology of hydrocephalus, including the role of ciliopathies, may bring insights into a potentially shared genetic aetiology. In this population-based case-cohort study we, for the first time, investigated variants of postulated hydrocephalus candidate genes. Using this data, we aimed to investigate potential involvement of the ciliome in hydrocephalus and describe genotype-phenotype associations with autism spectrum disorder, identified in the iPSYCH cohort.
MethodsOne hundred and twenty-one hydrocephalus candidate genes were screened in a whole-exome-sequenced subcohort of the iPSYCH study, comprising 72 hydrocephalus patients and 4,181 background population controls. Candidate genes containing high-impact variants of interest were systematically evaluated for their involvement in ciliary function and autism spectrum disorder.
ResultsThe median age at diagnosis for the hydrocephalus patients was 0 years (range 0-27 years), the median age at analysis was 22 years (11-35 years), and 70.5% were males. The median age of controls was 18 years (range 11-26 years) and 53.3% were males. Fifty-two putative hydrocephalus-associated variants in 34 genes were identified in 42 patients (58.3%). In hydrocephalus cases, we found an increased, but not significant, enrichment of high-impact protein-altering variants (OR 1.51, 95% CI 0.92-2.51, p = 0.096), which was driven by a significant enrichment of rare protein-truncating variants (OR 2.71, 95% CI 1.17-5.58, p = 0.011). Fourteen of the genes with high-impact variants are part of the ciliome, whereas another six genes affect cilia-dependent processes during neurogenesis. Furthermore, 15 of the 34 genes with high-impact variants and three of eight genes with protein-truncating variants were associated with autism spectrum disorder.
ConclusionsBecause symptoms of other diseases may be neglected or masked by hydrocephalus-associated symptoms, and identification of co-morbidities may be of clinical significance, we suggest that patients with congenital hydrocephalus undergo clinical genetic assessment with respect to ciliopathies and autism spectrum disorder. Our results point to the significance of hydrocephalus as a ciliary disease in some cases. Future studies in brain ciliopathies may reveal new insights not only into hydrocephalus but also into brain disease in the broadest sense, given the essential role of cilia for neurodevelopment. | genetic and genomic medicine
10.1101/2022.04.12.22273803 | Triangulating evidence in health sciences with Annotated Semantic Queries | Integrating information from data sources representing different study designs has the potential to strengthen evidence in population health research. However, this concept of evidence "triangulation" presents a number of challenges for systematically identifying and integrating relevant information. We present ASQ (Annotated Semantic Queries), a natural language query interface to the integrated biomedical entities and epidemiological evidence in EpiGraphDB, which enables users to extract "claims" from a piece of unstructured text, and then investigate the evidence that could either support, contradict the claims, or offer additional information to the query. This approach has the potential to support the rapid review of pre-prints, grant applications, conference abstracts and articles submitted for peer review. ASQ implements strategies to harmonize biomedical entities in different taxonomies and evidence from different sources, to facilitate evidence triangulation and interpretation. ASQ is openly available at https://asq.epigraphdb.org. | health informatics |
10.1101/2022.04.13.22273750 | Mondo: Unifying diseases for the world, by the world | There are thousands of distinct disease entities and concepts, each of which are known by different and sometimes contradictory names. The lack of a unified system for managing these entities poses a major challenge for both machines and humans that need to harmonize information to better predict causes and treatments for disease. The Mondo Disease Ontology is an open, community-driven ontology that integrates key medical and biomedical terminologies, supporting disease data integration to improve diagnosis, treatment, and translational research. Mondo records the sources of all data and is continually updated, making it suitable for research and clinical applications that require up-to-date disease knowledge. | health informatics |
10.1101/2022.04.16.22273930 | The impact of HIV on women living with HIV and their families in low- and middle-income countries: A systematic review | HIV infection adds a significant burden to women in low- and middle-income countries (LMICs), often leading to severe detrimental impact, not only on themselves, but also on their families and communities. Given that more than half of all people living with HIV globally are females (53%), this review seeks to understand the impact of HIV infection on women living with HIV (WLHIV) and their families in LMICs, and the interrelationships between one impact and another. A systematic review was conducted to find literature using the following databases: Medline, PsycINFO, CINAL, Emcare, Scopus and ProQuest. Research articles were included if they met the following inclusion criteria: conducted in LMICs, published in English language between January 1st 1990 and October 31st 2021, had full text available, involved WLHIV (married and unmarried), and focused on the impact of HIV on these women and their families. Critical appraisal tools developed by Joanna Briggs Institute (JBI) were used to assess the methodological quality of the studies and thematic narrative synthesis was used to analyse the findings. A total of 22 articles met the inclusion criteria. The review showed that HIV has a range of negative consequences on WLHIV and their families including: (i) psychological impact, (ii) poor physical health and intimate partner violence, (iii) social impact, and (iv) economic impact. The findings indicate the need for targeted interventions, specific to WLHIV, that address the inequity and discrimination they face. These interventions should also incorporate education and sustainable support structures for WLHIV and their families. | hiv aids |
10.1101/2022.04.15.22273913 | Analysis of severe illness after post-vaccination COVID-19 breakthrough among adults with and without HIV in the United States | ImportanceUnderstanding the severity of post-vaccination COVID-19 breakthrough illness among people with HIV (PWH) can inform vaccine guidelines and risk-reduction recommendations.
ObjectiveEstimate the rate and risk of severe breakthrough illness among vaccinated PWH and people without HIV (PWoH) who experience a breakthrough infection.
Design, setting, and participantsThe Corona-Infectious-Virus Epidemiology Team (CIVET-II) collaboration consists of four US longitudinal cohorts from integrated health systems and academic centers. Adults (≥18 years old) with HIV who were in care and fully vaccinated by June 30, 2021, and matched PWoH (on date fully vaccinated, age group, race/ethnicity, and sex) were the source population. Those who experienced a post-vaccination SARS-CoV-2 breakthrough infection were eligible. Severe COVID-19 breakthrough illness was defined as hospitalization due to COVID-19. Discrete time proportional hazards models estimated adjusted hazard ratios (aHR) and 95% confidence intervals (95% CI) of severe breakthrough illness by HIV status, adjusting for demographics, COVID-19 vaccine type, and clinical factors. The proportions of patients requiring mechanical ventilation or dying were compared by HIV status.
ExposureHIV infection
OutcomeSevere COVID-19 breakthrough illness, defined as hospitalization within 28 days after a breakthrough SARS-CoV-2 infection with a primary or secondary COVID-19 discharge diagnosis.
ResultsAmong 1,241 PWH and 2,408 PWoH with breakthrough infections, the cumulative incidence of severe illness in the first 28 days was low and comparable between PWoH and PWH (7.3% vs. 6.7%, respectively, risk difference=-0.67% [-2.58%, 1.23%]). The risk of severe breakthrough illness was 59% higher in PWH with CD4 counts <350 cells/mm3 compared with PWoH (aHR=1.59 [0.99, 2.46]). In multivariable analyses among PWH, being female, older, having a cancer diagnosis, and lower CD4 count increased the risk of severe breakthrough illness, while previous COVID-19 reduced the risk. Among all patients, 10% were mechanically ventilated and 8% died, with no difference by HIV status.
Conclusions and RelevanceThe risk of severe COVID-19 breakthrough illness within 28 days of a breakthrough infection was low among vaccinated PWH and PWoH. However, PWH with moderate and severe immune suppression had a higher risk of severe breakthrough infection. Recommendations for additional vaccine doses and risk-reduction strategies for PWH with moderate immune suppression may be warranted.
Key PointsQuestionIn 2021, among fully vaccinated people with COVID-19 breakthrough illness, was the risk of severe illness higher in people with HIV (PWH) compared to people without HIV (PWoH)?
FindingsPWH with CD4 counts <350 cells/mm3 had a 59% increased risk of severe breakthrough illness compared to PWoH.
MeaningVaccinations effectively reduce the risk of severe COVID-19 infection in both PWH and PWoH; however, PWH having a CD4 count <350 cells/mm3 are at higher risk of severe breakthrough infection compared to PWoH. PWH with moderate immune suppression should be considered for additional vaccine dosages and other risk-reduction measures. | hiv aids |
10.1101/2022.04.12.22271963 | Universal two-dimensional labeled probe-mediated melting curve analysis based on multiplex PCR for rapid typing of plasmodium in a single closed tube | Malaria remains one of the major public health problems and is commonly caused by four Plasmodium species; during the COVID-19 epidemic, which produces similar symptoms of fever or fatigue, it can easily be misdiagnosed. The disadvantages of traditional detection methods, which are time-consuming, costly, complicated to operate, highly specialized and unable to distinguish species, make it difficult to meet the clinical requirements for rapid, easy and accurate typing of common Plasmodium species. Herein, we developed and maximally optimized a universal two-dimensional labeled probe-mediated melting curve analysis (UP-MCA) assay based on multiplex PCR for rapid and accurate typing of five Plasmodium species, including the novel human parasite Plasmodium knowlesi (Pk), in a single closed tube following genome extraction. The assay showed a limit of detection (LOD) of 10 copies per reaction and can accurately distinguish Plasmodium species from one another and from other pathogens. In addition, we proposed and verified different fluorescence-quenching methods and two-dimensional labeled tags for probes that are suitable for the UP-MCA assay. Furthermore, its clinical performance was evaluated with 184 samples, showing sensitivity of 100% (164/164) and specificity of 100% (20/20) at the 99% confidence interval, with microscopy as the gold standard. Taken together, the UP-MCA system showed excellent sensitivity, specificity and accuracy for genotyping of Plasmodium, meets the requirements of rapidity and convenience for Plasmodium detection in clinical routine, and has great potential for clinical translation. | infectious diseases
10.1101/2022.04.15.22273922 | Linking Genotype to Phenotype: Further Exploration of Mutations in SARS-CoV-2 Associated with Mild or Severe Outcomes | We previously interrogated the relationship between SARS-CoV-2 genetic mutations and associated patient outcomes using publicly available data downloaded from GISAID in October 2020 [1]. Using high-level patient data included in some GISAID submissions, we were able to aggregate patient status values and differentiate between severe and mild COVID-19 outcomes. In our previous publication, we utilized a logistic regression model with an L1 penalty (Lasso regularization) and found several statistically significant associations between genetic mutations and COVID-19 severity. In this work, we explore the applicability of our October 2020 findings to a more current phase of the COVID-19 pandemic.
Here we first test our previous models on newer GISAID data downloaded in October 2021 to evaluate the classification ability of each model on expanded datasets. The October 2021 dataset (n=53,787 samples) is approximately 15 times larger than our October 2020 dataset (n=3,637 samples). We show limitations in using a supervised learning approach and a need for expansion of the feature sets based on progression of the COVID-19 pandemic, such as vaccination status. We then re-train on the newer GISAID data and compare the performance of our two logistic regression models. Based on accuracy and Area Under the Curve (AUC) metrics, we find that the AUC of the re-trained October 2021 model is modestly decreased as compared to the October 2020 model. These results are consistent with the increased emergence of multiple mutations, each with a potentially smaller impact on COVID-19 patient outcomes. Bioinformatics scripts used in this study are available at https://github.com/JPEO-CBRND/opendata-variant-analysis. As described in Voss et al. 2021, machine learning scripts are available at https://github.com/Digital-Biobank/covid_variant_severity. | infectious diseases |
10.1101/2022.04.15.22273881 | Can We Really Trust the Findings of the COVID-19 Research? Quality Assessment of Randomized Controlled Trials Published on COVID-19 | ObjectiveTo evaluate the quality of randomized controlled trials (RCTs) published on Coronavirus Disease-19 (COVID-19) and to investigate the reasons behind any compromised quality.
MethodsA systematic literature search was performed in PubMed, Google Scholar, and Cochrane CENTRAL to identify randomized controlled trials published on Coronavirus Disease-19 between 1 December 2019 and 31 August 2021. Research articles that met the study criteria were included. The quality of the randomized controlled trials was assessed using the modified Jadad scale.
Results21,259 records of randomized controlled trials were identified through database searching, of which 90 randomized controlled trials were included in the study; 34 (37.8%) were high-quality, 46 (51.1%) were moderate-quality, and 10 (11.1%) were low-quality studies. There were 40 (44.4%), 38 (42.2%), and 12 (13.3%) randomized controlled trials published in the early, middle, and late terms, with Jadad scores of 5.12±1.67, 5.34±1.32, and 5.68±1.50, respectively (P=0.52). When comparing the blinding status, appropriateness of blinding, and methods to evaluate adverse events in randomized controlled trials with the modified Jadad score, a significant difference was observed (P<0.001). A significant moderate positive correlation was found between the impact factor of the journal and the modified Jadad scale score (R2=0.48, P<0.001).
ConclusionFindings from our study indicate that accelerated publication of Coronavirus Disease-19 research, along with the fast-track review process, has resulted in lower study quality scores. With the emergence of stronger evidence, Coronavirus Disease-19 clinical studies with lower methodological quality should be revisited.
Impacts on practice(1) There have been numerous sacrifices and tragedies in the clinical response to COVID-19. Revisiting the quality of randomized controlled trials published on COVID-19 as we enter the third wave of the pandemic and beyond will improve the evidence-based practice of medications for clinical pharmacy services. (2) COVID-19 patients will benefit from evidence-based pharmaceutical care through reduced drug-related problems. | infectious diseases
10.1101/2022.04.06.22273531 | Efficacy and Safety of Fixed Combination of Hydroxychloroquine with Azithromycin Versus Hydroxychloroquine and Placebo in Patients with Mild COVID-19: Randomized, double blind, Placebo controlled trial | To determine the efficacy and safety of a fixed combination of hydroxychloroquine/azithromycin (HCQ+AZT), compared to hydroxychloroquine (HCQ) alone or placebo, in preventing hospitalization of outpatients with mild COVID-19.
Materials and methodsThis randomized, parallel, double-blind clinical trial included male and female patients aged 18 to 76 years who had not been vaccinated against COVID-19 and who were diagnosed with mild COVID-19 infection. All patients underwent liver and kidney profile tests, as well as a health questionnaire and clinical examination, to document that they did not have uncontrolled comorbidities. They were randomly assigned to one of three treatment arms: 1) hydroxychloroquine with azithromycin 200 mg/250 mg every 12 hours for five days, followed by hydroxychloroquine 200 mg every 12 hours for 5 days; 2) hydroxychloroquine 200 mg every 12 hours for ten days; or 3) placebo every 12 hours for ten days. The primary outcome of the study was hospitalization; the secondary outcomes were disease progression, pneumonia, use of supplemental oxygen, and adverse events. This study was registered in clinicaltrials.gov under number NCT04964583.
ResultsA total of 92 participants were randomized. Of these, 30 received HCQ+AZT, 31 received HCQ, and 31 received placebo. The median age was 37 years, 27.2% of the participants had comorbidities, and the overall incidence of hospitalization was 2.2%. The incidence of hospitalization was 6.7% (2/30) in the HCQ+AZT group, compared to no hospitalizations in either the HCQ or placebo group. Progression of disease was higher in the HCQ group [RR=3.25 (95% CI, 1.19-8.87)] compared with the placebo group. There was no statistical difference between the HCQ+AZT group and the placebo group in progression of disease. The incidence of pneumonia was 30% in the HCQ+AZT group, 32.2% in the HCQ group, and 9.6% in the placebo group (HCQ+AZT vs placebo; p=0.06). There was a significant risk of pneumonia versus placebo only in the HCQ group [RR=3.33 (95% CI, 1.01-10.9)]. Supplemental oxygen was required by 20% (6/30) of the patients in the HCQ+AZT group, 6.4% (2/31) in the HCQ group, and 3.2% (1/31) in the placebo group [(HCQ+AZT vs placebo, p=0.100), (HCQ vs placebo, p=0.610)]. There was no statistical difference between groups in negative PCR tests on day 11. The most frequent adverse events were gastrointestinal symptoms. No lengthening of the QT interval was observed in patients receiving HCQ+AZT or HCQ.
ConclusionThe use of HCQ+AZT does not decrease the risk of hospitalization in patients with mild COVID-19. The use of HCQ increases the risk of progression and pneumonia. | infectious diseases |
10.1101/2022.04.13.22273817 | Seroprevalence, correlates and kinetics of SARS-CoV-2 Nucleocapsid IgG antibody in healthcare workers at a tertiary hospital: a prevaccine census study | BackgroundHealthcare workers (HCWs) are perceived to be a high-risk group for acquiring SARS-CoV-2 infection, more so in countries where COVID-19 vaccination uptake is low. Serosurveillance may best determine the true extent of SARS-CoV-2 infection, since most infected HCWs may be asymptomatic or present with only mild symptoms. Determining the true extent of SARS-CoV-2 infection over time could inform hospital management and staff whether the preventive measures instituted are effective, and could be valuable in developing targeted solutions.
MethodsThis was a census survey study conducted at the Aga Khan University Hospital, Nairobi, between November 2020 and February 2021 before the implementation of the COVID-19 vaccination. The SARS-CoV-2 nucleocapsid IgG test was performed using a chemiluminescent assay.
ResultsOne thousand six hundred thirty-one (1631) staff enrolled, totalling 60% of the workforce. The overall crude seroprevalence was 18.4% and the adjusted value (for assay sensitivity of 86%) was 21.4% (95% CI; 19.2-23.7). The HCW groups with higher prevalence included pharmacy (25.6%), outreach (24%), hospital-based nursing (22.2%) and catering staff (22.6%). Independent predictors of a positive IgG result included prior COVID-19 like symptoms, odds ratio (OR) 1.9 [95% confidence interval (CI) 1.3-2.9, p=0.002], and a prior positive SARS-CoV-2 PCR result OR 11.0 (CI: 7.2-18.0, p<0.001). Age, sex, comorbidities or working in a COVID-19 designated area were not associated with seropositivity. The odds of testing positive for IgG after a positive PCR test were lowest if the antibody test was performed more than 2 months later; OR 0.7 (CI: 0.48-0.95, p= 0.025).
ConclusionsThe prevalence of anti-SARS-CoV-2 nucleocapsid IgG among HCWs was lower than in the general population. Staff working in clinical areas were not at increased risk when compared to staff working in non-clinical areas. | infectious diseases |
10.1101/2022.04.10.22273678 | Dynamics of anti-Spike IgG antibody titer after the third BNT162b2 COVID-19 vaccination in the Japanese health care workers | IntroductionMany countries are administering a third dose of some coronavirus disease 2019 (COVID-19) vaccines, but the evaluation of vaccine-induced immunity is insufficient. This study aimed to evaluate anti-spike immunoglobulin G (IgG) titers in the health care workers after the third BNT162b2 vaccination.
MethodsDynamics of anti-spike IgG titers were assessed two months following the third BNT162b2 vaccination in 52 participants. All participants received the primary series of vaccination with BNT162b2 and received the third dose eight months after the second vaccination. Associations between anti-spike IgG titer, baseline characteristics, and adverse reactions were also evaluated.
ResultsThe geometric mean titer of anti-spike IgG one month after the third vaccination was 17400 AU/ml, approximately 30 times the titer immediately before the third vaccination and approximately twice the titer one month after the second vaccination. In addition, participants with anti-spike IgG titers below 10000 AU/ml after the second vaccination tended to show greater increases in anti-spike IgG titers from before to after the third vaccination.
The decline rate of anti-spike IgG was significantly slower after the third vaccination (35.7%) than after the second vaccination (59.1%). The anti-spike IgG titer was significantly negatively associated with age (r = -0.31). Participants who had a headache after vaccination showed significantly higher anti-spike IgG titers than those without a headache.
ConclusionsThe anti-spike IgG induced by primary immunization with BNT162b2 waned over time. The third dose of BNT162b2 substantially increased the anti-spike IgG with a slower decline rate. | infectious diseases |
10.1101/2022.04.13.22273855 | Rapid displacement of SARS-CoV-2 variants within Japan correlates with cycle threshold values on routine RT-PCR testing | BackgroundThe rapid spread of SARS-CoV-2 worldwide has led to the emergence of new variants due to the presence of mutations that alter viral characteristics, but there have been few studies on trends in viral lineages in Japan, an island country. We hypothesized that changes in cycle threshold (Ct) values on reverse transcription polymerase chain reaction (RT-PCR) reflect the prevalent variants during a given period.
MethodsWe performed next-generation sequencing of positive samples to identify the viral lineages in Japan in 2021 and compared variant prevalence with the average Ct values on routine RT-PCR using 4 primer sets.
ResultsBased on 3 sequencing runs, the highly transmissible Alpha variant, which prevailed over other lineages such as R.1 from April 2021, was displaced by the even more transmissible Delta variant between July and August 2021 in Japan. The decrease in our routine RT-PCR Ct values with 4 primer sets correlated with these fluctuations in lineage prevalence over time.
ConclusionsWe confirmed that our RT-PCR protocol reflects the trends in SARS-CoV-2 variant prevalence over time regardless of sequence mutation. This may aid in the tracking of new variants in the population. | infectious diseases |
10.1101/2022.04.15.22273912 | A needle in a haystack: metagenomic DNA sequencing to quantify Mycobacterium tuberculosis DNA and diagnose tuberculosis | BackgroundTuberculosis (TB) remains a significant cause of mortality worldwide. Metagenomic next-generation sequencing has the potential to reveal biomarkers of active disease, identify coinfection, and improve detection for sputum-scarce or culture-negative cases.
MethodsWe conducted a large-scale comparative study of 427 plasma, urine, and oral swab samples from 334 individuals from TB endemic and non-endemic regions to evaluate the utility of a shotgun metagenomic DNA sequencing assay for tuberculosis diagnosis.
FindingsWe found that the choice of a negative, non-TB control cohort had a strong impact on the measured performance of the diagnostic test: the use of a control patient cohort from a nonendemic region led to a test with nearly 100% specificity and sensitivity, whereas controls from TB endemic regions exhibited a high background of nontuberculous mycobacterial DNA, limiting the diagnostic performance of the test. Using mathematical modeling and quantitative comparisons to matched qPCR data, we found that the burden of Mycobacterium tuberculosis DNA constitutes a very small fraction (0.04 or less) of the total abundance of DNA originating from mycobacteria in samples from TB endemic regions.
InterpretationOur findings suggest that the utility of a metagenomic sequencing assay for tuberculosis diagnostics is limited by the low burden of M. tuberculosis in extrapulmonary sites and an overwhelming biological background of nontuberculous mycobacterial DNA.
FundingThis work was supported by the National Institutes of Health, the Rainin Foundation, the National Science Foundation, and the Bill and Melinda Gates Foundation. | infectious diseases |
10.1101/2022.04.08.22273423 | Pathogen detection and characterization from throat swabs using unbiased metatranscriptomic analyses | ObjectiveInfectious diseases are common but are not easily or readily diagnosed with current methodologies. This problem is further exacerbated with the constant presence of mutated, emerging, and novel pathogens. One of the most common sites of infection by many pathogens is the human throat. Yet, there is no universal diagnostic test that can distinguish these pathogens. Metatranscriptomic (MT) analysis of the throat represents an important and novel development in infectious disease detection and characterization, as it is able to identify all pathogens in a fully unbiased approach.
DesignTo test the utility of an MT approach to pathogen detection, throat samples were collected from participants before, during, and after an acute sickness.
ResultsClear sickness-associated shifts in pathogenic microorganisms are detected in the participants along with important insights into microbial functions and antimicrobial resistance genes.
ConclusionsMT analysis of the throat represents an effective method for the unbiased identification and characterization of pathogens. Since MT data include all microorganisms in the sample, this approach should allow for not only the identification of pathogens, but also an understanding of the effects of the resident throat microbiome in the context of human health and disease. | infectious diseases |
10.1101/2022.04.12.22273679 | Early detection of patients with narcotic use disorder using a modified MEDD score based on the analysis of real-world prescription patterns | BackgroundAddiction to prescription narcotics is a global issue, and detection of individuals with a narcotic use disorder (NUD) at an early stage can help prevent narcotics misuse and abuse. We developed a novel index to detect early NUD based on a real-world prescription pattern analysis in a large hospital.
MethodsWe analyzed narcotic prescriptions of 221 887 patients prescribed by 8737 doctors from July 2000 to June 2018. For the early detection of patients who could potentially progress to developing NUD after a long history of narcotic prescription, a weighted morphine equivalent daily dose (wt-MEDD) score was developed based on the number of prescription dates on which the actual MEDD was higher than the intended MEDD. Performance of the wt-MEDD scoring system in detecting patients diagnosed with NUD by doctors was compared with that of other NUD high risk indexes such as the MEDD scoring system, number of days on prescribed narcotics, the frequency/duration of prescription, narcotics prescription across multiple doctors, and the number of early refills of narcotics.
ResultsA wt-MEDD score cut-off value of 10.5 could detect all outliers, as well as patients diagnosed with NUD with 100% sensitivity and 99.6% specificity. The wt-MEDD score showed the highest sensitivity and specificity in identifying NUD among all indexes. Further, combining the wt-MEDD score with other NUD high risk indexes improved the prediction performance.
ConclusionWe developed a novel index to distinguish patients with vulnerable use patterns of narcotics. The wt-MEDD score showed excellent performance in detecting early NUD. | addiction medicine |
10.1101/2022.04.12.22273619 | Comparative and Integrated Analysis of Plasma Extracellular Vesicle Isolation Methods in Healthy Volunteers and Patients Following Myocardial Infarction | Plasma extracellular vesicle (EV) number and composition are altered following myocardial infarction (MI), but to properly understand the significance of these changes it is essential to appreciate how the different isolation methods affect EV characteristics, proteome and sphingolipidome. Here, we compared plasma EV isolated from platelet-poor plasma from four healthy donors and six MI patients at presentation and 1 month post-MI using ultracentrifugation, polyethylene glycol precipitation, acoustic trapping, size-exclusion chromatography (SEC) or immunoaffinity capture. The isolated EV were evaluated by Nanoparticle Tracking Analysis, Western blot, transmission electron microscopy, an EV-protein array, untargeted proteomics (LC-MS/MS) and targeted sphingolipidomics (LC-MS/MS). The application of the five different plasma EV isolation methods in patients presenting with MI showed that the choice of isolation method influenced the ability to distinguish elevations in plasma EV concentration following MI, the enrichment of EV cargo (proteins and sphingolipids), and associations with the size of the infarct determined by cardiac magnetic resonance imaging 6 months post-MI. Despite the selection bias imposed by each method, a core of EV-associated proteins and lipids was detectable using all approaches. However, this study highlights how each isolation method comes with its own idiosyncrasies and makes the comparison of data acquired by different techniques in clinical studies problematic. | cardiovascular medicine
10.1101/2022.04.12.22273778 | Assessment of magnitude and spectrum of cardiovascular disease admissions and outcomes in Saint Paul Hospital Millennium Medical College, Addis Ababa: A retrospective study | BackgroundCardiovascular diseases remain the leading cause of death in the world and approximately 80% of all cardiovascular-related deaths occur in low and middle income countries including Ethiopia.
MethodsThe aim of the study was to assess the magnitude and spectrum of cardiovascular admissions and their outcomes among medical patients admitted to both the Medical Ward and ICU of St. Paul Teaching Hospital from 1 January 2020 to 1 January 2021.
ResultsOut of 1,165 annual medical admissions, the prevalence of cardiovascular disease (CVD) was 30.3%. About 60% (212) of patients had advanced congestive heart failure of diverse causes. Hypertensive heart disease (HHD) was the next most common diagnosis (41%) and the leading cause of cardiac disease, followed by rheumatic valvular heart disease (RVHD) (18%) and ischemic heart disease (IHD) (12.2%), respectively. Young age, rural residence and female sex were associated with RVHD (p=0.001). Stroke also accounted for 20% of CVD admissions (hemorrhagic stroke 17% vs ischemic stroke 83%). Hypertension was the predominant risk factor for CVD, present in 46.7% (168) of patients. The mean hospital stay was 12 days and the in-hospital mortality rate was 24.3%; septic shock was the commonest immediate cause of death, followed by fatal arrhythmia, brain herniation, and massive PTE.
ConclusionCardiovascular diseases were highly prevalent in the study area, causing significant morbidity and mortality. Therefore, a comprehensive approach is needed to screen in a timely manner for risk reduction and to delay or prevent disease development and subsequent complications. | cardiovascular medicine
10.1101/2022.04.07.22273577 | Predictors of Developmental Defects of Enamel in the Primary Maxillary Central Incisors using Bayesian Model Selection | Localized non-inheritable developmental defects of tooth enamel (DDE) are classified as enamel hypoplasia (EH), opacity (OP) and post-eruptive breakdown (PEB) using the Enamel Defects Index. To better understand the etiology of DDE, and in particular possibly modifiable variables, we assessed the linkages amongst exposome variables during the specific time duration of the development of the DDE. In general, the human primary central maxillary incisor teeth develop between 13-14 weeks in utero and 3-4 weeks postpartum of a full-term delivery, followed by tooth eruption at about 1 year of age. We utilized existing datasets of mother and child dyad data that encompassed 12 weeks gestation through birth and early infancy, and child DDE outcomes from digital images of the erupted primary maxillary central incisor teeth. We applied a Bayesian modeling paradigm to assess the important predictors of EH, OP, and PEB. The results of Gibbs variable selection showed a key set of predictors: mother's pre-pregnancy body mass index (BMI); maternal serum levels of calcium and phosphorus at gestational week 28; child's gestational age; and both mother's and child's functional vitamin D deficiency (FVDD). In this sample of healthy mothers and children, significant predictors for OP included the child having a gestational period > 36 weeks and FVDD at birth, and for PEB included a mother's pre-pregnancy BMI < 21.5 and higher serum phosphorus level at week 28. | dentistry and oral medicine