10.1101/2020.12.15.20248130
Determinants of SARS-CoV-2 transmission to guide vaccination strategy in an urban area
Background. Transmission chains within small urban areas (accommodating ~30% of the European population) greatly contribute to case burden and economic impact during the ongoing COVID-19 pandemic, and should be a focus for preventive measures to achieve containment. Here, at very high spatio-temporal resolution, we analysed determinants of SARS-CoV-2 transmission in a European urban area, Basel-City (Switzerland). Methodology. We combined detailed epidemiological, intra-city mobility, and socioeconomic data sets with whole-genome sequencing during the first SARS-CoV-2 wave. For this, we sequenced 44% of all reported cases from Basel-City and performed phylogenetic clustering and compartmental modelling based on the dominant viral variant (B.1-C15324T; 60% of cases) to identify drivers and patterns of transmission. Based on these results we simulated vaccination scenarios and the corresponding healthcare-system burden (intensive-care-unit occupancy). Principal Findings. Transmissions were driven by socioeconomically weaker and highly mobile population groups with mostly cryptic transmissions, whereas amongst more senior population groups transmission was clustered. Simulated vaccination scenarios assuming 60-90% reduction in transmission and 70-90% reduction in severe cases showed that prioritizing mobile, socioeconomically weaker populations for vaccination would effectively reduce case numbers. However, long-term intensive-care-unit occupancy would also be effectively reduced if senior population groups were prioritized, provided there were no changes in testing and prevention strategies. Conclusions. Reducing SARS-CoV-2 transmission through vaccination strongly depends on the efficacy of the deployed vaccine. A combined strategy of protecting risk groups by extensive testing coupled with vaccination of the drivers of transmission (i.e. highly mobile groups) would be most effective at reducing the spread of SARS-CoV-2 within an urban area.
Author summary. We examined SARS-CoV-2 transmission patterns within a European city (Basel, Switzerland) to infer the drivers of transmission during the first wave in spring 2020. The combination of diverse data (serological, genomic, transportation, socioeconomic) allowed us to combine phylogenetic analysis with mathematical modelling of related cases mapped to a residential address. As a result we could identify the population groups driving SARS-CoV-2 transmission and quantify their effect on the transmission dynamics. We found traceable transmission chains in wealthier or more senior population groups and cryptic transmissions in the mobile, young, or socioeconomically weaker population groups; the latter were identified as the transmission drivers of the first wave. Based on this insight, we simulated vaccination scenarios for various vaccine efficacies to reflect different approaches undertaken to handle the epidemic. We conclude that vaccination of the mobile, inherently younger population group would be most effective in handling subsequent waves.
public and global health
10.1101/2020.12.15.20248235
Exploring Patient and Staff Experiences of Video Consultations During COVID-19 in an English Outpatient Care Setting: Secondary Data Analysis of Routinely Collected Feedback Data
Background. Video consultations (VCs) were rapidly implemented in response to COVID-19, despite modest progress prior to the pandemic. Objectives. To explore staff and patient experiences of VCs implemented during COVID-19, and to use feedback insights to support quality improvement and service development. Methods. Secondary data analysis was conducted on 955 (22.6%) patient responses and 521 (12.3%) staff responses routinely collected following a VC between June and July 2020 in a rural, aging, outpatient care setting at a single NHS Trust. Patient and staff feedback were summarised using descriptive statistics and inductive thematic analysis and presented to Trust stakeholders. Results. Most (93.2%) patients reported having a good (n=210, 22.0%) or very good (n=680, 71.2%) experience with VCs and felt listened to and understood (n=904, 94.7%). Most patients accessed their VC alone (n=806, 84.4%), except for those aged 71+ (n=23/58, 39.7%), with ease of joining VCs negatively associated with age (P<.001). Despite more difficulties joining, older people were most likely to be satisfied with the technology (n=46/58, 79.3%). Both patients and staff generally felt patients' needs had been met (n=860, 90.1% and n=453, 86.9%, respectively), although staff appeared to overestimate patient dissatisfaction with VC outcomes (P=.021). Patients (n=848, 88.8%) and staff (n=419, 80.5%) generally felt able to communicate everything they wanted, although patients were significantly more positive than staff (P<.001). Patient satisfaction with communication was positively associated with satisfaction with technical performance (P<.001). Most staff (89.8%) reported positive (n=185, 35.5%) or very positive (n=281, 54.3%) experiences of joining and managing a VC. Staff reported reductions in carbon footprint (n=380, 72.9%) and time (n=373, 71.6%). Most (n=880, 92.1%) patients would choose VCs again.
Inductive thematic analysis of patient and staff responses identified three themes: i) barriers, including technological difficulties, patient information and suitability concerns; ii) potential benefits, including reduced stress, enhanced accessibility, and cost and time savings; and iii) suggested improvements, including trial calls, turning music off, photo uploads, an expanded written character limit, support for other internet browsers, and a shared interactive screen. This routine feedback, including evidence suggesting patients were more satisfied than clinicians had anticipated, was presented to relevant Trust stakeholders, allowing improved processes and supporting the development of a business case to inform the Trust's decision on continuing VCs beyond COVID-19 restrictions. Conclusions. Findings highlight the importance of regularly reviewing and responding to routine feedback following the implementation of a new digital service. Feedback helped the Trust improve the VC service, challenge clinician-held assumptions about patient experience, and inform future use of VCs. The feedback has focussed improvement efforts on patient information, technological improvements such as blurred backgrounds and interactive whiteboards, and responding to the needs of patients with dementia, communication or cognitive impairment, or lack of appropriate technology. Findings have implications for other health providers.
public and global health
10.1101/2020.12.15.20248273
Quantifying the importance and location of SARS-CoV-2 transmission events in large metropolitan areas
Detailed characterization of SARS-CoV-2 transmission across different settings can help design less disruptive interventions. We used real-time, privacy-enhanced mobility data in the New York City and Seattle metropolitan areas to build a detailed agent-based model of SARS-CoV-2 infection and estimate where, when, and at what magnitude transmission events occurred during the pandemic's first wave. We estimate that only 18% of individuals produce most infections (80%), and that about 10% of events can be considered super-spreading events (SSEs). Although mass gatherings present an important risk for SSEs, we estimate that the bulk of transmission occurred in smaller events in settings like workplaces, grocery stores, or food venues. The places most important for transmission change during the pandemic and differ across cities, signaling a large underlying behavioral component. Our modeling complements case studies and epidemiological data and indicates that real-time tracking of transmission events could help evaluate and define targeted mitigation policies.
epidemiology
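The headline figure above (18% of individuals produce 80% of infections) can be reproduced qualitatively with a toy overdispersed offspring distribution. This is a hedged sketch, not the study's agent-based model: the negative binomial parameters R0 and k below are assumed values chosen for illustration, not estimates from this paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed offspring distribution: secondary infections per case drawn from a
# negative binomial with mean R0 and dispersion k (smaller k = more
# superspreading). Parameterized so that the mean equals R0.
R0, k = 2.5, 0.3
n_cases = 100_000
secondary = rng.negative_binomial(n=k, p=k / (k + R0), size=n_cases)

# Fraction of individuals responsible for 80% of all transmissions: sort cases
# by infections produced (descending) and find the smallest prefix whose
# cumulative sum covers 80% of the total.
order = np.sort(secondary)[::-1]
cum = np.cumsum(order)
n_top = int(np.searchsorted(cum, 0.8 * cum[-1])) + 1
print(f"{n_top / n_cases:.0%} of cases produce 80% of infections")
```

With strong overdispersion (small k), this prefix is a small minority of cases, in line with the 80/18 pattern the model infers from mobility data.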
10.1101/2020.12.17.20248382
Sub-national forecasts of COVID-19 vaccine acceptance across the UK: a large-scale cross-sectional spatial modelling study
The rollout of COVID-19 vaccines to at-risk populations has begun around the world. It is currently unclear whether rejection of the vaccine, either at large scale or in localised pockets, will pose challenges for achieving herd/community immunity. Here we predict uptake of the vaccine at unprecedented spatial resolution across the UK using a large-scale survey of over 17,000 individuals. Although the majority of the UK population would likely take the vaccine, there is substantial heterogeneity in uptake intent across the UK. Large urban areas, including London and North West England, females, Black or Black British ethnicities, and Polish speakers are among the least accepting. This study helps identify areas and socio-demographic groups where vaccination levels may not reach those required for herd immunity. Identifying clusters of non-vaccinators is extremely important in this context, as vaccination "cold spots" can amplify epidemic spread and disproportionately increase the vaccination levels required for herd protection.
public and global health
10.1101/2020.12.15.20248289
Short-term acute exposure to wildfire smoke and lung function in a Royal Canadian Mounted Police (RCMP) cohort
Rationale. The increasing incidence of extreme wildfires is becoming a public health concern. Although long-term exposure to wildfire smoke is associated with respiratory illnesses, reports on the association between short-term occupational exposure to wildfire smoke and lung function remain scarce. Methods. In this cross-sectional study, we analyzed data from 218 Royal Canadian Mounted Police officers (mean age: 38±9 years) deployed at the Fort McMurray wildfire in 2016. Individual exposure to air pollutants was calculated by integrating the duration of exposure with the air quality parameters obtained from the nearest air quality monitoring station during the deployment phase. Lung function was measured using spirometry and body plethysmography. The association between exposure and lung function was examined using principal component linear regression analysis, adjusting for potential confounders. Results. The participants were predominantly male (71%). Mean forced expiratory volume in 1 second (FEV1) and residual volume (RV) were 76.5±5.9 and 80.1±19.5 (% predicted), respectively. A marginal association was observed between air pollution and higher RV [β: 1.55; 95% CI: -0.28 to 3.37 per interquartile change of the air pollution index], but not with other lung function indices. The association between the air pollution index and RV was significantly stronger in participants screened within the first three months of deployment [2.80; 0.91 to 4.70] than in those screened later [-0.28; -2.58 to 2.03], indicating a more acute effect of air pollution on the peripheral airways. Conclusion. Acute short-term exposure to wildfire-associated air pollutants may impose subtle but clinically important deleterious respiratory effects, particularly in the peripheral airways.
respiratory medicine
10.1101/2020.12.15.20248275
Development of a Conceptual Model of Childhood Asthma to Inform Asthma Prevention Policies
Background. There is no definitive cure for asthma; as such, prevention remains a major goal. Decision-analytic models are routinely used to evaluate the value-for-money proposition of interventions. Following best-practice standards in decision-analytic modeling, the objective of this study was to solicit expert opinion to develop a concept map for a policy model for primary prevention of asthma. Methods. We reviewed currently available decision-analytic models for asthma prevention. A steering committee of economic modelers, allergists, and respirologists was then convened to draft a conceptual model of pediatric asthma. A modified Delphi method was followed to define the context of the problem at hand (evaluation of asthma prevention strategies) and develop the concept map of the model. Results. Consensus was achieved after three rounds of discussions, followed by concealed voting. In the final conceptual model, asthma diagnosis was based on three domains: lung function, atopy, and symptoms. The panel recommended several markers for each domain. These domains were in turn affected by several risk factors, which the panel clustered into three groups: patient characteristics, family history, and environmental factors. To capture the interplay among risk factors, the panel recommended the use of microsimulation with an open-population approach, which would enable modeling of phased implementation and of gradual and incomplete uptake of the intervention. Conclusions. Economic evaluation of childhood interventions for preventing asthma will require modeling of several co-dependent risk factors and multiple domains that affect the diagnosis. The conceptual model can inform the development and validation of a policy model for childhood asthma prevention. Funding. Genome Canada Large-Scale Applied Research Project.
respiratory medicine
10.1101/2020.12.16.20248329
Increased visual and cognitive demands emphasize the importance of meeting visual needs at all distances while driving
Having optimal quality of vision as well as adequate cognitive capacity is known to be essential for driving safety. However, the interaction between vision and cognitive mechanisms while driving remains unclear. We hypothesized that, in a context of high cognitive load, reduced visual acuity would have a negative impact on driving behavior, even when acuity meets the legal threshold for obtaining a driving license in Canada, and that the impact on driving performance would grow as visual acuity degrades further. To investigate this relationship, we examined driving behavior in a driving simulator under optimal and reduced vision conditions in two scenarios involving different levels of cognitive demand: 1. a simple rural driving scenario with some pre-programmed events, and 2. a highway driving scenario accompanied by a concurrent task involving the use of a navigation device. Two groups with different degrees of visual quality degradation (lower/higher) were evaluated according to their driving behavior. The results support the hypothesis: a dual-task effect was observed, provoking less stable driving behavior, and, after statistically controlling for the impact of cognitive load, the effect of visual load emerged in this dual-task context. These results support the idea that visual quality degradation impacts driving behavior when combined with a high mental workload, while this impact is absent under low cognitive load.
ophthalmology
10.1101/2020.12.16.20248300
Systematic review reveals multiple sexually antagonistic polymorphisms affecting human disease and complex traits
An evolutionary model for sex differences in disease risk posits that alleles conferring higher risk in one sex may be protective in the other. These sexually antagonistic (SA) alleles are predicted to be maintained at frequencies higher than expected under purifying selection against unconditionally deleterious alleles, yet there are apparently no examples in humans. Discipline-specific terminology, rather than a genuine lack of such alleles, could explain this disparity. We undertook a two-stage review of evidence for SA polymorphisms in humans using search terms from (i) evolutionary biology and (ii) biomedicine. While the first stage returned no eligible studies, the second revealed 51 genes with sex-opposite effects, of which 22 increased disease risk or severity in one sex but protected the other. Those with net positive effects occurred at higher frequencies. None were referred to as SA. Our review reveals significant communication barriers between fields arising from discipline-specific terminology.
genetic and genomic medicine
10.1101/2020.12.16.20248301
Adaptive combination of interventions required to reach population immunity due to stochastic community dynamics and limited vaccination
Reaching population immunity against COVID-19 is proving difficult even in countries with high vaccination levels. We demonstrate that this is in part due to heterogeneity and stochasticity resulting from community-specific human-human interaction and infection networks. We address this challenge by community-specific simulation of adaptive strategies. Analyzing the predicted effect of vaccination into an ongoing COVID-19 outbreak, we find that adaptive combinations of targeted vaccination and non-pharmaceutical interventions (NPIs) are required to reach population immunity. Importantly, the threshold for population immunity is not a unique number but is strategy- and community-dependent. Furthermore, the dynamics of COVID-19 outbreaks are highly community-specific: in some communities vaccinating highly interactive people diminishes the risk of an infection wave, while vaccinating the elderly reduces fatalities when vaccination levels are low due to supply or hesitancy. Similarly, while risk groups should be vaccinated first to minimize fatalities, optimality branching is observed with increasing population immunity. Bimodality emerges as the infection network gains complexity over time, which entails that NPIs generally need to be longer and stricter. Thus, we analyze and quantify the requirement for NPIs as a function of the chosen vaccination strategy. We validate our simulation platform on real-world epidemiological data and demonstrate that it can predict pathways to population immunity for diverse communities worldwide challenged by limited vaccination.
infectious diseases
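The claim that the population-immunity threshold is not a unique number can be illustrated with a back-of-the-envelope calculation. This is a minimal sketch assuming homogeneous mixing, an assumed R0 of 3, and an assumed 90% vaccine efficacy against transmission; the study's community-specific network simulations are far richer than this.

```python
# Toy model (assumed values, not the study's simulation): the vaccination
# coverage needed to push the effective reproduction number below 1 depends
# on the NPI in force, so "the" threshold is strategy-dependent.
def vaccination_threshold(r0, npi_transmission_reduction, vaccine_efficacy):
    r_npi = r0 * (1 - npi_transmission_reduction)  # R under the NPI alone
    if r_npi <= 1:
        return 0.0  # the NPI alone suffices
    # Classic herd-immunity condition, scaled by imperfect vaccine efficacy.
    return min(1.0, (1 - 1 / r_npi) / vaccine_efficacy)

for npi in (0.0, 0.2, 0.4):
    v = vaccination_threshold(r0=3.0, npi_transmission_reduction=npi,
                              vaccine_efficacy=0.9)
    print(f"NPI reduces transmission by {npi:.0%}: vaccinate {v:.0%}")
```

Stricter NPIs lower the required coverage, which mirrors the abstract's point that NPI requirements and vaccination strategy must be chosen jointly.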
10.1101/2020.12.16.20248357
Antigen-based rapid diagnostic testing or alternatives for diagnosis of symptomatic COVID-19: A simulation-based net benefit analysis
Background. SARS-CoV-2 antigen-detection rapid diagnostic tests (Ag-RDTs) can diagnose COVID-19 rapidly and at low cost, but their lower sensitivity compared with nucleic acid amplification testing (NAAT) has limited clinical adoption. Methods. We compared Ag-RDT, NAAT, and clinical judgment alone for diagnosing symptomatic COVID-19. We considered an outpatient setting (10% COVID-19 prevalence among the patients tested, 3-day NAAT turnaround) and a hospital setting (40% prevalence, 24-hour NAAT turnaround). We simulated transmission from cases and contacts and the relationships between time, viral burden, transmission, and case detection. We compared diagnostic approaches using a measure of net benefit that incorporated both the clinical and the public health benefits and harms of intervention. Results. In the outpatient setting, we estimated that using Ag-RDT instead of NAAT to test 200 individuals could have a net benefit equivalent to preventing all symptomatic transmission from one person with COVID-19 (one "transmission-equivalent"). In the hospital setting, net benefit analysis favored NAAT, and testing 25 patients with NAAT instead of Ag-RDT achieved one "transmission-equivalent" of incremental benefit. In both settings, Ag-RDT was preferred to NAAT if NAAT turnaround time exceeded two days. Both Ag-RDT and NAAT provided greater net benefit than management based on clinical judgment alone, unless intervention carried minimal harm and was provided equally regardless of diagnostic approach. Conclusions. For diagnosis of symptomatic COVID-19, the speed of diagnosis with Ag-RDT is likely to outweigh its lower accuracy compared to NAAT wherever NAAT turnaround times are two days or longer. This advantage may be even greater if Ag-RDTs are also less expensive.
infectious diseases
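The trade-off driving the result above — sensitivity versus turnaround time — can be caricatured in a few lines. This is not the study's simulation of viral-load dynamics and net benefit; it is a toy model assuming transmission can only be interrupted after the result returns, declining linearly over an assumed 7-day infectious period, with assumed sensitivities of 80% (Ag-RDT) and 95% (NAAT).

```python
# Toy calculation (all parameter values are assumptions for illustration):
# expected fraction of onward transmission prevented per tested patient.
def transmissions_prevented(prevalence, sensitivity, turnaround_days,
                            infectious_days=7.0):
    # Fraction of the infectious period remaining when the result arrives;
    # only transmission after diagnosis can be interrupted.
    remaining = max(0.0, 1.0 - turnaround_days / infectious_days)
    return prevalence * sensitivity * remaining

# Outpatient scenario from the abstract: 10% prevalence, 3-day NAAT turnaround.
ag_rdt = transmissions_prevented(0.10, sensitivity=0.80, turnaround_days=0.0)
naat = transmissions_prevented(0.10, sensitivity=0.95, turnaround_days=3.0)
print(f"Ag-RDT: {ag_rdt:.3f}, NAAT: {naat:.3f} transmissions prevented/test")
```

Even this crude sketch reproduces the direction of the outpatient finding: a fast, less sensitive test can outperform a slow, more sensitive one once turnaround consumes much of the infectious period.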
10.1101/2020.12.16.20248134
No evidence of association between schools and SARS-CoV-2 second wave in Italy.
Background. During the Covid19 pandemic, school closure has been mandated in analogy to its known effect against influenza, but it is unclear whether schools are early amplifiers of Covid19 cases. Methods. We performed a cross-sectional and prospective cohort study in Italy. We used databases from the Italian Ministry of Education containing the number of new positive SARS-CoV-2 cases per school from September 20 to November 8, 2020 to calculate incidence among students and staff. We calculated incidence across each age group using databases from the Veneto Region system of SARS-CoV-2 case notification in the period August 26-November 24, 2020. We used a database from the Veneto Region system of SARS-CoV-2 secondary-case tracing in Verona province schools to estimate the number of tests, the frequency of secondary infections at school by type of index case, and the ratio of positive cases to number of tests per school institute, using an adjusted multivariable generalized linear regression model. We estimated the reproduction number Rt at the regional level from the Italian Civil Protection database of regional SARS-CoV-2 case notifications in the period August 6-December 2, 2020. Findings. From September 12 to November 7, 2020, SARS-CoV-2 incidence among students was lower than that in the general population of all but two Italian regions. Secondary infections were <1%, and clusters of >2 secondary cases per school were 5-7% in a representative November week. Incidence among teachers was greater than in the general population; however, when compared with incidence among similar age groups, the difference was not significant (P=0.23). Secondary infections among teachers were more frequent when the index case was a teacher than a student (38% vs. 11%, P=0.007). From August 28 to October 25 in Veneto, where schools reopened on September 14, the growth of SARS-CoV-2 incidence was lower in school-age individuals and maximal in individuals aged 20-29 and 45-49 years.
The delay between the different school opening dates in the different Italian regions and the increase in the regional Covid19 reproduction number Rt was not uniform. Reciprocally, school closures in two regions where they were implemented before other measures did not affect the rate of Rt decline. Interpretation. Our analysis does not support a role for school opening as a driver of the second wave of SARS-CoV-2 epidemics in Italy, a large European country with high SARS-CoV-2 incidence. Research in context. Evidence before this study. The role of schools, and at large of children, as amplifiers of the Covid19 pandemic is debated. Despite biological and epidemiological evidence that children play a marginal role in SARS-CoV-2 spread, policies of school closures have been predicated mostly on the temporal coincidence between school reopening in certain countries and Covid19 outbreaks. Whether schools contributed to the so-called "second wave" of Covid19 is uncertain. Italy's regionalized calendar of school reopening and databases of positivity at school allow estimation of the impact of schools on the increase of SARS-CoV-2 that occurred in autumn 2020. Added value of this study. We found that incidence among students was lower than in the general population and that, whereas incidence among teachers appears higher than in the general population, it is comparable to that among individuals of the same age bracket. Moreover, secondary infections at school are rare, and clusters even less common. The index case of a secondary teacher case is more frequently a teacher than a student. In Veneto, during the first phase of the second wave, incidence among school-age individuals was low, as opposed to the sustained incidence among individuals aged 45-49 years.
Finally, the time lag between school opening and Rt increase was not uniform across Italian regions with different school opening dates, with lag times shorter in regions where schools opened later. Implications of the available evidence. These findings indicate that Covid19 infections rarely occur at school and that transmission from students to teachers is very rare. Moreover, they fail to support a role for school-age individuals and school openings as a driver of the Covid19 second wave. Overall, our findings could help inform policy initiatives on school openings during the current Covid19 pandemic.
infectious diseases
10.1101/2020.12.16.20247684
International Comparisons of Harmonized Laboratory Value Trajectories to Predict Severe COVID-19: Leveraging the 4CE Collaborative Across 342 Hospitals and 6 Countries: A Retrospective Cohort Study
Objectives. To perform an international comparison of the trajectories of laboratory values among hospitalized patients with COVID-19 who develop severe disease, and to identify the optimal timing of laboratory value collection to predict severity across hospitals and regions. Design. Retrospective cohort study. Setting. The Consortium for Clinical Characterization of COVID-19 by EHR (4CE), an international multi-site data-sharing collaborative of 342 hospitals in the US and Europe. Participants. Patients hospitalized with COVID-19, admitted before or after a PCR-confirmed result for SARS-CoV-2. Primary and secondary outcome measures. Patients were categorized as "ever-severe" or "never-severe" using the validated 4CE severity criteria. Eighteen laboratory tests associated with poor COVID-19-related outcomes were evaluated for predictive accuracy by area under the curve (AUC), compared between the severity categories. Subgroup analysis was performed to validate a subset of laboratory values as predictive of severity against a published algorithm. A subset of laboratory values (CRP, albumin, LDH, neutrophil count, D-dimer, and procalcitonin) was compared between North American and European sites for severity prediction. Results. Of 36,447 patients with COVID-19, 19,953 (43.7%) were categorized as ever-severe. Most patients were 50 years of age or older (78.7%) and male (60.5%). Longitudinal trajectories of CRP, albumin, LDH, neutrophil count, D-dimer, and procalcitonin showed association with disease severity. Significant differences in laboratory values at admission were found between the two groups. With the exception of D-dimer, the predictive discrimination of laboratory values did not improve after admission. Subgroup analysis using age, D-dimer, CRP, and lymphocyte count as predictors of severity at admission showed discrimination similar to a published algorithm (AUC=0.88 and 0.91, respectively). Both models deteriorated in predictive accuracy as the disease progressed.
On average, no difference in severity prediction was found between North American and European sites. Conclusions. Laboratory test values at admission can be used to predict severity in patients with COVID-19. The prediction models show consistency across international sites, highlighting their potential generalizability.
health informatics
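The AUC metric used throughout the abstract has a simple rank interpretation: the probability that a randomly chosen ever-severe patient shows a higher marker value than a randomly chosen never-severe patient. A minimal sketch of that Mann-Whitney form, with made-up CRP values rather than data from the study:

```python
# AUC as the win rate of severe over non-severe values, counting ties as 0.5
# (Mann-Whitney U interpretation of the ROC area).
def auc(severe_values, non_severe_values):
    wins = sum((s > n) + 0.5 * (s == n)
               for s in severe_values for n in non_severe_values)
    return wins / (len(severe_values) * len(non_severe_values))

# Hypothetical admission CRP values (mg/L) for the two severity groups.
crp_severe = [120, 85, 200, 60, 150]
crp_non_severe = [20, 35, 60, 10, 45]
print(f"AUC = {auc(crp_severe, crp_non_severe):.2f}")  # prints "AUC = 0.98"
```

An AUC of 0.5 means no discrimination and 1.0 means perfect separation, which is the scale on which the 0.88 and 0.91 values above should be read.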
10.1101/2020.12.17.20248126
Oxygen saturation instability in suspected covid-19 patients; contrasting effects of reduced VA/Q and shunt.
Patients in the UK at risk of Covid-19 pneumonia, but not needing immediate hospital attention, are to be given pulse oximeters to identify a fall in oxygen saturation (SaO2 or SpO2) at home. A recent finding in Covid-19 pneumonia is a dominant reduction in ventilation to perfused alveoli (VA/Q). A mathematical model of gas exchange was used to predict the effect of shunt or reduced VA/Q on SaO2 stability, inferred from the slope of the PIO2 vs SaO2 curve where it intersects the line representing ambient PIO2. A ±1 kPa variation in PIO2 predicted a 1.5% and an 8% change in SpO2 with 15% shunt and 0.4 VA/Q, respectively. As a consistency check, two patients with pre-existing lung disease and 12-hour continuous SpO2 monitoring while breathing air had their gas exchange impairment analysed in terms of shunt and reduced VA/Q. The patient with 16% shunt and normal VA/Q had a stable but reduced SpO2 (circa 93±1%) throughout the 12-hour period. The patient with VA/Q reduced to 0.48 had SpO2 ranging from 75-95% during the same period. SpO2 monitoring in suspected Covid-19 patients should focus on SpO2 varying >5% in 30 minutes. Such instability in at-risk patients is not diagnostic of Covid-19 pneumonia, but this may be suspected from a dominant reduction in VA/Q if episodic hypoxaemia has progressed from a stable SpO2.
respiratory medicine
10.1101/2020.12.17.20248126
Oxygen saturation instability in suspected covid-19 patients; contrasting effects of reduced VA/Q and shunt.
Patients in the UK at risk of Covid-19 pneumonia, but not needing immediate hospital attention, are to be given pulse oximeters to identify a fall in oxygen saturation (SaO2 or SpO2) at home. A recent finding in Covid-19 pneumonia is a dominant reduction in ventilation to perfused alveoli (VA/Q). A mathematical model of gas exchange was used to predict the effect of shunt or reduced VA/Q on SaO2 stability inferred from the slope of the PIO2 vs SaO2 curve as it intersected the line representing ambient PIO2. A {+/-}1 kPa variation in PIO2 predicted a 1.5% and 8% change in SpO2 with 15% shunt and 0.4 VA/Q respectively. As a consistency check, two patients with pre-existing lung disease and 12 hour continuous SpO2 monitoring breathing air had gas exchange impairment analysed in terms of shunt and reduced VA/Q. The patient with 16% shunt and normal VA/Q had a stable but reduced SpO2 (circa 93{+/-}1%) throughout the 12 hr period. The patient with a VA/Q reduced to 0.48 had SpO2 ranging from 75-95% during the same period. SpO2 monitoring in suspected covid-19 patients should focus on SpO2 varying >5% in 30 minutes. Such instability in at risk patients is not diagnostic of Covid-19 pneumonia but this may be suspected from a dominant reduction in VA/Q if episodic hypoxaemia has progressed from a stable SpO2.
respiratory medicine
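The contrast between shunt and reduced VA/Q described in this abstract can be illustrated with a standard oxyhaemoglobin dissociation curve. The sketch below uses the Severinghaus (1979) approximation, not the authors' full gas-exchange model, to show why a ±1 kPa swing in PO2 moves SaO2 far more on the steep part of the curve (as with low VA/Q) than on the flat part (as with pure shunt); the chosen PO2 values are illustrative.

```python
# Sketch: oxyhaemoglobin dissociation curve via the Severinghaus (1979)
# approximation. This is NOT the authors' full gas-exchange model; it only
# illustrates why SaO2 is unstable on the steep part of the curve.

def sao2(po2_mmhg: float) -> float:
    """Haemoglobin O2 saturation (%) for a given PO2 in mmHg."""
    p = po2_mmhg
    return 100.0 / (23400.0 / (p**3 + 150.0 * p) + 1.0)

KPA_TO_MMHG = 7.50062  # 1 kPa expressed in mmHg

def swing(po2_mmhg: float, delta_kpa: float = 1.0) -> float:
    """SaO2 change (%) for a +/- delta_kpa swing around a given PO2."""
    d = delta_kpa * KPA_TO_MMHG
    return sao2(po2_mmhg + d) - sao2(po2_mmhg - d)

if __name__ == "__main__":
    # Flat part of the curve (PO2 ~ 100 mmHg, as in well-ventilated units
    # alongside a shunt): a small SaO2 swing.
    print(f"PO2 100 mmHg: SaO2 {sao2(100):.1f}%, +/-1 kPa swing {swing(100):.1f}%")
    # Steep part (PO2 ~ 45 mmHg, as with a markedly reduced VA/Q): the
    # same +/-1 kPa swing moves SaO2 many times more.
    print(f"PO2  45 mmHg: SaO2 {sao2(45):.1f}%, +/-1 kPa swing {swing(45):.1f}%")
```

Evaluating the same ±1 kPa perturbation at the two operating points reproduces the qualitative asymmetry the abstract reports: a stable SpO2 with shunt, an unstable one with low VA/Q.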
10.1101/2020.12.17.20248402
Error Rates in SARS-CoV-2 Testing Examined with Bayesian Inference
A literature review on SARS-CoV-2 reverse-transcription polymerase chain reaction (RT-PCR) is used to construct a clinical test confusion matrix. A simple correction method for bulk test results is then demonstrated with examples. The required sensitivity and specificity of a test are explored for societal needs and use cases, before a sequential analysis of common example scenarios is explored. The analysis suggests that many of the people with mild symptoms and positive test results are unlikely to be infected with SARS-CoV-2 in some regions. It is concluded that current and foreseen alternative tests cannot be used to "clear" people as being non-infected. Recommendations are given that regional authorities must establish a programme to monitor operational test characteristics before launching large-scale testing, and that large-scale testing for tracing infection networks in some regions is not viable, but may be possible in a focused way that does not exceed the working capacity of laboratories staffed by competent experts. RT-PCR tests cannot be solely relied upon as the gold standard for SARS-CoV-2 diagnosis at scale; instead, clinical assessment supported by a range of expert diagnostic tests should be used.
infectious diseases
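The abstract's claim that many mildly symptomatic positives may not be infected follows directly from Bayes' theorem applied to the test confusion matrix. A minimal sketch, using illustrative sensitivity, specificity, and prevalence values rather than figures from the paper:

```python
# Sketch: positive/negative predictive value from a test confusion matrix
# via Bayes' theorem. Sensitivity, specificity, and prevalence below are
# illustrative placeholders, not values reported in the paper.

def ppv(sensitivity: float, specificity: float, prevalence: float) -> float:
    """P(infected | positive test)."""
    tp = sensitivity * prevalence                   # true-positive mass
    fp = (1.0 - specificity) * (1.0 - prevalence)   # false-positive mass
    return tp / (tp + fp)

def npv(sensitivity: float, specificity: float, prevalence: float) -> float:
    """P(not infected | negative test)."""
    tn = specificity * (1.0 - prevalence)           # true-negative mass
    fn = (1.0 - sensitivity) * prevalence           # false-negative mass
    return tn / (tn + fn)

if __name__ == "__main__":
    # At 1% prevalence even a reasonable test yields mostly false
    # positives, and a negative result cannot fully "clear" someone.
    print(f"PPV: {ppv(0.70, 0.95, 0.01):.3f}")
    print(f"NPV: {npv(0.70, 0.95, 0.01):.4f}")
```

With these invented parameters the PPV lands near 0.12, i.e. most positives are false, which is the low-prevalence effect the abstract describes.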
10.1101/2020.12.17.20248389
Framework for enhancing the estimation of model parameters for data with a high level of uncertainty
Reliable data is essential to obtain adequate simulations for forecasting the dynamics of epidemics. In this context, several political, economic, and social factors may cause inconsistencies in the reported data, which undermine the capacity for realistic simulations and predictions. In the case of COVID-19, for example, such uncertainties are mainly caused by large-scale underreporting of cases due to reduced testing capacity in some locations. In order to mitigate the effects of noise in the data used to estimate model parameters, we propose strategies capable of improving the ability to predict the spread of the diseases. Using a compartmental model in a COVID-19 case study, we show that regularization of the data by means of Gaussian Process Regression can reduce the variability of successive forecasts, improving predictive ability. We also present the advantages of adopting compartmental-model parameters that vary over time, rather than the usual approach with constant values.
epidemiology
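As a rough illustration of the kind of compartmental model this framework targets, the sketch below runs a discrete-time SIR model whose transmission rate varies over time; the step change in beta is a simple stand-in for the time-varying parameters the authors advocate, and all values are invented.

```python
# Sketch: discrete-time SIR compartmental model with a time-varying
# transmission rate beta(t). The step change in beta is an illustrative
# stand-in for the time-varying parameters the authors advocate; all
# parameter values are invented.

def sir(n: int, i0: float, beta_of_t, gamma: float, days: int):
    """Simulate S, I, R trajectories; returns three lists of daily values."""
    s, i, r = n - i0, i0, 0.0
    ss, ii, rr = [s], [i], [r]
    for t in range(days):
        beta = beta_of_t(t)
        new_inf = beta * s * i / n   # new infections this day
        new_rec = gamma * i          # new recoveries this day
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
        ss.append(s); ii.append(i); rr.append(r)
    return ss, ii, rr

if __name__ == "__main__":
    # beta drops after day 30, mimicking an intervention.
    beta = lambda t: 0.4 if t < 30 else 0.15
    s, i, r = sir(n=1_000_000, i0=10, beta_of_t=beta, gamma=0.1, days=120)
    print(f"peak infectious: {max(i):.0f} on day {i.index(max(i))}")
```

In the paper's setting, the reported case series would first be smoothed (e.g. by Gaussian Process Regression) before fitting beta(t) to it; that smoothing step is deliberately omitted here.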
10.1101/2020.12.16.20247700
Vitamins D2 and D3 have overlapping but different effects on human gene expression revealed through analysis of blood transcriptomes in a randomised double-blind placebo-controlled food-fortification trial
For the first time, we report the influence of vitamin D2 and vitamin D3 on genome-wide gene expression in whole blood from healthy women representing two ethnic groups, white European and South Asian. In this randomised placebo-controlled trial, participants were given daily physiological doses (15 µg) of either vitamin D2 or D3 for 12 weeks and changes in the transcriptome were compared relative to the transcriptome at baseline. While there was some overlap in the repertoire of differentially expressed genes after supplementation with each vitamin D source, most changes were specific to either vitamin D3 or vitamin D2, suggesting that each form of the vitamin may have different effects on human physiology. Notably, following vitamin D3 supplementation, the majority of changes in gene expression reflected a down-regulation in the activity of genes, many encoding components of the innate and adaptive immune systems. These are consistent with the emerging concept that vitamin D orchestrates a shift in the immune system towards a more tolerogenic status. Moreover, divergent changes were observed following supplementation with either vitamin D3 or vitamin D2 for gene expression associated with type 1 and type 2 interferon activity. This is particularly intriguing as interferons play a critical role in the innate response to infection and aberrant type 1 interferon signalling is implicated in severe COVID-19 disease. The observed differences in gene expression after supplementation with vitamin D2 compared with vitamin D3 warrant a more intensive investigation of the biological effects of the two forms of vitamin D on human physiology. Significance statements: This study suggests that the influence of vitamins D2 and D3 on human physiology may not be the same, as deduced from differences in gene expression within whole blood. South Asian participants were found to respond differently to vitamin D supplementation at the transcriptome level from white Europeans.
The differentially expressed immune pathways identified in this study are consistent with vitamin D orchestrating a more tolerogenic immune status and this could be relevant in the context of the severity of immune response to viral infections such as Covid-19. The potential relevance of this study to severe Covid-19 disease is highlighted by our observed enhancement of type 1 interferon signalling by vitamin D3, but not vitamin D2.
nutrition
10.1101/2020.12.17.20248418
Case finding of early pregnancies at risk of preeclampsia using maternal blood leptin/ceramide ratio: multi-omics discovery and validation from a longitudinal study
ObjectiveTo evaluate whether longitudinal measurements of serological adipokines and sphingolipids can predict preeclampsia early in gestation. DesignRetrospective multi-omics discovery and longitudinal validation. SettingMaternity units in two US hospitals. MethodsA multi-omics approach integrating genomic and lipidomic discoveries was employed to identify leptin (Lep) and ceramide (Cer) as novel PE early gestational biomarkers. The levels of placental growth factor (PlGF), soluble fms-like tyrosine kinase (sFlt-1), Lep, and Cer in maternal sera were then determined by enzyme-linked immunosorbent assay (ELISA) and liquid chromatography-tandem mass spectrometry (LC/MS/MS). Main outcome measuresInterval from positive prediction to confirmative diagnosis. ResultsGenomic meta-analysis compiled six PE placental cohorts with 78 PE and 95 non-PE control placentas. The Testing Cohort included sera from 7 non-PE and 8 PE women collected at confirmatory diagnosis. The Validation Cohort included sera from 20 non-PE and 20 PE women collected longitudinally through gestation. Our findings revealed a marked elevation of the maternal serum Leptin/Ceramide (d18:1/25:0) ratio from early gestation (a median of 23 weeks) when comparing later PE-complicated with uncomplicated pregnancies. The maternal Lep/Cer (d18:1/25:0) ratio significantly outperformed the established sFlt-1/PlGF ratio in predicting PE for sensitivity (85% vs. 40%), positive predictive value (89% vs. 42%), and AUC (0.92 vs. 0.52) from 5 to 25 weeks of gestation. ConclusionsNon-invasive longitudinal assessment by serological evaluation of the Lep/Cer (d18:1/25:0) ratio can case find early pregnancies at risk of preeclampsia, outperforming the sFlt-1/PlGF ratio test. Tweetable abstractNon-invasive longitudinal assessment by serological evaluation of the Lep and Cer ratio can predict preeclampsia early in gestation.
obstetrics and gynecology
10.1101/2020.12.17.20248194
An Efficient and Accurate Distributed Learning Algorithm for Modeling Multi-Site Zero-Inflated Count Outcomes
Clinical research networks (CRNs), made up of multiple healthcare systems each with patient data from several care sites, are beneficial for studying rare outcomes and increasing generalizability of results. While CRNs encourage sharing aggregate data across healthcare systems, individual systems within CRNs often cannot share patient-level data due to privacy regulations, prohibiting multi-site regression, which requires an analyst to access all individual patient data pooled together. Meta-analysis is commonly used to model data stored at multiple institutions within a CRN; while relatively simple to implement, meta-analysis can result in biased estimation, notably in rare-event contexts. We present a communication-efficient, privacy-preserving algorithm for modeling multi-site zero-inflated count outcomes within a CRN. Our method, a one-shot distributed algorithm for performing hurdle regression (ODAH), models zero-inflated count data stored in multiple sites without sharing patient-level data across sites, resulting in estimates closely approximating those that would be obtained in a pooled patient-level data analysis. We evaluate our method through extensive simulations and two real-world data applications using electronic health records (EHRs): examining risk factors associated with pediatric avoidable hospitalization and modeling serious adverse event frequency associated with a colorectal cancer therapy. Relative to existing methods for distributed data analysis, ODAH offers a highly accurate, computationally efficient method for modeling multi-site zero-inflated count data.
health informatics
10.1101/2020.12.18.20248434
Excess deaths among Latino people in California during the COVID-19 pandemic
BackgroundLatino people in the US are experiencing higher excess deaths during the COVID-19 pandemic than any other racial/ethnic group, but it is unclear which subgroups within this diverse population are most affected. Such information is necessary to target policies that prevent further excess mortality and reduce inequities. MethodsUsing death certificate data for January 1, 2016 through February 29, 2020 and time-series models, we estimated the expected weekly deaths among Latino people in California from March 1 through October 3, 2020. We quantified excess mortality as observed minus expected deaths and risk ratios (RR) as the ratio of observed to expected deaths. We considered subgroups defined by age, sex, place of birth, education, occupation, and combinations of these factors. FindingsDuring the first seven months of the pandemic, Latino deaths in California exceeded expected deaths by 10,316, a 31% increase. Excess death rates were greatest for individuals born in Mexico (RR 1.44; 95% PI, 1.41, 1.48) or Central America (RR 1.49; 95% PI, 1.37, 1.64), with less than a high school degree (RR 1.41; 95% PI, 1.35, 1.46), or in food-and-agriculture (RR 1.60; 95% PI, 1.48, 1.74) or manufacturing occupations (RR 1.59; 95% PI, 1.50, 1.69). Immigrant disadvantages in excess death were magnified among working-age Latinos in essential occupations. InterpretationThe pandemic has disproportionately impacted mortality among Latino immigrants and Latinos in unprotected essential jobs; interventions to reduce these disparities should include early vaccination, workplace safety enforcement, and expanded access to medical care. FundingNational Institute on Aging; UCSF. Research in context. Evidence before this study: Several articles have suggested all-cause excess mortality estimates are superior to official COVID-19 counts for assessing the impact of the pandemic on marginalized populations that lack access to testing and healthcare.
We searched PubMed, Google Scholar, and the medRxiv preprint database through December 22, 2020 for studies of ("excess mortality" or "excess death") AND ("COVID-19" or "coronavirus") set in the United States, and we identified two empirical studies with estimates of excess mortality among Latinos during the pandemic. The study set in California (from our research team) found per capita excess mortality was highest among Black and Latino people. The national study found percent excess mortality was significantly higher among Latino people than any other racial/ethnic group. Neither study further disaggregated the diverse Latino population or provided subgroup estimates to clarify why excess pandemic mortality is so high in this population. In the U.S., official COVID-19 statistics are rarely disaggregated by place of birth, education, or occupation, which has resulted in a lack of evidence of how these factors have impacted mortality during the pandemic. No study to date of excess mortality in the U.S. has provided estimates for immigrant or occupational subgroups. Added value of this study: Our population-based observational study of all-cause mortality during the COVID-19 pandemic provides the first estimates of within-group heterogeneity among the Latino population in California, one of the populations hardest hit by COVID-19 in the U.S. We provide the first subgroup estimates by place of birth and occupational sector, in addition to combined estimates by foreign birth, participation in an essential job, and education. In doing so, we reveal that Latino immigrants in essential occupations have the highest risk of excess death during the pandemic among working-age Latinos. We highlight the heightened risk of excess mortality associated with food/agriculture and manufacturing occupational sectors, essential sectors in which workers may lack COVID-19 protections.
Implications of all the available evidence: Our study revealed stark disparities in excess mortality during the COVID-19 pandemic among Latinos, pointing to the particularly high vulnerability of Latino immigrants and Latinos in essential jobs. These findings may offer insight into the disproportionate COVID-19 mortality experienced by immigrants or similarly marginalized groups in other contexts. Interventions to reduce these disparities should include policies enforcing occupational safety, especially for immigrant workers, early vaccination, and expanded access to medical care.
epidemiology
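The excess-mortality quantities in this abstract reduce to simple arithmetic once expected deaths are estimated: excess = observed − expected and RR = observed / expected. A toy sketch follows; the paper fits time-series models to death-certificate data, whereas here "expected" is just a baseline mean and all numbers are invented.

```python
# Sketch: excess deaths and risk ratio (RR) from observed vs expected
# weekly deaths. The paper fits time-series models to 2016-2020 data;
# here "expected" is just a baseline mean and all counts are invented.

def expected_from_baseline(baseline_weeks: list) -> float:
    """Naive expectation: mean of pre-pandemic weekly deaths."""
    return sum(baseline_weeks) / len(baseline_weeks)

def excess_and_rr(observed: list, expected_per_week: float):
    """Return (excess deaths, risk ratio) over the observed period."""
    obs = sum(observed)
    exp = expected_per_week * len(observed)
    return obs - exp, obs / exp

if __name__ == "__main__":
    baseline = [480, 510, 495, 505, 490, 520]   # invented weekly deaths
    pandemic = [650, 700, 720, 690, 710, 680]   # invented weekly deaths
    exp_week = expected_from_baseline(baseline)
    excess, rr = excess_and_rr(pandemic, exp_week)
    print(f"excess deaths: {excess:.0f}, RR: {rr:.2f}")
```

An RR of 1.31, as reported for Latino Californians overall, corresponds to observed deaths running 31% above the expected count.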
10.1101/2020.12.17.20248433
Research Letter: Reactive balance responses after mild traumatic brain injury (mTBI): a scoping review
ObjectiveBalance testing after concussion or mild traumatic brain injury (mTBI) can be useful in determining acute and chronic neuromuscular deficits that are unapparent from symptom scores or cognitive testing alone. Current assessments of balance do not comprehensively evaluate all three classes of balance: maintaining a posture, voluntary movement, and reactive postural response. Despite the utility of reactive postural responses in predicting fall risk in other balance-impaired populations, the effect of mTBI on reactive postural responses remains unclear. This review sought to (1) examine the extent and range of available research on reactive postural responses in people post-mTBI and (2) determine if reactive postural responses (balance recovery) are affected by mTBI. DesignPRISMA Scoping review. MethodsStudies were identified using Medline, Embase, CINAHL, Cochrane Library, Dissertations and Theses Global, PsycINFO, SportDiscus, and Web of Science. Inclusion criteria were: injury classified as mTBI with no confounding central or peripheral nervous system dysfunction beyond those stemming from the mTBI, quantitative measure of reactive postural response, and a discrete, externally driven perturbation was used to test reactive postural response. ResultsA total of 4,747 publications were identified and a total of three studies (5 publications) were included in the review. ConclusionThe limited number of studies available on this topic highlights the lack of knowledge on reactive postural responses after mTBI. This review provides a new direction for balance assessments after mTBI and recommends incorporating all three classes of postural control in future research.
rehabilitation medicine and physical therapy
10.1101/2020.12.17.20248362
History of premorbid depression is a risk factor for COVID-related mortality: Analysis of a retrospective cohort of 1,387 COVID+ patients
BackgroundThe goal of the present work was to examine risk factors for mortality in 1,387 COVID+ patients admitted to a hospital in Suffolk County, NY. MethodsData were collated by the hospital epidemiological service for patients admitted from 3/7/2020-9/1/2020. Time until final discharge or death was the outcome. Cox proportional hazards models were used to estimate time until death among admitted patients. FindingsIn total, 99.06% of cases had resolved, leading to 1,179 discharges and 211 deaths. Length of stay was significantly longer in those who died as compared to those who did not (p=0.007). Of patients who had been discharged (n=1,179), 54 were readmitted and 9 subsequently died. Multivariable-adjusted Cox proportional hazards regression revealed that, in addition to older age, male sex, and heart failure, a history of premorbid depression was a risk factor for COVID-19 mortality (HR = 2.64 [1.54-4.54], P<0.001), and that this association remained after adjusting for age and for neuropsychiatric conditions as well as medical comorbidities including cardiovascular disease and pulmonary conditions. Sex-stratified analyses revealed that the association between mortality and depression was strongest in males (aHR = 4.45 [2.04-9.72], P<0.001), and that the association between heart failure and mortality was strongest in participants aged <65 years old (aHR = 30.50 [9.17-101.48], P<0.001). InterpretationWhile an increasing number of studies have identified comorbid medical conditions and patient age as risk factors for mortality in COVID+ patients, this study reports that a history of depression is a risk factor for COVID mortality. FundingNo funding was received for this study.
infectious diseases
10.1101/2020.12.17.20248445
Classification of the infection status of COVID-19 in 190 countries
We propose a simple method to determine the infection rate from the time dependence of the daily confirmed new cases, in which the logarithm of the rate is fitted by piece-wise quadratic functions. Exploiting this method, we analyze the time dependence of the outbreak of COVID-19 in 190 countries around the world and determine the status of the outbreak in each country by the dependence of the infection rate on the number of new cases. We show that the infection status of each country can be completely classified into nine different states, and that the infection status of countries that succeeded in controlling COVID-19 underscores the importance of quarantine and/or self-isolation measures.
epidemiology
10.1101/2020.12.18.20224733
Assessing the causal role of sleep traits on glycated haemoglobin: a Mendelian randomization study
ObjectiveTo examine the effects of sleep traits on glycated haemoglobin (HbA1c). DesignObservational multivariable regression (MVR), one-sample Mendelian randomization (1SMR), and two-sample summary data Mendelian randomization (2SMR). SettingUK Biobank (UKB) prospective cohort study and genome-wide association studies from the Meta-Analyses of Glucose and Insulin-related traits Consortium (MAGIC). ParticipantsIn MVR and 1SMR, participants were adults (mean (SD) age 57 (8) years; 54% female) from the UKB (n=336,999); in 2SMR, participants were adults (53 (11) years; 52% female) from MAGIC (n=46,368). All participants were adults of European ancestry. ExposuresSelf-reported insomnia frequency (usually vs sometimes or rarely/never); sleep duration: 24-hour sleep duration (hours/day); short sleep (≤6 hours vs 7-8 hours) and long sleep (≥9 hours vs 7-8 hours); daytime sleepiness and daytime napping (each consisting of 3 categories: never/rarely, sometimes, usually); chronotype (5 categories from definite morning to definite evening preference). Main outcome measureHbA1c in standard deviation (SD) units. ResultsAcross MVR, 1SMR, 2SMR, and their sensitivity analyses, we found a higher frequency of insomnia (usually vs sometimes or rarely/never) was associated with higher HbA1c (MVR: 0.053 SD units, 95% confidence interval (0.046 to 0.061); 1SMR: 0.52 (0.42 to 0.63); 2SMR: 0.22 (0.10 to 0.35)). Results remained significant but point estimates were somewhat attenuated after excluding people with diagnosed diabetes. For other sleep traits, there was less consistency, with significant associations when using some, but not all, methods. ConclusionsThis study suggests that insomnia increases HbA1c levels. These findings could have important implications for developing and evaluating strategies that improve sleep habits to reduce hyperglycaemia and prevent diabetes.
Summary box. What is already known on this topic: In observational data, insomnia, short sleep duration, and evening preference are associated with higher risk for type 2 diabetes. Mendelian randomization (MR) studies have not found evidence of a causal effect of short sleep on type 2 diabetes or glycaemic traits but have indicated an effect of insomnia on type 2 diabetes. It is unclear whether insomnia influences HbA1c levels, a marker of long-term hyperglycaemia, in the general population. Recently identified genetic variants robustly associated with insomnia, sleep duration, daytime sleepiness, napping, and chronotype can be used in MR studies to explore causal effects of these sleep traits on HbA1c levels. What this study adds: This study suggests that a higher frequency of insomnia increases HbA1c levels in the general population and after excluding people with diabetes. We found no robust evidence for causal effects of other sleep traits on HbA1c levels. These findings improve our understanding of the impact of sleep traits on HbA1c levels and have important implications for developing and evaluating strategies that improve sleep habits to reduce hyperglycaemia and prevent diabetes.
epidemiology
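In two-sample MR, each genetic variant yields a Wald-ratio estimate (the variant-outcome effect divided by the variant-exposure effect), and variant-level ratios are commonly pooled by inverse-variance weighting (IVW). A hypothetical sketch with invented summary statistics, not figures from UKB or MAGIC:

```python
# Sketch: two-sample Mendelian randomization via per-variant Wald ratios
# pooled by inverse-variance weighting (IVW). All summary statistics
# below are invented for illustration, not taken from UKB/MAGIC.
from math import sqrt

def wald_ratio(beta_outcome: float, beta_exposure: float) -> float:
    """Per-variant causal estimate: SNP-outcome / SNP-exposure effect."""
    return beta_outcome / beta_exposure

def ivw(ratios, ses):
    """Inverse-variance weighted pooled estimate and its standard error."""
    weights = [1.0 / se**2 for se in ses]
    est = sum(w * r for w, r in zip(weights, ratios)) / sum(weights)
    return est, sqrt(1.0 / sum(weights))

if __name__ == "__main__":
    # (beta_exposure, beta_outcome, se_of_ratio) per SNP -- invented.
    snps = [(0.10, 0.022, 0.05), (0.08, 0.020, 0.07), (0.12, 0.030, 0.06)]
    ratios = [wald_ratio(bo, be) for be, bo, _ in snps]
    est, se = ivw(ratios, [se for *_, se in snps])
    print(f"IVW estimate: {est:.3f} (SE {se:.3f})")
```

The pooled estimate is a weighted average of the per-variant ratios, with precise variants (small SE) dominating, which is the basic logic behind the 2SMR figures reported above.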
10.1101/2020.12.18.20245068
Bayesian modeling for the detection of adverse events underreporting in clinical trials
IntroductionSafety underreporting is a recurrent issue in clinical trials that can impact patient safety and data integrity. Clinical Quality Assurance (QA) practices used to detect underreporting rely on on-site audits; however, adverse event underreporting remains a recurrent issue. In a recent project, we developed a predictive model that enables oversight of Adverse Event (AE) reporting for clinical Quality Program Leads (QPL). However, there were limitations to using solely a machine learning model. ObjectiveOur primary objective was to propose a robust method to compute the probability of AE underreporting that could complement our machine learning model. Our model was developed to enhance patient safety while reducing the need for on-site and manual QA activities in clinical trials. MethodsWe used a Bayesian hierarchical model to estimate the site reporting rates and assess the risk of underreporting. We designed the model with public, anonymized Project Data Sphere clinical trial data. ResultsWe built a model that infers site reporting behavior from patient-level observations and compares sites across a study to enable robust detection of outliers between clinical sites. ConclusionThe new model will be integrated into the current dashboard designed for clinical Quality Program Leads. This approach reduces the need for on-site audits, shifting focus from source data verification (SDV) to pre-identified, higher-risk areas. It will enhance further quality assurance activities for safety reporting from clinical trials and generate quality evidence during pre-approval inspections. The preprint version of this work is available on medRxiv: https://doi.org/10.1101/2020.12.18.20245068 Key points: Safety underreporting is a recurrent issue in clinical trials that can impact patient safety and data integrity. We used a Bayesian hierarchical model to estimate the site reporting rates and assess the risk of underreporting. This model complements our previously published machine learning approach and is used by clinical quality professionals to better detect safety underreporting.
health systems and quality improvement
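One simple way to see the hierarchical idea: treat each site's AE count as Poisson with exposure equal to patient-time, place a shared Gamma prior on the rate, and flag sites whose posterior rate is implausibly low. The conjugate-prior sketch below is a toy stand-in for the paper's full Bayesian hierarchical model; all site data and thresholds are invented.

```python
# Sketch: flagging potential AE underreporting with a conjugate
# Gamma-Poisson model. Each site's AE count is Poisson(rate * exposure)
# with a shared Gamma(alpha, beta) prior on the rate. This is a toy
# stand-in for the paper's full Bayesian hierarchical model; all site
# data below are invented.

def posterior_rate(events: int, exposure: float,
                   alpha: float = 2.0, beta: float = 1.0) -> float:
    """Posterior mean AE rate under Gamma(alpha+events, beta+exposure)."""
    return (alpha + events) / (beta + exposure)

def flag_low_reporters(sites: dict, threshold: float):
    """Return names of sites whose posterior mean rate is below threshold."""
    return sorted(name for name, (events, exposure) in sites.items()
                  if posterior_rate(events, exposure) < threshold)

if __name__ == "__main__":
    # (AE count, patient-years of exposure) per site -- invented.
    sites = {"site_A": (40, 20.0), "site_B": (35, 18.0), "site_C": (3, 19.0)}
    rates = {s: posterior_rate(e, x) for s, (e, x) in sites.items()}
    print({s: round(r, 2) for s, r in rates.items()})
    print("flagged:", flag_low_reporters(sites, threshold=1.0))
```

The shared prior shrinks small-sample sites toward the study-wide rate, so a site is only flagged when its data strongly contradict the rest of the study, which is the outlier-detection behaviour the abstract describes.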
10.1101/2020.12.18.20248255
Modeling effectiveness of testing strategies to prevent COVID-19 in nursing homes--United States, 2020
BackgroundSARS-CoV-2 outbreaks in nursing homes can be large with high case fatality. Identifying asymptomatic individuals early through serial testing is recommended to control COVID-19 in nursing homes, both in response to an outbreak ("outbreak testing" of residents and healthcare personnel) and in facilities without outbreaks ("non-outbreak testing" of healthcare personnel). The effectiveness of outbreak testing and isolation with or without non-outbreak testing was evaluated. MethodsUsing published SARS-CoV-2 transmission parameters, the fraction of SARS-CoV-2 transmissions prevented through serial testing (weekly, every three days, or daily) and isolation of asymptomatic persons compared to symptom-based testing and isolation was evaluated through mathematical modeling using a Reed-Frost model to estimate the percentage of cases prevented (i.e., "effectiveness") through either outbreak testing alone or outbreak plus non-outbreak testing. The potential effect of simultaneous decreases (by 10%) in the effectiveness of isolating infected individuals when instituting testing strategies was also evaluated. ResultsModeling suggests that outbreak testing could prevent 54% (weekly testing with 48-hour test turnaround) to 92% (daily testing with immediate results and 50% relative sensitivity) of SARS-CoV-2 infections. Adding non-outbreak testing could prevent up to an additional 8% of SARS-CoV-2 infections (depending on test frequency and turnaround time). However, added benefits of non-outbreak testing were mostly negated if accompanied by decreases in infection control practice. ConclusionsWhen combined with high-quality infection control practices, outbreak testing could be an effective approach to preventing COVID-19 in nursing homes, particularly if optimized through increased test frequency and use of tests with rapid turnaround. 
SummaryMathematical modeling evaluated the effectiveness of serially testing asymptomatic persons in a nursing home in response to a SARS-CoV-2 outbreak with or without serial testing of asymptomatic staff in the absence of known SARS-CoV-2 infections.
infectious diseases
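The Reed-Frost chain-binomial logic is compact: each susceptible independently escapes infection from each of the C_t infectious cases, so expected new cases are S_t · (1 − (1 − p)^C_t). A deterministic sketch with invented parameters; the paper's model additionally layers testing frequency, turnaround time, and isolation effectiveness on top of this, which the crude halving of p below only gestures at.

```python
# Sketch: deterministic Reed-Frost chain-binomial epidemic. Each
# susceptible escapes infection by each of c infectious cases with
# probability (1 - p), so expected new cases are s * (1 - (1-p)**c).
# Parameters are invented; the paper's model additionally accounts for
# testing frequency, turnaround time, and isolation effectiveness.

def reed_frost(s0: float, c0: float, p: float, generations: int):
    """Expected cases per generation; returns the list of case counts."""
    s, c = s0, c0
    cases = [c]
    for _ in range(generations):
        new = s * (1.0 - (1.0 - p) ** c)  # expected new cases
        s, c = s - new, new
        cases.append(c)
    return cases

def total_cases(cases):
    return sum(cases)

if __name__ == "__main__":
    unmitigated = reed_frost(s0=100, c0=1, p=0.03, generations=15)
    # Crude stand-in for testing-and-isolation: halve the effective p.
    mitigated = reed_frost(s0=100, c0=1, p=0.015, generations=15)
    averted = 1 - total_cases(mitigated) / total_cases(unmitigated)
    print(f"fraction of cases averted: {averted:.0%}")
```

Comparing total cases with and without the reduced transmission probability gives a percentage-of-cases-prevented figure analogous in spirit to the effectiveness measure the abstract reports.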
10.1101/2020.12.18.20248499
Mental Disorder Prevalence Among Populations Impacted by Coronavirus Pandemics: A Multilevel Meta-Analytic Study of COVID-19, MERS & SARS
ObjectiveThrough a systematic review and meta-analysis of research on COVID-19, severe acute respiratory syndrome (SARS) and middle east respiratory syndrome (MERS) pandemics, we investigated whether mental disorder prevalence: (a) was elevated among populations impacted by coronavirus pandemics (relative to unselected populations reported in the literature), and (b) varied by disorder (undifferentiated psychiatric morbidity, anxiety, depressive, posttraumatic stress disorders [PTSD]) and impacted population (community, infected/recovered, healthcare provider, quarantined). MethodFrom 68 publications (N=87,586 participants), 808 estimates were included in a series of multilevel meta-analyses/regressions including random effects to account for estimates nested within studies. ResultsMedian summary point prevalence estimates varied by disorder and population. Psychiatric morbidity (20%-56%), PTSD (10-26%) and depression (9-27%) were most prevalent in most populations. The highest prevalence of each disorder was found among infected/recovered adults (18-56%), followed by healthcare providers (11-28%) and community adults (11-20%). Prevalence estimates were often notably higher than reported for unselected samples. Sensitivity analyses demonstrated that overall prevalence estimates moderately varied by pandemic, study location, and mental disorder measure type. ConclusionCoronavirus pandemics are associated with multiple mental disorders in several impacted populations. Needed are investigations of causal links between specific pandemic-related stressors, threats, and traumas and mental disorders.
psychiatry and clinical psychology
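Prevalence estimates like these are typically pooled with a random-effects model. A minimal DerSimonian-Laird sketch on the proportion scale follows; the paper uses multilevel models with estimates nested within studies, which this toy version does not attempt, and all inputs are invented.

```python
# Sketch: DerSimonian-Laird random-effects pooling of prevalence
# estimates (on the proportion scale, for simplicity). The paper fits
# multilevel models with estimates nested within studies, which this
# toy version does not attempt; all inputs are invented.
from math import sqrt

def dersimonian_laird(estimates, variances):
    """Pooled estimate, its SE, and between-study variance tau^2."""
    k = len(estimates)
    w = [1.0 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, estimates)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, estimates))
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)          # method-of-moments estimate
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_star, estimates)) / sum(w_star)
    return pooled, sqrt(1.0 / sum(w_star)), tau2

if __name__ == "__main__":
    prev = [0.18, 0.26, 0.11, 0.22]          # invented PTSD prevalences
    var = [0.0004, 0.0009, 0.0002, 0.0006]   # invented sampling variances
    pooled, se, tau2 = dersimonian_laird(prev, var)
    print(f"pooled prevalence: {pooled:.3f} (SE {se:.3f}, tau2 {tau2:.5f})")
```

A non-zero tau² corresponds to the between-study heterogeneity the abstract attributes to pandemic, location, and measure type; the multilevel approach additionally models the nesting of multiple estimates within one study.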
10.1101/2020.12.18.20248404
Consensus based framework for digital mobility monitoring
Digital mobility assessment using wearable sensor systems has the potential to capture walking performance in a patient's natural environment. It enables the monitoring of health status and disease progression and the evaluation of interventions in real-world situations. In contrast to laboratory settings, real-world walking occurs in non-conventional environments and under unconstrained and uncontrolled conditions. Despite this general understanding, there is a lack of agreed definitions about what constitutes real-world walking, impeding the comparison and interpretation of the acquired data across systems and studies. Hence, there is a need for a terminological framework to guide further implementation of digital measures for gait assessment. We used an objective methodology based on an adapted Delphi process to obtain consensus on specific terminology related to real-world walking by asking a diverse panel of clinical, scientific, and industrial stakeholders. Six constituents (real-world, walking, purposeful, walking bout, walking speed, turning) were successfully defined in two feedback rounds. The identification of a consented set of real-world walking definitions has important implications for the development of assessment and analysis protocols, as well as for the reporting and comparison of digital mobility outcomes across studies and systems. The definitions will serve as a common framework for implementing digital and mobile technologies for gait assessment.
neurology
10.1101/2020.12.19.20248564
Efficacy and safety of TiNO-coated stents versus drug-eluting coronary stents. Systematic literature review and meta-analysis.
Objectives: To compare clinical outcomes after percutaneous coronary intervention (PCI) using titanium-nitride-oxide coated stents (TiNOS) versus drug-eluting stents (DES) in coronary artery disease (CAD), including acute coronary syndrome (ACS). Design: Prospective systematic literature review (SLR) conducted according to PRISMA. Medline, Embase, Cochrane, and Web of Science were searched in March 2018 and subsequently updated. Setting: Interventional cardiology. Participants: Patients with CAD, including ACS, requiring PCI. Interventions: All prospective randomized controlled trials (RCTs) that compared clinical outcomes after PCI with DES versus TiNOS. Outcome measures: Pooled risk ratios (RR), TiNOS over DES, with 95% confidence intervals (CI), are computed for device-oriented Major Adverse Cardiac Events (MACE), non-fatal myocardial infarction (MI), cardiac death (CD), clinically driven target lesion revascularization (TLR), probable or definite stent thrombosis (ST), and total mortality, at one to five years after PCI. Pooled RRs are stratified according to baseline ACS versus other CAD. Sensitivity analysis (SA) and certainty of the evidence are rated per GRADE. Results: Five RCTs are eligible, with 1,855 patients with TiNOS versus 1,363 with DES at 1-year follow-up and 783 versus 771 at 5 years. Three RCTs included patients with ACS only. One-year RRs in ACS are: MACE 0.93 [0.72, 1.20], MI 0.48 [0.31, 0.73], CD 0.66 [0.33, 1.31], TLR 1.55 [1.10, 2.19], and ST 0.35 [0.20, 0.64]. One-year MACE, MI, and ST are robust to SA. The certainty of the evidence is high for MACE, moderate for MI, and low or very low for the other endpoints. There are too few observations to draw conclusions about other CAD and 5-year outcomes; however, 5-year interim results are consistent with 1-year conclusions. Conclusions: A similar risk of MACE is found for TiNOS and DES, with potentially fewer MI and ST but more TLR with TiNOS. TiNOS are safe and effective in ACS at 1-year follow-up.
Registration: PROSPERO CRD42018090622. Strengths and limitations of this study: Strengths: (1) The level of certainty of the evidence is high for the primary endpoint at one-year follow-up in patients treated for acute coronary syndrome. (2) The primary endpoint and critical secondary endpoints are robust to sensitivity analysis. Limitations: (1) Outcomes in patients treated for chronic coronary artery disease cannot be analyzed. (2) The level of certainty of the evidence for secondary endpoints is moderate or low. (3) Analysis of five-year outcomes is still at an interim stage.
cardiovascular medicine
10.1101/2020.12.21.20248648
Adverse Cardiovascular Complications Following Prescription of Programmed Cell Death 1 (PD-1) and Programmed Cell Death Ligand 1 (PD-L1) Inhibitors: A Propensity-Score Matched Cohort Study with Competing Risk Analysis
Background: Programmed death-1 (PD-1) and programmed death-ligand 1 (PD-L1) inhibitors, such as pembrolizumab, nivolumab and atezolizumab, are major classes of immune checkpoint inhibitors that are increasingly used for cancer treatment. However, their use is associated with adverse cardiovascular events. We examined the incidence of new-onset cardiac complications in patients receiving PD-1 or PD-L1 inhibitors. Methods: Patients receiving PD-1 or PD-L1 inhibitors since their launch up to 31st December 2019 at publicly funded hospitals of Hong Kong, China, without pre-existing cardiac complications were included. The primary outcome was a composite of incident heart failure, acute myocardial infarction, atrial fibrillation or atrial flutter, with a last follow-up date of 31st December 2020. Propensity score matching between PD-L1 inhibitor use and PD-1 inhibitor use with a 1:2 ratio for patient demographics, past comorbidities and non-PD-1/PD-L1 medications was performed. Results: A total of 1959 patients were included. Over a median follow-up of 247 days (interquartile range [IQR]: 72-506), 320 (incidence rate [IR]: 16.31%) patients met the primary outcome after PD-1/PD-L1 treatment: 244 (IR: 12.57%) with heart failure, 38 (IR: 1.93%) with acute myocardial infarction, 54 (IR: 2.75%) with atrial fibrillation, and 6 (IR: 0.31%) with atrial flutter. Compared with PD-1 inhibitor treatment, PD-L1 inhibitor treatment was significantly associated with lower risks of the composite outcome both before (hazard ratio [HR]: 0.32, 95% CI: [0.18-0.59], P value=0.0002) and after matching (HR: 0.34, 95% CI: [0.18-0.65], P value=0.001), and with lower all-cause mortality risks before matching (HR: 0.77, 95% CI: [0.64-0.93], P value=0.0078) and after matching (HR: 0.80, 95% CI: [0.65-1.00], P value=0.0463).
Patients who developed cardiac complications had shorter average readmission intervals and a higher number of hospitalizations after treatment with PD-1/PD-L1 inhibitors in both the unmatched and matched cohorts (P value<0.0001). Competing risk analyses with cause-specific and subdistribution hazard models, and multiple approaches based on the propensity score, all confirmed these observations. Conclusions: Compared with PD-1 treatment, PD-L1 treatment was significantly associated with a lower risk of new-onset cardiac complications and all-cause mortality, both before and after propensity score matching.
cardiovascular medicine
10.1101/2020.12.18.20248346
The Association Between Alpha-1 Adrenergic Receptor Antagonists and In-Hospital Mortality from COVID-19
Effective therapies for coronavirus disease 2019 (COVID-19) are urgently needed, and preclinical data suggest alpha-1 adrenergic receptor antagonists (α1-AR antagonists) may be effective in reducing mortality related to hyperinflammation independent of etiology. Using a retrospective cohort design with patients in the Department of Veterans Affairs healthcare system, we use doubly robust regression and matching to estimate the association between baseline use of α1-AR antagonists and likelihood of death due to COVID-19 during hospitalization. Having an active prescription for any α1-AR antagonist (tamsulosin, silodosin, prazosin, terazosin, doxazosin, or alfuzosin) at the time of admission had a significant negative association with in-hospital mortality (relative risk reduction 18%; odds ratio 0.73; 95% CI 0.63 to 0.85; p ≤ 0.001) and death within 28 days of admission (relative risk reduction 17%; odds ratio 0.74; 95% CI 0.65 to 0.84; p ≤ 0.001). In a subset of patients on doxazosin specifically, an inhibitor of all three alpha-1 adrenergic receptors, we observed a relative risk reduction for death of 74% (odds ratio 0.23; 95% CI 0.03 to 0.94; p = 0.028) compared to matched controls not on any α1-AR antagonist at the time of admission. These findings suggest that use of α1-AR antagonists may reduce mortality in COVID-19, supporting the need for randomized, placebo-controlled clinical trials in patients with early symptomatic infection.
epidemiology
10.1101/2020.12.19.20248374
Airborne Transmission of Virus-Laden Aerosols inside a Music Classroom: Effects of Portable Purifiers and Aerosol Injection Rates
The ongoing COVID-19 pandemic has shifted attention to the airborne transmission of exhaled droplet nuclei within indoor environments. The spread of aerosols through singing and musical instruments in music performances has necessitated precautionary methods such as masks and portable purifiers. This study investigates the effects of placing portable air purifiers at different locations inside a classroom, as well as the effects of different aerosol injection rates (e.g., with and without masks, different musical instruments and different injection modes). Aerosol deposition, airborne concentration and removal are analyzed in this study. It was found that using purifiers could help in achieving ventilation rates close to the values prescribed by the World Health Organization (WHO), while also achieving aerosol removal times within the guidelines recommended by the Centers for Disease Control and Prevention (CDC). This could help in deciding break periods between classroom sessions, which were found to be around 25 minutes in this study. Moreover, proper placement of purifiers could offer significant advantages in reducing airborne aerosol numbers (offering orders of magnitude higher aerosol removal compared to nearly zero removal with no purifiers), whereas improper placement of the purifiers could worsen the situation. The study suggests that, to yield a benefit, the purifier should be placed close to the aerosol injector and away from the people to be protected. The injection rate was found to have an almost linear correlation with the average airborne aerosol suspension rate and deposition rate, which could be used to predict the trends for scenarios with other injection rates.
epidemiology
10.1101/2020.12.21.20248594
Estimating the risk of incident SARS-CoV-2 infection among healthcare workers in quarantine hospitals: the Egyptian example
In response to the COVID-19 epidemic, Egypt established a unique care model based on quarantine hospitals where only externally-referred confirmed COVID-19 patients were admitted, and healthcare workers (HCWs) resided continuously over 1- to 2-week working shifts. While the COVID-19 risk for HCWs has been widely reported in standard healthcare settings, it has not yet been evaluated in quarantine hospitals. Here, we relied on longitudinal data, including results of routine RT-PCR tests, collected within three quarantine hospitals located in Cairo and Fayoum, Egypt. Using a model-based approach that accounts for the time-since-exposure variation in false-negative rates of RT-PCR tests, we computed the incidence of SARS-CoV-2 infection among HCWs. Over a total follow-up of 6,064 person-days (PD), we estimated an incidence rate (per 100 PD) of 1.05 (95% CrI: 0.58-1.65) at Hospital 1, 1.92 (95% CrI: 0.93-3.28) at Hospital 2 and 7.62 (95% CrI: 3.47-13.70) at Hospital 3. The probability for an HCW to be infected at the end of a shift was 13.7% (95% CrI: 7.8%-20.8%) and 23.8% (95% CrI: 12.2%-37.3%) for a 2-week shift at Hospital 1 and Hospital 2, respectively, which lies within the range of risk levels previously documented in standard healthcare settings, whereas it was >3-fold higher for a 7-day shift at Hospital 2 (42.6%, 95% CrI: 21.9%-64.4%). Our model-based estimates unveil a proportion of undiagnosed infections among HCWs of 46.4% (95% CrI: 18.8%-66.7%), 45.0% (95% CrI: 5.6%-70.8%) and 59.2% (95% CrI: 34.8%-78.8%) for Hospitals 1 to 3, respectively. The large variation in SARS-CoV-2 incidence we document here suggests that HCWs from quarantine hospitals may face a high occupational risk of infection, but that, with sufficient anticipation and infection control measures, this risk can be brought down to levels similar to those observed in standard healthcare settings.
WHAT THIS PAPER ADDS. What is already known on this topic: Previous studies conducted in standard care settings have documented that frontline healthcare workers (HCWs) face a high risk of COVID-19. Whether risk levels differ in alternative care models, such as COVID-19 quarantine hospitals in Egypt where HCWs resided in the hospital day and night for various durations, is unknown. What this study adds: COVID-19 risk for HCWs in quarantine hospitals varies substantially between facilities, from risk levels in the range of those documented in standard healthcare settings to levels approximately 3 times higher. How this study might affect research, practice or policy: With sufficient anticipation and infection control measures, occupational COVID-19 risk for HCWs working in quarantine hospitals can be brought down to levels similar to those observed in standard healthcare settings.
epidemiology
10.1101/2020.12.22.20248622
The interplay between vaccination and social distancing strategies affects COVID-19 population-level outcomes
Social distancing is an effective population-level mitigation strategy to prevent COVID-19 propagation, but it does not reduce the number of susceptible individuals and bears severe social consequences--a dire situation that can be overcome with the recently developed vaccines. Although a combination of these interventions should provide greater benefits than their isolated deployment, a mechanistic understanding of the interplay between them is missing. To tackle this challenge we developed an age-structured deterministic model in which vaccines are deployed during the pandemic to individuals who, in the eyes of public health, are susceptible (do not show symptoms). The model allows for flexible and dynamic prioritization strategies with shifts between target groups. We find a strong interaction between social distancing and vaccination in their effect on the proportion of hospitalizations. In particular, prioritizing vaccines to the elderly (60+) before adults (20-59) is more effective when social distancing is applied to adults or uniformly. In addition, the temporal reproductive number Rt is only affected by vaccines when they are deployed at sufficiently high rates and in tandem with social distancing. Finally, the same reduction in hospitalization can be achieved via different combinations of strategies, giving decision makers flexibility in choosing public health policies. Our study provides insights into the factors that affect vaccination success and provides a methodology to test different intervention strategies in a way that will align with ethical guidelines. Author summary: A major question in epidemiology is how to combine intervention methods in an optimal way. With the recent deployment of COVID-19 vaccines, this question is now particularly relevant. Using a data-driven model in which vaccines are deployed during the pandemic and their prioritization can shift between target groups, we show that there is a strong interplay between these interventions.
For example, prioritizing vaccines to the elderly--the common strategy worldwide--results in a larger reduction in hospitalizations when social distancing is applied to adults than to the elderly. Importantly, the reduction in hospitalizations can be achieved via multiple combinations of intervention strategies, allowing for flexible public health policies.
epidemiology
10.1101/2020.12.21.20248431
How optimal allocation of limited testing capacity changes epidemic dynamics
Insufficient testing capacity continues to be a critical bottleneck in the worldwide fight against COVID-19. Optimizing the deployment of limited testing resources has therefore emerged as a keystone problem in pandemic response planning. Here, we use a modified SEIR model to optimize testing strategies under a constraint of limited testing capacity. We define pre-symptomatic, asymptomatic, and symptomatic infected classes, and assume that positively tested individuals are immediately moved into quarantine. We further define two types of testing. Clinical testing focuses only on the symptomatic class. Non-clinical testing detects pre- and asymptomatic individuals from the general population, and an "information" parameter governs the degree to which such testing can be focused on high infection risk individuals. We then solve for the optimal mix of clinical and non-clinical testing as a function of both testing capacity and the information parameter. We find that purely clinical testing is optimal at very low testing capacities, supporting early guidance to ration tests for the sickest patients. Additionally, we find that a mix of clinical and non-clinical testing becomes optimal as testing capacity increases. At high but empirically observed testing capacities, a mix of clinical testing and unfocused (information=0) non-clinical testing becomes optimal. We further highlight the advantages of early implementation of testing programs, and of combining optimized testing with contact reduction interventions such as lockdowns, social distancing, and masking.
epidemiology
10.1101/2020.12.21.20248665
The EEG multiverse of schizophrenia
Research on schizophrenia typically focuses on one paradigm for which clear-cut differences between patients and controls are established. Great care is taken to understand the underlying genetic, neurophysiological, and cognitive mechanisms, which eventually may explain the clinical outcome. One tacit assumption of these deep-rooting approaches is that paradigms tap into common and representative aspects of the disorder. Here, we analyzed the resting-state electroencephalogram (EEG) of 121 schizophrenia patients and 75 controls. Using multiple signal processing methods, we extracted 194 EEG features. Sixty-nine out of the 194 EEG features showed a significant difference between patients and controls, indicating that these features detect an important aspect of schizophrenia. Surprisingly, the correlations between these features were very low, suggesting that each feature picks up a different aspect of the disorder. We propose that complementing deep- with shallow-rooting approaches, where many roughly independent features are extracted from one paradigm (or several paradigms), will strongly improve the diagnosis and potential treatment of schizophrenia.
psychiatry and clinical psychology
10.1101/2020.12.19.20248557
Intracranial EEG biomarkers for seizure lateralization in rapidly-bisynchronous epilepsy after laser corpus callosotomy.
Objective: It has been asserted that high-frequency analysis of intracranial EEG (iEEG) data may yield information useful in localizing epileptogenic foci. Methods: We tested whether proposed biomarkers could predict lateralization based on iEEG data collected prior to corpus callosotomy (CC) in patients with bisynchronous epilepsy whose seizures lateralized definitively post-CC. Lateralization data derived from algorithmically computed ictal phase-locked high gamma (PLHG), high gamma amplitude (HGA) and line length (LL), as well as interictal high-frequency oscillation (HFO) and interictal epileptiform discharge (IED) rate metrics, were compared against ground-truth lateralization from post-CC ictal iEEG. Results: Pre-CC unilateral IEDs were more frequent on the more pathologic side in all subjects. HFO rate predicted lateralization in one subject but was sensitive to detection threshold. On pre-CC data, no ictal metric showed better predictive power than any other. All post-corpus-callosotomy seizures lateralized to the pathological hemisphere using the PLHG, HGA and LL metrics. Conclusions: While quantitative metrics of IED rate and ictal HGA, PLHG and LL all accurately lateralize based on post-CC iEEG, only IED rate consistently does so based on pre-CC data. Significance: Quantitative analysis of IEDs may be useful in localizing seizure pathology. More work is needed to develop reliable techniques for high-frequency iEEG analysis. Highlights: (1) We evaluated intracranial EEG biomarkers pre-operatively in corpus callosotomy patients with bisynchronous seizures. (2) Despite testing more contemporary metrics, only interictal epileptiform discharge counting consistently lateralized seizure foci. (3) High-frequency metrics, especially high-frequency oscillation counting, appear to be sensitive to parameter selection.
neurology
10.1101/2020.12.19.20248493
Optimizing ventilation cycles to control airborne transmission risk of SARS-CoV2 in school classrooms
Open schools in winter in highly epidemic areas pose a controversial issue: ventilation of classrooms (an essential mitigation factor for airborne transmission) is expected to decrease considerably as outdoor temperatures get colder and regulators allow less restrictive policies on window closure. Fundamental questions to be addressed are therefore: to what extent can we contain airborne transmission risk in schools? What would be the optimal ventilation strategy during the cold season, considering that most schools are not equipped with mechanical ventilation systems? To answer these questions, a risk model for airborne transmission of COVID-19 in classrooms has been developed, based on previous models for tuberculosis and influenza. The separate cases of an infective student and an infective teacher, as well as an infective teacher with a microphone, are investigated. We explored 3500 different air ventilation cycles for different lesson+break times and carried out a numerical optimization of the risk function. Safe risk zones for break and lesson durations were estimated, combining the effect of surgical masks and optimal window-opening cycles.
occupational and environmental health
10.1101/2020.12.19.20248484
GARD is a pan-cancer predictor of clinical outcome in radiation treated patients.
Background: Despite advances in cancer genomics, radiation therapy (RT) is still prescribed based on an empiric one-size-fits-all paradigm. Previously, we proposed a novel algorithm using the genomic adjusted radiation dose (GARD) to personalize the RT prescription dose based on the biological effect of a given physical RT dose, calculated using individual tumor genomics. We hypothesize that GARD will reveal interpatient heterogeneity associated with opportunities to improve outcomes compared to physical RT dose alone. To test this hypothesis, and the GARD-based RT dosing paradigm, we performed a pooled pan-cancer analysis of 11 separate clinical cohorts of 1,615 unique patients with 7 different cancer types, representing all available cohorts with the data required to calculate GARD together with clinical outcome. Methods: Using 11 previously published datasets of cancers including breast, head and neck, non-small cell lung, pancreas, endometrium, melanoma and glioma, we defined two clinical endpoints: (i) time to first recurrence and (ii) overall survival, comprising 1,298 (982 +RT, 316 -RT) and 677 patients (424 +RT, 253 -RT), respectively. We used Cox regression stratified by cohort to test the association between GARD and outcome, with separate models using RT dose and sham-GARD for comparison. Interaction tests between GARD and treatment (+/- RT) were performed using the Wald statistic. Results: Pooled analysis of all available data reveals that GARD as a continuous variable is associated with recurrence (HR = 0.982, CI [0.970, 0.994], p = 0.002) and survival (HR = 0.970, CI [0.953, 0.988], p = 0.001). The interaction test revealed that the effect of GARD on survival depends on whether or not the patient received RT (Wald statistic: p=0.011). Physical RT dose and sham-GARD were not significantly associated with either outcome.
ConclusionsThe biologic effect of radiation therapy, as quantified by GARD, is significantly associated with recurrence and survival for patients treated with radiation: it is predictive of RT benefit, whereas physical RT dose is not. We propose the integration of genomics into radiation dosing decisions, using a GARD-based framework, as the new paradigm for personalizing RT prescription dose.
oncology
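As context for how GARD is computed from tumor genomics: the published GARD framework derives a patient-specific radiosensitivity α from the radiosensitivity index (RSI) via the linear-quadratic model (under the convention n = 1, d = 2 Gy, β = 0.05 Gy⁻²) and then evaluates the biological effect of the delivered schedule. A hedged sketch, with illustrative RSI values rather than real patient data:

```python
import math

BETA = 0.05  # Gy^-2, held fixed in the GARD framework

def alpha_from_rsi(rsi, n=1, d=2.0):
    """Solve RSI = exp(-n*d*alpha - n*d**2*beta) for the
    patient-specific alpha, per the published GARD derivation."""
    return -math.log(rsi) / (n * d) - BETA * d

def gard(rsi, n_fractions, dose_per_fraction):
    """GARD = n*d*(alpha + beta*d) for the delivered schedule,
    using the genomically derived alpha."""
    a = alpha_from_rsi(rsi)
    return n_fractions * dose_per_fraction * (a + BETA * dose_per_fraction)

# Two hypothetical patients given the same physical schedule (30 x 2 Gy):
# identical physical dose, very different biological effect.
gard_sensitive = gard(rsi=0.20, n_fractions=30, dose_per_fraction=2.0)
gard_resistant = gard(rsi=0.60, n_fractions=30, dose_per_fraction=2.0)
```

This is exactly the interpatient heterogeneity the abstract refers to: patients receiving the same physical dose can have widely differing GARD values.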
10.1101/2020.12.18.20248439
Spatial Allocation of Scarce Vaccine and Antivirals for COVID-19
The COVID-19 Vaccines Global Access (COVAX) initiative, led by the World Health Organization (WHO) and other partners, aims for equitable access to COVID-19 vaccines. Despite a potentially heterogeneous disease burden across space, countries receiving allotments of vaccines via COVAX may want to follow WHO's allocation rule and distribute vaccines to their jurisdictions based on each jurisdiction's relative population size. Utilizing economic-epidemiological modeling, we benchmark the performance of this ad hoc allocation rule by comparing it to the rule that minimizes the economic damages and expenditures over time, including a penalty cost representing the social costs of deviating from the ad hoc allocation. Under different levels of vaccine scarcity and different demographic characteristics, we consider scenarios in which the length of immunity and compliance with travel restrictions vary, and assess the robustness of the rules when assumptions regarding these factors are incorrect. The benefits from deviating are especially high when immunity is permanent, when there is compliance with travel restrictions, when the supply of vaccine is low, and when there is heterogeneity in demographic characteristics. Interestingly, a lack of compliance with travel restrictions pushes the optimal allocations of vaccine towards the ad hoc rule and improves its relative robustness, as the mixing of the populations reduces the spatial heterogeneity in disease burden. JEL ClassificationC61, H12, H84, I18, Q54
health economics
10.1101/2020.12.18.20248315
EBV deletions as biomarkers of response to treatment of Chronic Active Epstein-Barr Virus
Chronic active Epstein-Barr virus (CAEBV) is a rare condition occurring in previously healthy individuals, associated with persistent EBV viraemia, fever, lymphadenopathy and hepatosplenomegaly. Viral deletions have been found in CAEBV and other lymphomas. However, it is unclear how stable these deletions are, whether they are present at different sites, and how they evolve over time. We sequenced fourteen longitudinal blood samples from three European CAEBV patients and compared them with CAEBV saliva samples and other sequences from EBV-related conditions. We observed large EBV deletions in blood, but not saliva, from CAEBV patients. Deletions were stable over time but were lost following successful treatment. Our results are consistent with certain deletions in the virus from CAEBV patients being associated with the evolution and persistence of haematological clones. We propose that the loss of deletions following successful treatment should be investigated as a potential biomarker to aid CAEBV management.
infectious diseases
10.1101/2020.12.21.20248409
A national cohort study of COVID-19 in-hospital mortality in South Africa: the intersection of communicable and non-communicable chronic diseases in a high HIV prevalence setting
BackgroundThe interactions between COVID-19, non-communicable diseases, and chronic infectious diseases such as HIV and tuberculosis (TB) are unclear, particularly in low- and middle-income countries in Africa. South Africa has a national adult HIV prevalence of 19% and TB prevalence of 0.7%. Using a nationally representative hospital surveillance system in South Africa, we investigated the factors associated with in-hospital mortality among individuals with COVID-19. MethodsUsing data from national active hospital surveillance, we describe the demographic characteristics, clinical features, and in-hospital mortality among hospitalised individuals testing positive for SARS-CoV-2 during 5 March 2020 to 27 March 2021. Chained-equation multiple imputation was used to account for missing data, and random-effect multivariable logistic regression models were used to assess the role of HIV status and underlying comorbidities in in-hospital COVID-19 mortality. FindingsAmong the 219,265 individuals admitted with laboratory-confirmed SARS-CoV-2, 51,037 (23.3%) died. The most commonly observed comorbidities among individuals with available data were hypertension (61,098/163,350; 37.4%), diabetes (43,885/159,932; 27.4%), and HIV (13,793/151,779; 9.1%), while TB was reported in 3.6% (5,282/146,381) of individuals. While age was the most important predictor, other factors associated with in-hospital COVID-19 mortality were HIV infection (aOR 1.34, 95% CI: 1.27-1.43), past TB (aOR 1.26, 95% CI: 1.15-1.38), current TB (aOR 1.42, 95% CI: 1.22-1.64) and both past and current TB (aOR 1.48, 95% CI: 1.32-1.67) compared to never TB, as well as other described risk factors for COVID-19, such as male sex, non-white race, and underlying chronic hypertension, diabetes, chronic cardiac disease, chronic renal disease, and malignancy. After adjusting for other factors, PLWH not on ART (aOR 1.45, 95% CI: 1.22-1.72) were more likely to die in hospital compared to PLWH on ART.
Among PLWH, the prevalence of other comorbidities was 29.2%, compared to 30.8% among HIV-uninfected individuals. An increasing number of comorbidities was associated with increased mortality risk in both PLWH and HIV-uninfected individuals. InterpretationIdentified high-risk individuals (older individuals, those with chronic comorbidities, and PLWH, particularly those not on ART) would benefit from COVID-19 prevention programmes such as vaccine prioritisation, as well as early referral and treatment. FundingSouth African National Government Research in context: Evidence before this study: Since the emergence of the COVID-19 pandemic, studies have identified older age, male sex and the presence of underlying comorbidities, including heart disease and diabetes, as risk factors for severe disease and death. There are very few studies, however, carried out in low- and middle-income countries (LMIC) in Africa, many of which have high poverty rates, limited access to healthcare, and a high prevalence of chronic communicable diseases such as HIV and tuberculosis (TB). Data are also limited from settings with limited access to HIV treatment programmes. Early small cohort studies, mainly from high-income countries, were not conclusive on whether HIV or TB are risk factors for disease severity and death in COVID-19 patients. Large population cohort studies from South Africa's Western Cape province and the United Kingdom (UK) have found people living with HIV (PLWH) to have a moderately increased risk of COVID-19-associated mortality. Of these, only the Western Cape study presented data on the mortality risk associated with high viral load or immunosuppression, and found similar levels of severity irrespective of these factors. Recent meta-analyses have confirmed the association of HIV with COVID-19 mortality. No studies reported on the interaction between HIV infection and other non-communicable comorbidities on COVID-19-associated mortality.
We performed separate literature searches on PubMed using the following terms: "COVID-19" "risk factors" and "mortality"; "HIV" "COVID-19" and "mortality"; "TB" "COVID-19" and "mortality". All searches included publications from December 1, 2019 until May 5, 2021, without language restrictions. Pooled together, we identified 2,786 published papers. Additionally, we performed two literature searches on medRxiv using the terms "HIV" "COVID-19" and "mortality", and "TB" "COVID-19" and "mortality", from April 25, 2020 until May 5, 2021, without language restrictions. Pooled together, we identified 7,744 pre-prints. Added value of this study: Among a large national cohort of almost 220,000 individuals hospitalised with COVID-19 in a setting with 19% adult HIV prevalence and 0.7% TB prevalence, we found that, along with age, sex and other comorbidities, HIV and TB were associated with a moderately increased risk of in-hospital mortality. We found an increased risk of in-hospital mortality among PLWH not on ART compared to those on ART. Among PLWH, the prevalence of other comorbidities was high (29%), and the effect of increasing numbers of comorbidities on mortality was similar in PLWH and HIV-uninfected individuals. Our study included 13,793 PLWH from all provinces in the country, with varying levels of access to HIV treatment programmes. Implications of all the available evidence: The evidence suggests that PLWH and TB-infected individuals should be prioritised for COVID-19 prevention and treatment programmes, particularly those with additional comorbidities. Increasing age and the presence of chronic underlying illness are important additional factors associated with COVID-19 mortality in a middle-income African setting. The completeness of data is a limitation of this national surveillance system, and additional data are needed to confirm these findings.
infectious diseases
10.1101/2020.12.19.20248567
SARS-CoV-2 neutralizing antibodies: longevity, breadth, and evasion by emerging viral variants
The longevity of the SARS-CoV-2 neutralizing antibody response and its evasion by emerging viral variants are poorly understood. Antibody immunoreactivity against SARS-CoV-2 antigens and Spike variants, inhibition of Spike-driven virus-cell fusion, and infectious SARS-CoV-2 neutralization were characterized in 807 serial samples from 233 RT-PCR-confirmed COVID-19 individuals with detailed demographics, followed up to seven months. A broad and sustained polyantigenic immunoreactivity against the SARS-CoV-2 Spike, Membrane, and Nucleocapsid proteins, along with high viral neutralization, was associated with COVID-19 severity. A subgroup of high responders maintained high neutralizing responses over time, representing ideal convalescent plasma therapy donors. Antibodies generated against SARS-CoV-2 during the first COVID-19 wave had reduced immunoreactivity and neutralization potency against emerging Spike variants. Accurate monitoring of SARS-CoV-2 antibody responses will be essential for the selection of optimal plasma donors and for vaccine monitoring and design. One Sentence SummaryNeutralizing antibody responses to SARS-CoV-2 are sustained, associated with COVID-19 severity, and evaded by emerging viral variants.
infectious diseases
10.1101/2020.12.21.20248670
Large variation in the association between seasonal antibiotic use and resistance across multiple bacterial species and antibiotic classes
Understanding how antibiotic use drives resistance is crucial for guiding effective strategies to limit the spread of resistance, but the use-resistance relationship across pathogens and antibiotics remains unclear. We applied sinusoidal models to evaluate the seasonal use-resistance relationship across 3 species (Staphylococcus aureus, Escherichia coli, and Klebsiella pneumoniae) and 5 antibiotic classes (penicillins, macrolides, quinolones, tetracyclines, and nitrofurans) in Boston, Massachusetts. Use of all 5 classes and resistance in 9 of 15 species-antibiotic combinations showed statistically significant amplitudes of seasonality (false discovery rate < 0.05). While seasonal peaks in use varied by class, resistance in all 9 species-antibiotic combinations peaked in the winter and spring. The correlations between seasonal use and resistance thus varied widely, with resistance to all antibiotic classes being most positively correlated with use of the winter-peaking classes (penicillins and macrolides). These findings challenge the simple model of antibiotic use independently selecting for resistance and suggest that stewardship strategies will not be equally effective across all species and antibiotics. Rather, seasonal selection for resistance across multiple antibiotic classes may be dominated by use of the most highly prescribed antibiotic classes, penicillins and macrolides.
infectious diseases
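The sinusoidal modelling of seasonal use and resistance described above can be sketched as an ordinary least-squares fit of a 12-month harmonic, from which an amplitude and a peak month are read off. A minimal illustration on synthetic monthly data (not the study's data):

```python
import numpy as np

def fit_seasonality(monthly_values, period=12.0):
    """Fit y(t) = c + a*cos(wt) + b*sin(wt) by linear least squares;
    return (amplitude, peak_month, mean_level). The peak month is
    where the fitted sinusoid attains its maximum."""
    t = np.arange(len(monthly_values), dtype=float)
    w = 2 * np.pi / period
    X = np.column_stack([np.ones_like(t), np.cos(w * t), np.sin(w * t)])
    c, a, b = np.linalg.lstsq(X, np.asarray(monthly_values, float), rcond=None)[0]
    amplitude = np.hypot(a, b)
    phase = np.arctan2(b, a)          # maximum where w*t = phase
    peak_month = (phase / w) % period
    return amplitude, peak_month, c

# Synthetic 3-year series peaking in month 1, with a small off-cycle wobble
t = np.arange(36)
y = 10 + 3 * np.cos(2 * np.pi * (t - 1) / 12) + 0.1 * np.sin(0.7 * t)
amp, peak, level = fit_seasonality(y)
```

Comparing the fitted phases of use and resistance series, as the study does, then quantifies the lag between seasonal prescribing peaks and resistance peaks.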
10.1101/2020.12.22.20248707
How to coordinate vaccination and social distancing to mitigate SARS-CoV-2 outbreaks
Most countries have started vaccinating people against COVID-19. However, due to limited production capacities and logistical challenges, it will take months or years until herd immunity is achieved. Therefore, vaccination and social distancing have to be coordinated. In this paper, we provide some insight into this topic using optimization-based control on an age-differentiated compartmental model. For real-life decision making, we investigate the impact of the planning horizon on the optimal vaccination/social distancing strategy. We find that in order to reduce social distancing in the long run without overburdening the healthcare system, it is essential to vaccinate the people with the highest contact rates first. That is also the case if the objective is to minimize fatalities, provided that the social distancing measures are sufficiently strict. For short-term planning, however, it is optimal to focus on the high-risk group.
epidemiology
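The qualitative finding above, that vaccinating the highest-contact group first reduces the burden under a limited supply, can be reproduced in a toy two-group SIR model with a fixed daily dose budget. All rates, group sizes, and the dose budget below are illustrative assumptions, not the paper's calibrated model:

```python
import numpy as np

def simulate(beta, vacc_order, doses_per_day, days=300, gamma=0.1, i0=1e-4):
    """Toy two-group SIR (fractions of the population) with a fixed
    daily vaccine supply; groups are vaccinated to completion in the
    order given by `vacc_order`. Returns cumulative infections."""
    N = np.array([0.6, 0.4])              # group sizes (assumed)
    S, I = N - i0, np.full(2, i0)
    total_infections = 0.0
    for _ in range(days):
        doses = doses_per_day             # allocate today's supply
        for g in vacc_order:
            used = min(doses, S[g])
            S[g] -= used
            doses -= used
        new_inf = S * (beta @ I)          # daily new infections per group
        recoveries = gamma * I
        S -= new_inf
        I += new_inf - recoveries
        total_infections += new_inf.sum()
    return total_infections

# Group 0 is high-contact, group 1 low-contact (illustrative rates)
beta = np.array([[0.25, 0.1], [0.1, 0.2]])
high_contact_first = simulate(beta, vacc_order=[0, 1], doses_per_day=0.01)
high_contact_last = simulate(beta, vacc_order=[1, 0], doses_per_day=0.01)
```

Because only the high-contact group sustains transmission in this toy setting, delaying its vaccination lets the outbreak grow for longer, so `high_contact_first` ends up smaller.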
10.1101/2020.12.18.20248479
Population Changes in Seroprevalence among a Statewide Sample in the United States
Antibody surveillance provides essential information for public health officials to work with communities to discuss the spread and impact of COVID-19. At the start of the severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) pandemic in the United States, diagnostic testing was limited, leaving many asymptomatic and thus undetected cases. Irrespective of symptom severity, antibodies develop within two to three weeks after exposure and may persist six months or more. Thus, antibody surveillance is an important tool for tracking trends in past infections across diverse populations. This study includes adults and children (≥12 years old) recruited from a statewide sample of past 2014-2020 Survey of the Health of Wisconsin (SHOW) participants. SHOW, an ongoing population-based health examination study including a randomly selected sample of households, partnered with the Wisconsin Department of Health Services and the Wisconsin State Laboratory of Hygiene to conduct longitudinal antibody surveillance using the Abbott Architect SARS-CoV-2 IgG antibody test, which detects antibodies against the nucleocapsid protein. Three WAVES of sample collection were completed in 2020-2021, tracking mid-summer, late-fall, and early-spring COVID-19 trends prior to vaccine availability. Crude estimates of seroprevalence in the total study population increased ten-fold, from 1.4% during WAVE I to 11.5% in WAVE III. Within the statewide probability sample, weighted estimates increased from 1.6% (95% CI: 0.6-2.5%) to 6.8% (95% CI: 4.3-9.4%) in WAVE II and to 11.4% (95% CI: 8.2-14.6%) in WAVE III. Longitudinal trends in seroprevalence match statewide case counts. Local seroprevalence varied by state health region, with prevalence increasing among higher-income groups (>200% poverty-income ratio) and rural health regions of the state seeing the highest increase in COVID-19 prevalence over time.
Significant disparities in prevalence by racial and ethnic group also exist, with more than twice the seroprevalence among Latino and Black participants compared to non-Hispanic whites. This public health and academic partnership provides critical data for the ongoing pandemic response and lays the foundation for future research into longer-term immunity, health impacts and population-level disparities.
epidemiology
10.1101/2020.12.17.20248426
Source-level EEG and graph theory reveal widespread functional network alterations in focal epilepsy
ObjectiveThe hypersynchronous neuronal activity associated with epilepsy causes widespread functional network disruptions extending beyond the epileptogenic zone. This altered functional network topology is considered a mediator from which non-seizure symptoms, such as cognitive impairment, arise. The aim of the present study was to demonstrate the presence of functional network alterations in focal epilepsy patients with good seizure control and high quality of life. MethodsWe compared twenty-two focal epilepsy patients and sixteen healthy controls on graph metrics derived from functional connectivity (phase-locking value) of source-reconstructed resting-state EEG. Graph metrics were calculated over a predefined range of network densities in five frequency bands. ResultsIn terms of global network topology alterations, we observed a significantly increased small-world index in epilepsy patients relative to healthy controls. On the local level, two left-hemisphere regions displayed a shift towards greater alpha-band "hubness". ConclusionsSubtle widespread functional network alterations are evident in focal epilepsy, even in a cohort characterised by successful anti-seizure medication therapy and high quality of life. These findings suggest a possible clinical relevance of functional network analysis in epilepsy. SignificanceFocal epilepsy is accompanied by global and local functional network aberrations which may be implicated in sustaining non-seizure symptoms. Highlights: (1) Focal epilepsies are associated with widespread interictal functional network alterations, extending beyond the epilepsy focus. (2) Global and local graph theoretical analyses of source-space EEG functional connectivity networks capture these network changes, and might thus be of clinical relevance. (3) Group-level differences in network metrics are relatively stable across network analysis parameters.
neurology
10.1101/2020.12.16.20246710
A demonstration of cone function plasticity after gene therapy in achromatopsia
Recent advances in regenerative therapy have placed the treatment of many previously incurable eye diseases within arm's reach (Ciulla et al., 2020). Achromatopsia (ACHM) is a severe monogenic heritable retinal disease that disrupts cone function from gestation, leaving patients with complete colour blindness, low acuity, photosensitivity, and nystagmus (Hirji, Aboshiha, et al., 2018). In non-primate animal models of ACHM, retinal gene-replacement therapy has successfully induced cone function in the young (Alexander et al., 2007; Carvalho et al., 2011), but it remained to be determined if and when these therapies could effectively impact cone-mediated pathways in the human brain. Here we demonstrate in children with ACHM that gene therapy can yield substantial improvement in cone-mediated vision, via cascading effects on signal transmission from retina to cortex. To measure the effects of treatment in children with ACHM (CNGA3- and CNGB3-associated, all aged 10+ years), we developed novel visual stimuli, calibrated to selectively activate cone photoreceptors. We used these in behavioural psychophysics and functional MRI with population receptive field mapping, pre- and post-treatment. The results of treatment, contextualized against data from 12 untreated ACHM patients and 25 normal-sighted controls, revealed that six months post-therapy, two patients displayed novel responses to our cone-selective stimuli in the visual cortex, with a retinotopic organisation characteristic of normal-sighted individuals and not present in untreated ACHM. This was paired with significant improvement in cone-mediated perception specific to the treated eye, and self-reports of improved vision. Two other patients did not show a post-treatment effect, potentially reflecting individual differences in therapeutic outcome. Together, these data show that gene replacement therapy in humans with ACHM can activate dormant cone pathways despite long-term deprivation.
This offers great promise for regenerative therapies, and their ability to trigger the neural plasticity needed to cure congenital vision loss in human patients.
ophthalmology
10.1101/2020.12.22.20248691
Automated processing of thermal imaging to detect COVID-19
Rapid and sensitive screening tools for SARS-CoV-2 infection are essential to limit the spread of COVID-19 and to properly allocate national resources. Here, we developed a new point-of-care, non-contact thermal imaging tool to detect COVID-19, based on image-processing algorithms and machine learning analysis. We captured thermal images of the back of individuals with and without COVID-19 using a portable thermal camera that connects directly to smartphones. Our novel image-processing algorithms automatically extracted multiple texture and shape features of the thermal images and achieved an area under the curve (AUC) of 0.85 in detecting COVID-19, with up to 92% sensitivity. Thermal imaging scores were inversely correlated with clinical variables associated with COVID-19 disease progression. We show, for the first time, that a hand-held thermal imaging device can be used to detect COVID-19. Non-invasive thermal imaging could be used to screen for COVID-19 in out-of-hospital settings, especially in low-income regions with limited imaging resources. Highlights: (1) Automated processing of thermal images of the back can be used to detect COVID-19 with up to 92% sensitivity. (2) The extracted texture features of the thermal image are associated with COVID-19 disease progression and lung injury. (3) A portable thermal camera that connects directly to smartphones can be used to detect COVID-19. (4) Non-invasive thermal imaging could be used to screen for COVID-19 in out-of-hospital settings and regions with limited imaging resources. [Graphical abstract: figure]
radiology and imaging
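The reported AUC of 0.85 summarizes ranking performance: AUC equals the Mann-Whitney probability that a randomly chosen positive case scores above a randomly chosen negative one (ties counting one half). A minimal sketch on illustrative scores:

```python
def roc_auc(scores_pos, scores_neg):
    """AUC via the Mann-Whitney U statistic: the fraction of
    (positive, negative) pairs where the positive scores higher,
    counting ties as 1/2."""
    wins = sum((p > n) + 0.5 * (p == n)
               for p in scores_pos for n in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))

# Three COVID-positive and three COVID-negative classifier scores (made up)
auc = roc_auc([0.9, 0.8, 0.4], [0.7, 0.3, 0.2])  # 8 of 9 pairs ranked correctly
```

The pairwise definition makes clear that AUC is threshold-free; the 92% sensitivity figure, by contrast, corresponds to one chosen operating point on the ROC curve.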
10.1101/2020.12.22.20248270
Building a Best-in-Class De-identification Tool for Electronic Medical Records Through Ensemble Learning
The natural language portions of electronic health records (EHRs) communicate critical information about disease and treatment progression. However, the presence of personally identifiable information (PII) in this data constrains its broad reuse. Despite continuous improvements in methods for the automated detection of PII, the presence of residual identifiers in clinical notes requires manual validation and correction. However, manual intervention is not a scalable solution for large EHR datasets. Here, we describe an automated de-identification system that employs an ensemble architecture, incorporating attention-based deep learning models and rule-based methods, supported by heuristics for detecting PII in EHR data. Upon detection of PII, the system transforms these detected identifiers into plausible, though fictional, surrogates to further obfuscate any leaked identifier. We evaluated the system with a publicly available dataset of 515 notes from the I2B2 2014 de-identification challenge and a dataset of 10,000 notes from the Mayo Clinic. In comparison with other existing tools considered best-in-class, our approach outperforms them with a recall of 0.992 and 0.994 and a precision of 0.979 and 0.967 on the I2B2 and the Mayo Clinic data, respectively. The automated de-identification system presented here can enable the generation of de-identified patient data at the scale required for modern machine learning applications to help accelerate medical discoveries.
health informatics
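The rule-based layer of such a de-identification ensemble can be sketched with regular expressions that detect identifiers and swap in plausible fictional surrogates, the obfuscation strategy the abstract describes. The patterns and surrogate values below are illustrative assumptions, not the authors' system:

```python
import re

# Illustrative rule-based PII patterns; a real system layers these
# under learned models in an ensemble.
PATTERNS = {
    "PHONE": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
    "DATE": re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"),
    "MRN": re.compile(r"\bMRN[:#]?\s*\d{6,10}\b"),
}

# Fixed fictional surrogates (hypothetical values)
SURROGATES = {"PHONE": "555-010-4832", "DATE": "3/14/1998", "MRN": "MRN 4821937"}

def deidentify(note):
    """Replace detected PII with fictional surrogates so that any
    residual leaked identifier is harder to distinguish from noise."""
    for label, pattern in PATTERNS.items():
        note = pattern.sub(SURROGATES[label], note)
    return note

clean = deidentify("Seen 12/05/2020, call 617-555-0199, MRN: 84521097.")
```

Surrogate substitution (rather than redaction with a placeholder) is what makes the abstract's "plausible, though fictional" defense work: a reader cannot tell a missed identifier from an inserted one.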
10.1101/2020.12.22.20244061
Predicting the Evolution of COVID-19 Mortality Risk: a Recurrent Neural Network Approach
BackgroundThe propagation of COVID-19 in Spain prompted the declaration of the state of alarm on March 14, 2020. By December 2, 2020, the infection had been confirmed in 1,665,775 patients and had caused 45,784 deaths. This unprecedented health crisis challenged the ingenuity of all professionals involved. Decision support systems for clinical care and health services management were identified as crucial in the fight against the pandemic. MethodsThis study applies deep learning techniques to the mortality prediction of COVID-19 patients. Two datasets with clinical information (medication, laboratory tests, vital signs, etc.) were used, comprising 2,307 and 3,870 COVID-19-infected patients admitted to two Spanish hospital chains. First, we built a sequence of temporal events gathering all the clinical information for each patient. Next, we used these temporal sequences to train a Recurrent Neural Network (RNN) model with an attention mechanism to explore interpretability. We conducted extensive experiments and trained the RNNs in different settings, performing hyperparameter search and cross-validation. We ensembled the resulting RNNs to reduce variability and enhance sensitivity. ResultsWe assessed the performance of our models using global metrics, averaging performance across all days in the sequences. We also measured day-by-day metrics starting from the day of hospital admission and the outcome day, and evaluated the daily predictions. Regarding sensitivity, our best two RNN ensemble models outperform a Support Vector Classifier by 6 and 16 percentage points, and a Random Forest by 23 and 18 points. For the day-by-day predictions from the outcome date, the models also achieved better results than the baselines, demonstrating their ability to make early predictions.
Conclusions: We have shown the feasibility of our approach to predicting the clinical outcome of patients infected with SARS-CoV-2. The result is a time-series model that can support decision-making in healthcare systems and aims at interpretability. The system is robust enough to deal with real-world data and is able to overcome the problems derived from the sparsity and heterogeneity of the data. In addition, the approach was validated using two datasets showing substantial differences. This not only validates the robustness of the proposal but also meets the requirements of a real scenario, where interoperability between hospital datasets is difficult to achieve.
infectious diseases
10.1101/2020.12.22.20248698
MODELLING THE POTENTIAL ROLE OF MEDIA CAMPAIGNS ON THE CONTROL OF LISTERIOSIS
Human Listeria infection is a food-borne disease caused by the consumption of food products contaminated with the bacterial pathogen Listeria. In this paper, we propose a mathematical model to analyze the impact of media campaigns on the spread and control of Listeriosis. The model exhibits three equilibria, namely the disease-free, Listeria-free and endemic equilibria. The food contamination threshold (Rf) is determined and the local stability analyses of the model are discussed. Sensitivity analysis is performed to determine the model parameters that most affect the severity of the disease. Numerical simulations were carried out to assess the role of media campaigns in the spread of Listeriosis. The results show that an increase in the intensity of media awareness campaigns, an increase in the removal rate of contaminated food products, and a decrease in the rate of human contact with Listeria all result in fewer humans becoming infected, thus leading to disease eradication. An increase in the depletion of media awareness campaigns results in more humans being infected with Listeriosis. These findings may significantly impact policy and decision making in the control of Listeriosis.
epidemiology
10.1101/2020.12.23.20248612
Coronavirus GenBrowser for monitoring the transmission and evolution of SARS-CoV-2
Genomic epidemiology is important for studying the COVID-19 pandemic, and more than two million SARS-CoV-2 genomic sequences have been deposited into public databases. However, the exponential increase of sequences poses unprecedented bioinformatic challenges. Here, we present the Coronavirus GenBrowser (CGB), based on a highly efficient analysis framework and a movie-maker strategy. In total, 1,002,739 high-quality genomic sequences with transmission-related metadata were analyzed and visualized. The size of the core data file is only 12.20 MB, making clean-data sharing efficient. Quick visualization modules and rich interactive operations are provided to explore the annotated SARS-CoV-2 evolutionary tree. A CGB binary nomenclature is proposed to name each internal lineage. The pre-analyzed data can be filtered according to user-defined criteria to explore the transmission of SARS-CoV-2. Different evolutionary analyses can also be easily performed, such as the detection of accelerated evolution and ongoing positive selection. Moreover, 75 genomic spots conserved in SARS-CoV-2 but non-conserved in other coronaviruses were identified, which may indicate functional elements specifically important for SARS-CoV-2. The CGB not only enables users without programming skills to analyze millions of genomic sequences, but also offers a panoramic view of the transmission and evolution of SARS-CoV-2.
genetic and genomic medicine
10.1101/2020.12.23.20248757
The spatial spread of HIV in Malawi: An individual-based mathematical model
The prevalence of HIV varies greatly between and within countries. We therefore developed a flexible individual-based mathematical model for HIV transmission that comprises a spatial representation and individual-level determinants. We tested this model by calibrating it to the HIV epidemic in Malawi and exploring whether the heterogeneity in HIV prevalence could be reproduced without accounting for heterogeneity in behaviour. We ran the model for Malawi for the years 1975-2030 with five alternative realizations of the geographical structure and mobility: (I) no geographical structure; 28 administrative districts with (II) only permanent relocations between districts, (III) permanent relocations and between-district casual sexual relationships, or (IV) permanent relocations between districts and to/from abroad plus between-district casual sex; and (V) a grid of 10x10 km2 cells, with permanent relocations and between-cell casual relationships. We assumed HIV was present in 1975 in the districts with >10% prevalence in 2010. We calibrated the models to national and district-level prevalence estimates. Reaching the national prevalence required all adults to have at least 20 casual sex acts/year until 1990. Models II, III and V reproduced the geographical heterogeneity in prevalence to some extent if between-district relationships were either excluded (Model II) or restricted to a minimum (Models III, V). Long-distance casual partnership mixing (Models III-V) mitigated the differences in prevalence substantially; with international migration the differences disappeared completely (Model IV). National prevalence was projected to decrease to 4-5% by 2030. Our model sustained the major differences in HIV prevalence across Malawi if casual relationships between districts were kept at a sufficiently low level. An earlier introduction of HIV into the southern part of Malawi may thus be one of the explanations for the present heterogeneity in HIV prevalence. 
Author summary: The prevalence of HIV varies greatly across settings, both globally and within countries. The ability of the commonly used compartmental models to account for the geographical structure and individual-level determinants that cause this heterogeneity is limited. In this project, we developed an individual-based simulation framework for modelling HIV transmission in a real setting. We built the model to take into account an unlimited number of individual-level characteristics, and a geographical representation of the setting that can be defined using an arbitrary resolution and distance matrices. We demonstrate the use of this model by simulating the HIV epidemic of Malawi in 1975-2030 and exploring whether the observed heterogeneity could be reproduced without taking into account any spatial heterogeneity in sexual behaviour. A relatively simple version of the model reproduced the broad-scale differences in HIV prevalence, but the detailed differences will need further investigation.
hiv aids
10.1101/2020.12.22.20248755
Response of human liver tissue to innate immune stimuli
Precision-cut human liver slice cultures (PCLS) have become an important alternative immunological platform in preclinical testing. To further evaluate the capacity of PCLS, we investigated the innate immune response to a TLR3 agonist (poly-I:C) and a TLR4 agonist (LPS) in normal and pathological liver tissue. Pathological liver tissue was obtained from patients with active chronic HCV infection and from patients with former chronic HCV infection cured by recent Direct-Acting Antiviral (DAA) drug therapy. We found that hepatic innate immunity was not suppressed but enhanced in the HCV-infected tissue compared with the healthy controls. Furthermore, despite recent HCV elimination, DAA-cured liver tissue manifested ongoing abnormalities in liver immunity. Sustained abnormal immune gene expression in DAA-cured samples was identified from direct ex vivo measurements and with TLR3 and TLR4 stimulation assays. The genes that were up-regulated in chronic HCV-infected liver tissue were enriched in expression in the liver non-parenchymal cell compartment. These results demonstrate the utility of PCLS in studying liver pathology and innate immunity.
allergy and immunology
10.1101/2020.12.22.20246868
Effective design of barrier enclosure to contain aerosol emissions from COVID-19 patients
Facing shortages of personal protective equipment, some clinicians have advocated the use of barrier enclosures (typically mounted over the head, with and without suction) to contain aerosol emissions from coronavirus disease 2019 (COVID-19) patients. There is however little evidence for its usefulness. To test the effectiveness of such a device, we built a manikin that can expire micron-sized aerosols at flow rates close to physiological conditions. We then placed the manikin inside the enclosure and used a laser sheet to visualize the aerosol leaking out. We show that with sufficient suction, it is possible to effectively contain aerosol from the manikin even at high flow rates (up to 60 L min-1) of oxygen, reducing aerosol exposure outside the enclosure by 99%. In contrast, a passive barrier without suction only reduces aerosol exposure by 60%.
intensive care and critical care medicine
10.1101/2020.12.22.20244806
Aesthetic evaluation of the need for orthodontic treatment / Perception among university students
Introduction: Aesthetics is a relevant part of procedures in healthcare, often influencing treatment planning in tandem with healthy function. Orthodontic treatment (OT) is one of many solutions and is sometimes sought purely for aesthetic reasons. In 1989, Brook and Shaw proposed the Index of Orthodontic Treatment Need (IOTN), which has been widely used. This study aims to verify the main motivations of university students to seek OT and, based on the Aesthetic Component of the IOTN, to weigh the influence of aesthetics in seeking it. We compared the opinions of students from various areas of study (Dentistry; Science and Nature, ScN; Arts and Humanity, AtH) at the beginning (Initiated Students) and end (Advanced Students) of their degrees; the same question was also analysed taking into account their nationalities and training schools. Materials and Methods: In a collaboration between the University of Porto (Portugal) and the University of Medicine and Pharmacy of Cluj-Napoca (Romania), a sample of 1071 individuals was gathered. Participants responded to an online survey, based on IOTN pictures, about what would motivate them to seek OT. The ratings were analysed using the t-test with an alpha error of 5%. Results: Dentistry students overall registered a higher Oral Esthetical Sensibility (OES) than ScN and AtH students overall. All groups, except Dentistry Advanced Students, registered a higher OES for Self-Perception than for Perception of Others. Discussion: Among other factors, the IOTN pictures used, taken in 1989, may have influenced the participants' responses. However, the IOTN is the most widely used index, and it is validated. Conclusion: For the studied populations, the main motivations for seeking OT are, in order: functional reasons, doctor's advice, and aesthetic reasons. OES is influenced by Dentistry studies, specifically in Advanced Students. 
OES is influenced neither by the students' country of origin nor by the country where they are graduating.
dentistry and oral medicine
10.1101/2020.12.22.20248150
Comparing non-communicable disease risk factors between Asian migrants and native Koreans: an observational study
Importance: Regarding international migrants, the theories of a healthy migration effect and a sick migration effect both exist; thus, assessing the health of international migrants is crucial in the Republic of Korea, in Asia, and even worldwide. Objective: To compare non-communicable disease risk factors among Asian migrants in Korea and the Korean population. Design: A cross-sectional (2015) and longitudinal (2009-2015) observational study. Setting: Population-wide analysis using the National Health Information Database of the Korean National Health Insurance Service for 2009-2015. Participants: Asian migrants (n=987,214) in Korea and Korean nationals (n=1,693,281) aged ≥20 years were included. In addition, Asian migrants were divided into Chinese, Japanese, Filipino, Vietnamese, and other Asian migrants. Exposure: The nationality of Asian migrants compared with Koreans. Main Outcomes and Measures: The prevalence of non-communicable disease risk factors, such as current smoking, obesity, diabetes, and hypertension, in 2015 was analyzed. Regarding the age-adjusted prevalence, direct age standardization was conducted separately by sex using 10-year age bands; the World Standard Population was used as the standard population. Results: Among participants aged ≥20 years, the age-adjusted prevalence of current smoking was higher among Chinese migrant men than among Korean men (P<0.001) and among other Asian migrant women than among Korean women (P<0.001). The age-adjusted prevalence of obesity was higher in Chinese, Filipino, and other Asian migrant women than in Korean women (P<0.001, P=0.002, and P<0.001, respectively). Among participants aged 20-49 years, the age-adjusted prevalence of diabetes and hypertension was higher in Filipino migrant women than in Korean women (P=0.009 and P<0.001, respectively). Conclusion and Relevance: The current status of smoking and obesity among Asian migrants of specific nationalities is worse than that among native Koreans. 
Moreover, the health inequalities among Filipino migrant women in Korea, especially those aged 20-49 years, should be addressed. Key Points. Question: Do international migrants in Korea have a health advantage regarding non-communicable disease risk factors? Findings: Among participants aged ≥20 years, the problem of current smoking among Chinese migrant men and other Asian migrant women, and that of obesity among Chinese, Filipino, and other Asian migrant women in Korea, needs to be addressed. The prevalence of diabetes and hypertension was higher among Filipino migrant women than among Korean women in the 20-49-year age group. Meaning: The relationships between Asian migrant nationality and non-communicable disease risk factors provide evidence for targeting high-risk groups and improving policy development in Korea.
public and global health
10.1101/2020.12.22.20248150
Comparing non-communicable disease risk factors between Asian migrants and native Koreans: an observational study
ImportanceRegarding international migrants, the theories of healthy migration effect and sick migration effect both exist; thus, assessing the health of international migrants is crucial in the Republic of Korea, Asia, and even worldwide. ObjectiveTo compare non-communicable disease risk factors among Asian migrants in Korea and the Korean population. DesignA cross-sectional (2015) and longitudinal (2009{square}2015) observational study. SettingPopulation-wide analysis using the National Health Information Database of the Korean National Health Insurance Service for 2009{square}2015. ParticipantsAsian migrants (n=987,214) in Korea and Korean nationals (n=1,693,281) aged [&ge;]20 years were included. In addition, Asian migrants were divided into Chinese, Japanese, Filipino, Vietnamese, and other Asian migrants. ExposureThe nationality of Asian migrants compared with Koreans. Main Outcomes and MeasuresThe prevalence of non-communicable disease risk factors, such as current smoking, obesity, diabetes, and hypertension, in 2015 was analyzed. Regarding the age-adjusted prevalence, direct age standardization was conducted separately by sex using 10-year age bands; the World Standard Population was used as the standard population. ResultsAmong participants aged [&ge;]20 years, the age-adjusted prevalence of current smoking was higher among Chinese migrant men than among Korean men (P<0.001) and among other Asian migrant women than among Korean women (P<0.001). The age-adjusted prevalence of obesity was higher in Chinese, Filipino, and other Asian migrant women than in Korean women (P<0.001, P=0.002, and P<0.001, respectively). Among participants aged 20-49 years, the age-adjusted prevalence of diabetes and hypertension was higher in Filipino migrant women than in Korean women (P=0.009 and P<0.001, respectively). Conclusion and RelevanceThe current status of smoking and obesity among Asian migrants of specific nationalities is worse than that among native Koreans. 
Moreover, the health inequalities among Filipino migrant women in Korea, especially those aged 20-49 years, should be addressed. Key PointsO_ST_ABSQuestionC_ST_ABSDo international migrants in Korea have a health advantage regarding non-communicable disease risk factors? FindingsAmong participants aged [&ge;]20 years, the problem of current smoking among Chinese migrant men and other Asian migrant women and that of obesity among Chinese, Filipino, and other Asian migrant women in Korea needs to be addressed. The prevalence of diabetes and hypertension was higher among Filipino migrant women than among Korean women in the 20-49-year age group. MeaningThe relationships between Asian migrant nationality and non-communicable disease risk factors provide evidence for targeting high-risk groups and improving policy development in Korea.
public and global health
10.1101/2020.12.22.20248150
Comparing non-communicable disease risk factors between Asian migrants and native Koreans: an observational study
ImportanceRegarding international migrants, the theories of healthy migration effect and sick migration effect both exist; thus, assessing the health of international migrants is crucial in the Republic of Korea, Asia, and even worldwide. ObjectiveTo compare non-communicable disease risk factors among Asian migrants in Korea and the Korean population. DesignA cross-sectional (2015) and longitudinal (2009{square}2015) observational study. SettingPopulation-wide analysis using the National Health Information Database of the Korean National Health Insurance Service for 2009{square}2015. ParticipantsAsian migrants (n=987,214) in Korea and Korean nationals (n=1,693,281) aged [&ge;]20 years were included. In addition, Asian migrants were divided into Chinese, Japanese, Filipino, Vietnamese, and other Asian migrants. ExposureThe nationality of Asian migrants compared with Koreans. Main Outcomes and MeasuresThe prevalence of non-communicable disease risk factors, such as current smoking, obesity, diabetes, and hypertension, in 2015 was analyzed. Regarding the age-adjusted prevalence, direct age standardization was conducted separately by sex using 10-year age bands; the World Standard Population was used as the standard population. ResultsAmong participants aged [&ge;]20 years, the age-adjusted prevalence of current smoking was higher among Chinese migrant men than among Korean men (P<0.001) and among other Asian migrant women than among Korean women (P<0.001). The age-adjusted prevalence of obesity was higher in Chinese, Filipino, and other Asian migrant women than in Korean women (P<0.001, P=0.002, and P<0.001, respectively). Among participants aged 20-49 years, the age-adjusted prevalence of diabetes and hypertension was higher in Filipino migrant women than in Korean women (P=0.009 and P<0.001, respectively). Conclusion and RelevanceThe current status of smoking and obesity among Asian migrants of specific nationalities is worse than that among native Koreans. 
Moreover, the health inequalities among Filipino migrant women in Korea, especially those aged 20-49 years, should be addressed. Key PointsO_ST_ABSQuestionC_ST_ABSDo international migrants in Korea have a health advantage regarding non-communicable disease risk factors? FindingsAmong participants aged [&ge;]20 years, the problem of current smoking among Chinese migrant men and other Asian migrant women and that of obesity among Chinese, Filipino, and other Asian migrant women in Korea needs to be addressed. The prevalence of diabetes and hypertension was higher among Filipino migrant women than among Korean women in the 20-49-year age group. MeaningThe relationships between Asian migrant nationality and non-communicable disease risk factors provide evidence for targeting high-risk groups and improving policy development in Korea.
public and global health
10.1101/2020.12.22.20248150
Comparing non-communicable disease risk factors between Asian migrants and native Koreans: an observational study
ImportanceRegarding international migrants, the theories of healthy migration effect and sick migration effect both exist; thus, assessing the health of international migrants is crucial in the Republic of Korea, Asia, and even worldwide. ObjectiveTo compare non-communicable disease risk factors among Asian migrants in Korea and the Korean population. DesignA cross-sectional (2015) and longitudinal (2009{square}2015) observational study. SettingPopulation-wide analysis using the National Health Information Database of the Korean National Health Insurance Service for 2009{square}2015. ParticipantsAsian migrants (n=987,214) in Korea and Korean nationals (n=1,693,281) aged [&ge;]20 years were included. In addition, Asian migrants were divided into Chinese, Japanese, Filipino, Vietnamese, and other Asian migrants. ExposureThe nationality of Asian migrants compared with Koreans. Main Outcomes and MeasuresThe prevalence of non-communicable disease risk factors, such as current smoking, obesity, diabetes, and hypertension, in 2015 was analyzed. Regarding the age-adjusted prevalence, direct age standardization was conducted separately by sex using 10-year age bands; the World Standard Population was used as the standard population. ResultsAmong participants aged [&ge;]20 years, the age-adjusted prevalence of current smoking was higher among Chinese migrant men than among Korean men (P<0.001) and among other Asian migrant women than among Korean women (P<0.001). The age-adjusted prevalence of obesity was higher in Chinese, Filipino, and other Asian migrant women than in Korean women (P<0.001, P=0.002, and P<0.001, respectively). Among participants aged 20-49 years, the age-adjusted prevalence of diabetes and hypertension was higher in Filipino migrant women than in Korean women (P=0.009 and P<0.001, respectively). Conclusion and RelevanceThe current status of smoking and obesity among Asian migrants of specific nationalities is worse than that among native Koreans. 
Moreover, the health inequalities among Filipino migrant women in Korea, especially those aged 20-49 years, should be addressed. Key PointsO_ST_ABSQuestionC_ST_ABSDo international migrants in Korea have a health advantage regarding non-communicable disease risk factors? FindingsAmong participants aged [&ge;]20 years, the problem of current smoking among Chinese migrant men and other Asian migrant women and that of obesity among Chinese, Filipino, and other Asian migrant women in Korea needs to be addressed. The prevalence of diabetes and hypertension was higher among Filipino migrant women than among Korean women in the 20-49-year age group. MeaningThe relationships between Asian migrant nationality and non-communicable disease risk factors provide evidence for targeting high-risk groups and improving policy development in Korea.
public and global health
10.1101/2020.12.23.20248763
Spotlight on the dark figure: Exhibiting dynamics in the case detection ratio of COVID-19 infections in Germany
The case detection ratio of COVID-19 infections varies over time due to changing testing capacities, modified testing strategies and also, apparently, due to the dynamics in the number of infections itself. In this paper we investigate these dynamics by jointly looking at the reported numbers of detected COVID-19 infections with non-fatal and fatal outcomes in different age groups in Germany. We propose a statistical approach that allows us to spotlight the case detection ratio and quantify its changes over time. With this we can adjust the case counts reported at different time points so that they become comparable. Moreover, we can explore the temporal development of the real number of infections, shedding light on the dark figure. The results show that the case detection ratio has increased and, depending on the age group, is four to six times higher at the beginning of the second wave than it was at the peak of the first wave. The true number of infections in Germany in October was considerably lower than during the peak of the first wave, when only a small fraction of COVID-19 infections was detected. Our modelling approach also allows quantifying the effects of different testing strategies on the case detection ratio. The analysis of the dynamics in the case detection ratio and in the true infection figures enables a clearer picture of the course of the COVID-19 pandemic.
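The basic accounting behind a time-varying case detection ratio (CDR) can be sketched as follows. This is an illustrative back-calculation, not the paper's statistical model, and all numbers are made up (chosen only to be consistent with the four-to-six-fold range described above):

```python
# Illustrative sketch: reported = true * CDR, so true = reported / CDR.
# A higher CDR in a later period can make reported counts rise even
# while true infections fall.

def true_infections(reported_cases, detection_ratio):
    """Back-calculate true infections from reported cases and the CDR."""
    return reported_cases / detection_ratio

# Hypothetical numbers: first-wave peak with CDR 0.10 vs autumn with a
# five-fold higher CDR of 0.50.
wave1_true = true_infections(6000, 0.10)    # 60,000.0 true infections
autumn_true = true_infections(15000, 0.50)  # 30,000.0 true infections
```

Despite more reported cases in the later period, the implied true number of infections is lower, which is the qualitative pattern the abstract describes for October versus the first-wave peak.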
epidemiology
10.1101/2020.12.22.20248716
The main factors influencing COVID-19 spread and deaths in Mexico: A comparison between Phases I and II
This article investigates the geographical spread of confirmed COVID-19 cases and deaths across municipalities in Mexico. It focuses on the spread dynamics and containment of the virus between Phase I (March 23 to May 31, 2020) and Phase II (June 1 to August 22, 2020) of the social distancing measures. It also examines municipal-level factors associated with cumulative COVID-19 cases and deaths to understand the spatial determinants of the pandemic. Analysis of the geographic pattern of the pandemic via spatial scan statistics revealed a fast spread among municipalities. During Phase I, clusters of infections and deaths were mainly located at the country's center, whereas in Phase II these clusters dispersed to the rest of the country. Results from the zero-inflated negative binomial regression analysis suggested that income inequality, the prevalence of obesity and diabetes, and the concentration of fine particulate matter (PM2.5) are strongly positively associated with confirmed cases and deaths regardless of lockdown.
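A zero-inflated negative binomial model, as used above, mixes a point mass at zero (municipalities with structurally no recorded cases) with an ordinary negative binomial count distribution. A minimal sketch of its probability mass function, using an integer size parameter for simplicity and purely illustrative parameter values:

```python
import math

def nb_pmf(y, mu, r):
    """Negative binomial pmf with mean mu and (integer) size r."""
    p = r / (r + mu)
    return math.comb(y + r - 1, y) * (p ** r) * ((1 - p) ** y)

def zinb_pmf(y, pi, mu, r):
    """Zero-inflated NB: structural zeros occur with probability pi."""
    base = (1 - pi) * nb_pmf(y, mu, r)
    return pi + base if y == 0 else base

# The pmf sums to 1 over the support (checked over a long truncation).
total = sum(zinb_pmf(y, pi=0.2, mu=2.0, r=3) for y in range(200))
```

In a full regression, mu (and optionally pi) would be linked to municipal covariates such as income inequality and PM2.5; the sketch only shows the distributional building block.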
epidemiology
10.1101/2020.12.22.20248737
Improving on estimates of the potential relative harm to health from using modern ENDS (vaping) compared to tobacco smoking
Background: Although the harm to health from electronic nicotine delivery systems (ENDS) compared to smoked tobacco remains highly uncertain, society and governments still need to know the likely range of the relative harm to inform regulatory policies for ENDS and smoking. Methods: We identified biomarkers with specificity of association with different disease groupings, e.g., volatile organic compounds (VOCs) for chronic obstructive pulmonary disease, and tobacco-specific N-nitrosamines (TSNAs) and polycyclic aromatic hydrocarbons (PAHs) for all cancers. We conducted a review of recent studies (post-January 2017) that compared these biomarkers between people exclusively using ENDS and those exclusively smoking tobacco. The percentage differences in these biomarkers, weighted by study size and adjusted for acrolein from other sources, were used as a proxy for the assumed percentage difference in disease harm between ENDS and smoking. These relative differences were applied to previously modelled estimates of smoking-related health loss (in health-adjusted life-years; HALYs). Results: The respective relative biomarker levels (ENDS vs smoking) were: 28% for respiratory diseases (five results, three studies); 42% for cancers (five results, four studies); and 35% for cardiovascular diseases (seven results, four studies). When integrated with the HALY impacts by disease, the overall harm to health from ENDS was estimated to be 33% that of smoking. Conclusions: This analysis suggests that the use of modern ENDS devices (vaping) could be a third as harmful to health as smoking in a high-income country setting. However, this estimate is based on a limited number of biomarker studies and is best considered a likely upper level of ENDS risk, given potential biases in our method (i.e., the biomarkers used being correlated with more unaccounted-for toxicants in smoking than in ENDS use).
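The two-stage weighting described above (study-size weights within a disease group, then HALY shares across groups) can be sketched with simple weighted means. Every number below is an illustrative placeholder, not the study's data:

```python
# Sketch of the weighting scheme; all inputs are hypothetical.

def weighted_mean(values, weights):
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

# Stage 1: pool per-study biomarker ratios (ENDS vs smoking) by study size.
respiratory_ratio = weighted_mean([0.25, 0.30, 0.29], [120, 80, 200])

# Stage 2: combine disease-group ratios using each group's share of
# smoking-related HALY loss (hypothetical shares summing to 1).
overall_ratio = weighted_mean([0.28, 0.42, 0.35],  # resp., cancer, CVD
                              [0.30, 0.40, 0.30])  # HALY shares
```

With these made-up HALY shares the overall ratio comes out near a third, which is only meant to mirror the structure of the calculation, not reproduce the paper's 33% estimate.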
epidemiology
10.1101/2020.12.23.20248425
Mechanical ventilation affects respiratory microbiome of COVID-19 patients and its interactions with the host
Understanding the pathology of COVID-19 is a global research priority. Early evidence suggests that the respiratory microbiome may be playing a role in disease progression, yet current studies report contradictory results. Here, we examine potential confounders in COVID-19 respiratory microbiome studies by analyzing the upper (n=58) and lower (n=35) respiratory tract microbiome in well-phenotyped COVID-19 patients and controls, combining microbiome sequencing, viral load determination, and immunoprofiling. We found that time in the intensive care unit and the type of oxygen support, both of which are associated with additional treatments such as antibiotic usage, explained the most variation within the upper respiratory tract microbiome, while SARS-CoV-2 viral load had a reduced impact. Specifically, mechanical ventilation was linked to altered community structure, lower species- and higher strain-level diversity, and significant shifts in oral taxa previously associated with COVID-19. Single-cell transcriptomic analysis of the lower respiratory tract of mechanically ventilated COVID-19 patients identified specific oral bacteria, different from those observed in controls. These oral taxa were found physically associated with proinflammatory immune cells, which showed higher levels of inflammatory markers. Overall, our findings suggest that confounders are driving contradictory results in current COVID-19 microbiome studies and that careful attention needs to be paid to ICU stay and type of oxygen support, as bacteria favored under these conditions may contribute to the inflammatory phenotypes observed in severe COVID-19 patients.
infectious diseases
10.1101/2020.12.24.20248821
Seropositivity in blood donors and pregnant women during the first year of SARS-CoV-2 transmission in Stockholm, Sweden
In Sweden, social restrictions to contain SARS-CoV-2 have to date primarily relied upon voluntary adherence to a set of recommendations, and strict lockdowns/regulations have not been enforced, potentially affecting viral dissemination. To understand the levels of past SARS-CoV-2 infection in the Stockholm population before the start of mass vaccinations, healthy blood donors and pregnant women (n=5,100) were sampled at random between 14th March 2020 and 28th February 2021. All individuals (n=200/sampling week) were screened for anti-SARS-CoV-2 spike (S) trimer- and RBD-specific IgG responses, and the results were compared with those from historical controls (n=595). Data were modelled using a probabilistic Bayesian framework that considered individual responses to both viral antigens. We found that after a steep rise at the start of the pandemic, the seroprevalence trajectory increased more steadily over the summer in approach to the winter second wave of infections, approaching 15% of all adults surveyed by mid-December 2020. The population seropositivity rate again increased more rapidly as cases rose over the winter period. By the end of February 2021, ~19% (approximately one in five) in this study group tested seropositive. Notably, 96% of random seropositive samples screened (n=56) displayed virus-neutralizing responses, with titers comparable to those engendered by recently approved mRNA vaccines, supporting that milder infections generally provoke a competent B cell response. These data offer baseline information about the level of seropositivity in this group of active adults in the Stockholm metropolitan area following a full year of SARS-CoV-2 transmission and prior to the introduction of vaccines. Structured abstract: Objectives: Sweden did not enforce social lockdown in response to the SARS-CoV-2 pandemic. Therefore, we sought to determine the proportion of seropositive healthy, active adults in Stockholm, the country's most populous region.
Random sampling (of blood donors and pregnant women) was carried out during the first year following virus emergence in the country and prior to vaccination of the general adult population, allowing for an estimate of seroprevalence in response to natural infection. Design: In this cross-sectional prospective study, otherwise-healthy blood donors (n=2,600) and pregnant women (n=2,500) were sampled at random for consecutive weeks (at four intervals) between 14th March 2020 and 28th February 2021. Sera from all participants and a cohort of historical controls (n=595) were screened for IgG responses against trimers of the SARS-CoV-2 spike (S) glycoprotein and the smaller receptor-binding domain (RBD). As a complement to standard analytical approaches, a probabilistic (cut-off-independent) Bayesian framework that assigns the likelihood of past infection was used to analyze data over time. The study was carried out in accordance with Swedish Ethical Review Authority registration number 2020-01807. Setting: Healthy participant samples were selected from their respective pools at random through Karolinska University Hospital. Participants: None of the participants were symptomatic at sampling. No additional metadata were available from the samples. Results: Blood donors and pregnant women showed a similar seroprevalence. After a steep rise at the start of the pandemic, the seroprevalence trajectory increased steadily in approach to the winter second wave of infections, approaching 15% of all individuals surveyed by 13th December 2020. By the end of February 2021, when deaths were in decline and at low levels following their winter peak, 19% of the population tested seropositive. Notably, 96% of seropositive healthy donors screened (n=56) developed neutralizing antibody responses at titers comparable to, or higher than, those observed in clinical trials of SARS-CoV-2 spike mRNA vaccination, supporting that mild infection engenders a competent B cell response.
Conclusions: These data indicate that in the year since the start of community transmission, seropositivity levels in metropolitan Stockholm had reached approximately one in five persons, providing important baseline seroprevalence information prior to the start of vaccination.
allergy and immunology
10.1101/2020.12.23.20248148
MHC Haplotyping of SARS-CoV-2 patients: HLA subtypes are not associated with the presence and severity of Covid-19 in the Israeli population
HLA haplotypes have been found to be associated with increased risk for viral infections or disease severity in various diseases, including SARS. Several genetic variants are associated with Covid-19 severity. However, no clear association between HLA and Covid-19 incidence or severity has been reported. We conducted a large-scale HLA analysis of Israeli individuals who tested positive for SARS-CoV-2 infection by PCR. Overall, 72,912 individuals with known HLA haplotypes were included in the study, of whom 6,413 (8.8%) were found to have SARS-CoV-2 by PCR. A total of 20,937 subjects were of Ashkenazi origin (at least 2/4 grandparents). One hundred eighty-one patients (2.8% of the infected) were hospitalized due to the disease. None of the 66 most common HLA loci (within the five HLA subgroups A, B, C, DQB1, and DRB1) was found to be associated with SARS-CoV-2 infection or hospitalization. Similarly, no association was detected in the Ashkenazi Jewish subset. Moreover, no association was found between heterozygosity in any of the HLA loci and either infection or hospitalization. We conclude that HLA haplotypes are not a major risk or protective factor for SARS-CoV-2 infection or severity among the Israeli population.
genetic and genomic medicine
10.1101/2020.12.22.20248753
Early start of oral clarithromycin is associated with better outcome in COVID-19 of moderate severity: the ACHIEVE open-label trial
Background: To study the efficacy of oral clarithromycin in moderate COVID-19. Methods: An open-label non-randomized trial in 90 patients with COVID-19 of moderate severity was conducted between May and October 2020. The primary endpoint was assessed at the end of treatment (EOT): for patients with upper respiratory tract infection, it was defined as no need for hospital re-admission and no progression into lower respiratory tract infection (LRTI); for patients with LRTI, as at least a 50% decrease of the respiratory symptoms score without progression into severe respiratory failure (SRF). Viral load, biomarkers, the function of mononuclear cells, and safety were assessed. Results: The primary endpoint was attained in 86.7% of patients treated with clarithromycin (95% CI 78.1-92.2%); this was 91.7% among patients starting clarithromycin within the first 5 days from symptom onset and 81.4% among those starting later (odds ratio after multivariate analysis 6.62; p: 0.030). Responses were better in patients infected by non-B.1.1 variants. Clarithromycin use was associated with decreases in circulating C-reactive protein, tumour necrosis factor-alpha and interleukin (IL)-6; with an increase of Th1 relative to Th2 mononuclear responses; and with suppression of the SARS-CoV-2 viral load. No safety concerns were reported. Conclusions: Early start of clarithromycin treatment provides most of the clinical improvement in moderate COVID-19 (Trial Registration: ClinicalTrials.gov, NCT04398004).
infectious diseases
10.1101/2020.12.24.20248826
Short-term forecasting of COVID-19 in Germany and Poland during the second wave - a preregistered study
We report insights from ten weeks of collaborative COVID-19 forecasting for Germany and Poland (12 October - 19 December 2020). The study period covers the onset of the second wave in both countries, with tightening non-pharmaceutical interventions (NPIs) and subsequently a decay (Poland) or plateau and renewed increase (Germany) in reported cases. Thirteen independent teams provided probabilistic real-time forecasts of COVID-19 cases and deaths. These were reported for lead times of one to four weeks, with evaluation focused on one- and two-week horizons, which are less affected by changing NPIs. Heterogeneity between forecasts was considerable both in terms of point predictions and forecast spread. Ensemble forecasts showed good relative performance, in particular in terms of coverage, but did not clearly dominate single-model predictions. The study was preregistered and will be followed up in future phases of the pandemic.
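Empirical coverage, the evaluation criterion highlighted above, is simply the share of observed values that fall inside their central prediction intervals; a well-calibrated 50% interval should cover about half the observations. A minimal sketch with made-up forecasts and observations (not the study's data):

```python
# Minimal sketch of empirical interval coverage; inputs are hypothetical.

def empirical_coverage(intervals, observations):
    """Fraction of observations inside their [lo, hi] interval."""
    hits = sum(lo <= y <= hi for (lo, hi), y in zip(intervals, observations))
    return hits / len(observations)

cov = empirical_coverage([(90, 110), (80, 130), (100, 160), (70, 95)],
                         [105, 140, 120, 90])  # 3 of 4 inside -> 0.75
```

In practice this is computed per model, per horizon, and per nominal level, which is how the ensemble's relative coverage advantage would be assessed.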
epidemiology
10.1101/2020.12.24.20248835
Financial Hardship and Social Assistance as Determinants of Mental Health and Food and Housing Insecurity During the COVID-19 Pandemic in the United States
Background: While social assistance through the United States (U.S.) federal Coronavirus Aid, Relief, and Economic Security (CARES) Act provided expanded unemployment insurance benefits during the coronavirus disease 2019 (COVID-19) pandemic until the summer of 2020, it is unclear whether subsequent social assistance has been, or will be, sufficient to meet everyday spending needs and to curb the adverse health-related sequelae of financial hardship. Methods: This study estimated recent trends in financial hardship among working-aged Americans with job-related income loss during the pandemic. It also used multivariable logistic regression and repeated cross-sectional individual-level U.S. Household Pulse Survey data on 91,222 working-aged adults between September and December 2020 to explore the associations of financial hardship with mental health outcomes and food and housing insecurity after accounting for receipt of social assistance. Results: Experiencing somewhat of a financial hardship (vs no hardship) was linked to 3-7 times higher odds of anxiety and depressive symptoms and a likely eviction, and 11 times higher odds of food insufficiency. Experiencing considerable financial hardship (vs no hardship) predicted 5-7-fold higher odds of anxiety and depressive symptoms, 34-fold higher odds of a likely housing eviction, and 37-fold higher odds of food insufficiency (all P values <.001). Across outcomes, these relationships were stronger at each successively higher level of financial hardship (all P values for linear trend <.001), and more than offset any corresponding benefits from social assistance. Conclusions: Even after accounting for receipt of social assistance, working-aged adults experiencing financial hardship had markedly greater odds of anxiety and depressive symptoms, food insufficiency, and an anticipated housing eviction.
These findings point to the urgent need for direct and sustained cash relief well in excess of current levels of social assistance, and provide a critical baseline assessment for evaluating the impacts of federal public policy responses to economic hardship during the pandemic. It is essential that the U.S. Congress and the new Biden administration provide adequate and needs-based social policy relief measures in order to mitigate the pandemic's adverse impacts on the physical, mental, and social well-being of millions of Americans.
public and global health
10.1101/2020.12.24.20248802
Characterizing Long COVID in an International Cohort: 7 Months of Symptoms and Their Impact
Growing evidence shows that a significant number of patients with COVID-19 experience prolonged symptoms, known as Long COVID. Few systematic studies have investigated this population, and hence relatively little is known about the range in symptom makeup and severity, expected clinical course, impact on daily functioning, and expected return to baseline health. In this study, we analysed responses from 3,762 participants with confirmed (diagnostic/antibody positive; 1,020) or suspected (diagnostic/antibody negative or untested; 2,742) COVID-19, from 56 countries, with illness duration of at least 28 days. 3,608 (96%) reported symptoms beyond 90 days. The prevalence of 205 symptoms in 10 organ systems was estimated in this cohort, with 66 symptoms traced over seven months. Except for loss of smell and taste, the prevalence and trajectory of all symptoms were similar between groups with confirmed and suspected COVID-19. Respondents experienced an average of 14.5 symptoms in an average of 9.08 organ systems. Three clusters of symptoms were identified based on their prevalence over time. The most likely early symptoms were fatigue, dry cough, shortness of breath, headaches, muscle aches, chest tightness, and sore throat. The most frequent symptoms reported after month 6 were fatigue, post-exertional malaise, and cognitive dysfunction. The majority (>85%) experienced relapses, with exercise, physical or mental activity, and stress as the main triggers. 1,700 (45.2%) reported requiring a reduced work schedule compared to pre-illness, and 839 (22.3%) were not working at the time of survey due to their health conditions. Significance Statement: Results from our international online survey of 3,762 individuals with suspected or confirmed COVID-19 illness suggest that Long COVID is composed of heterogeneous post-acute infection sequelae that often affect multiple organ systems, with impact on functioning and quality of life ranging from mild to severe.
This study represents the largest collection of symptoms identified in the Long COVID population to date, and is the first to quantify individual symptom trajectories over time for 7 months. Three clusters of symptoms were quantified, each with a different morphology over time. The clusters of symptoms that persist longest include a combination of neurological/cognitive and systemic symptoms. The reduced work capacity because of cognitive dysfunction, in addition to other debilitating symptoms, translated into loss of hours, jobs, and the ability to work relative to pre-illness levels. Structured Abstract: Objective: To characterize the symptom profile and time course in patients with Long COVID, along with the impact on daily life, work, and return to baseline health. Design: International web-based survey of suspected and confirmed COVID-19 cases with illness lasting over 28 days and onset prior to June 2020. Setting: Survey distribution via online COVID-19 support groups and social media. Participants: 3,762 respondents from 56 countries completed the survey. 1,166 (31.0%) were 40-49 years old, 937 (25.0%) were 50-59 years old, 905 (24.1%) were 30-39 years old, 277 (7.4%) were 18-29 years old, and 477 (12.7%) were above 60 years old. 2,961 (78.9%) were women, 718 (19.1%) were men, and 63 (1.7%) were nonbinary. 317 (8.4%) reported being hospitalized. 1,020 (27.1%) reported receiving a laboratory-confirmed diagnosis of COVID-19. 3,608 (96%) reported symptoms beyond 90 days. Results: The prevalence of 205 symptoms in 10 organ systems was estimated in this cohort, with 66 symptoms traced over seven months. Except for loss of smell and taste, the prevalence and trajectory of all other symptoms were similar between the confirmed (diagnostic/antibody positive) and suspected (diagnostic/antibody negative or untested) groups. Respondents experienced symptoms in an average of 9.08 (95% confidence interval 9.04 to 9.13) organ systems.
The most frequent symptoms reported after month 6 were fatigue (77.7%, 74.9% to 80.3%), post-exertional malaise (72.2%, 69.3% to 75.0%), and cognitive dysfunction (55.4%, 52.4% to 58.8%). These three symptoms were also the three most commonly reported overall. In those who recovered in less than 90 days, the average number of symptoms peaked at week 2 (11.4, 9.4 to 13.6); in those who did not recover in 90 days, the average number of symptoms peaked at month 2 (17.2, 16.5 to 17.8). Respondents with symptoms over 6 months experienced an average of 13.8 (12.7 to 14.9) symptoms in month 7. 85.9% (84.8% to 87.0%) experienced relapses, with exercise, physical or mental activity, and stress as the main triggers. 86.7% (85.6% to 92.5%) of unrecovered respondents were experiencing fatigue at the time of survey, compared to 44.7% (38.5% to 50.5%) of recovered respondents. 45.2% (42.9% to 47.2%) reported requiring a reduced work schedule compared to pre-illness, and 22.3% (20.5% to 24.3%) were not working at the time of survey due to their health conditions. Conclusions: Patients with Long COVID report prolonged multisystem involvement and significant disability. Most had not returned to previous levels of work by 6 months. Many patients are not recovered by 7 months and continue to experience significant symptom burden.
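Prevalence figures of the kind quoted above (e.g. "77.7%, 74.9% to 80.3%") pair a point estimate with a 95% confidence interval. A minimal sketch using the normal approximation for a binomial proportion; the counts below are hypothetical, not the survey's data:

```python
import math

# Normal-approximation 95% CI for a prevalence p = k/n, clipped to [0, 1].
def prevalence_ci(k, n, z=1.96):
    p = k / n
    se = math.sqrt(p * (1 - p) / n)
    return p, max(0.0, p - z * se), min(1.0, p + z * se)

p, lo, hi = prevalence_ci(777, 1000)  # point estimate 0.777
```

For proportions near 0 or 1, or small n, a Wilson or exact interval would be preferable; the sketch only illustrates the shape of the reported numbers.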
infectious diseases
10.1101/2020.12.24.20248822
Estimated transmissibility and severity of novel SARS-CoV-2 Variant of Concern 202012/01 in England
A novel SARS-CoV-2 variant, VOC 202012/01 (lineage B.1.1.7), emerged in southeast England in November 2020 and is rapidly spreading towards fixation. Using a variety of statistical and dynamic modelling approaches, we estimate that this variant has a 43-90% (range of 95% credible intervals 38-130%) higher reproduction number than preexisting variants. A fitted two-strain dynamic transmission model shows that VOC 202012/01 will lead to large resurgences of COVID-19 cases. Without stringent control measures, including limited closure of educational institutions and a greatly accelerated vaccine roll-out, COVID-19 hospitalisations and deaths across England in 2021 will exceed those in 2020. Concerningly, VOC 202012/01 has spread globally and exhibits a similar transmission increase (59-74%) in Denmark, Switzerland, and the United States.
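A back-of-envelope version of a transmission-advantage estimate (not one of the paper's statistical or dynamic models) reads the advantage off the growth of the variant's frequency. If the frequency f(t) grows logistically, the log-odds log(f/(1-f)) increase linearly at the growth-rate difference between variant and wild type; with the linear approximation R ≈ 1 + rho·T_g for growth rate rho and generation time T_g, the absolute advantage in R is slope·T_g. All inputs below are illustrative:

```python
# Back-of-envelope sketch; slope, generation time, and baseline R are
# hypothetical, not estimates from the paper.

def relative_R_advantage(logodds_slope_per_day, generation_time_days, r_old):
    """Relative increase in R implied by a linear log-odds slope."""
    delta_R = logodds_slope_per_day * generation_time_days
    return delta_R / r_old

adv = relative_R_advantage(0.07, 5.5, 0.9)  # ~0.43, i.e. ~43%
```

The linear R-rho approximation breaks down for large growth rates, which is one reason the paper uses several complementary statistical and dynamic approaches rather than a single slope estimate.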
epidemiology
10.1101/2020.12.24.20248815
VEGF-C ameliorates lymphatic drainage, portal pressure and ascites in experimental portal hypertension
Background and aims: Gut lymphatic vessels are crucial in maintaining abdominal fluid homeostasis. Impaired lymphatic drainage of these vessels has been associated with the presence of ascites in liver cirrhosis. We thus explored the therapeutic effects of gut-targeted delivery of vascular endothelial growth factor-C (VEGF-C), a pro-lymphangiogenic factor, in cirrhosis and portal hypertension. Methods: Vegfr3 antibody-tagged lipocarriers were used to formulate the E-VEGF-C molecule for its targeted delivery to lymphatic endothelial cells (LyECs). In vitro characterization, cytotoxicity, and in vivo biodistribution and pharmacokinetic analyses of E-VEGF-C were performed. In vivo, E-VEGF-C was given orally in cirrhotic and non-cirrhotic rat models of portal hypertension; rats given lipocarriers alone served as vehicle controls. Mesenteric lymphatic vessels and drainage were analyzed, ascites and hepatic hemodynamics were measured, and molecular and histological studies of the mesentery were performed. Lymphatic channels were also enumerated in duodenal (D2) biopsies from cirrhotic patients. Results: E-VEGF-C exhibited a size of <200 nm and a zeta potential of 6 mV. In vitro and in vivo, E-VEGF-C was efficiently taken up by mesenteric LyECs. E-VEGF-C-treated rats displayed an increase in the number and drainage of mesenteric lymphatic vessels and a reduction in ascites compared to the CCl4 vehicle group. Portal pressures were attenuated in both cirrhotic and non-cirrhotic portal hypertensive rats treated with E-VEGF-C compared to their respective vehicle groups. In patients, dilated lymphatic vessels were increased in decompensated compared to compensated cirrhosis. Conclusion: Targeted mesenteric lymphangiogenesis leading to improved lymphatic drainage may serve as an emerging therapy for portal hypertension and ascites in patients with decompensated cirrhosis.
Synopsis: A lipid nanocarrier-conjugated pro-lymphangiogenic molecule, VEGF-C, with specificity for uptake by gut lymphatic endothelial cells has been developed to enhance gut lymphangiogenesis. Targeted delivery of VEGF-C improves lymphatic vessel drainage, attenuating abdominal ascites and portal pressures. VEGF-C therapy holds immense potential for managing ascites in patients with decompensated cirrhosis who have dilated gut lymphatic vessels.
gastroenterology
10.1101/2020.12.25.20248427
Māori and Pacific People in New Zealand have higher risk of hospitalisation for COVID-19
AimsWe aim to quantify differences in clinical outcomes from COVID-19 infection in Aotearoa New Zealand by ethnicity, with a focus on risk of hospitalisation. MethodsWe used data on age, ethnicity, deprivation index, pre-existing health conditions, and clinical outcomes for 1,829 COVID-19 cases reported in New Zealand. We used a logistic regression model to calculate odds ratios for the risk of hospitalisation by ethnicity. We also considered length of hospital stay and risk of fatality. ResultsMāori have 2.50 times greater odds of hospitalisation (95% CI 1.39 - 4.51) than non-Māori, non-Pacific people, after controlling for age and pre-existing conditions. Pacific people have 3 times greater odds (95% CI 1.75 - 5.33). ConclusionsStructural inequities and systemic racism in the healthcare system mean that Māori and Pacific communities face a much greater health burden from COVID-19. Older people and those with pre-existing health conditions are also at greater risk. This should inform future policy decisions, including prioritising groups for vaccination.
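As a generic illustration of the kind of odds ratio reported above, the unadjusted ratio can be computed from a 2×2 table; a minimal Python sketch with purely hypothetical counts (the study's own estimates come from a logistic regression that additionally adjusts for age and pre-existing conditions, so these numbers are not the paper's):

```python
import math

def odds_ratio(a, b, c, d):
    """Unadjusted odds ratio from a 2x2 table:
    a = hospitalised in group, b = not hospitalised in group,
    c = hospitalised in reference group, d = not hospitalised in reference."""
    return (a / b) / (c / d)

def wald_ci(a, b, c, d, z=1.96):
    """Approximate 95% Wald confidence interval on the log-odds scale."""
    log_or = math.log(odds_ratio(a, b, c, d))
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return math.exp(log_or - z * se), math.exp(log_or + z * se)

# Hypothetical counts, not the study's data:
or_est = odds_ratio(20, 80, 10, 90)   # (20/80)/(10/90) = 2.25
lo, hi = wald_ci(20, 80, 10, 90)
```

In an adjusted logistic regression, the corresponding odds ratio is recovered as exp(β) for the group indicator's coefficient.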
infectious diseases
10.1101/2020.12.23.20248579
A Checklist for Assessing the Methodological Quality of Concurrent tES-fMRI Studies (ContES Checklist): A Consensus Study and Statement
BackgroundLow intensity transcranial electrical stimulation (tES), including alternating or direct current stimulation (tACS or tDCS), applies weak electrical stimulation to modulate the activity of brain circuits. Integration of tES with concurrent functional magnetic resonance imaging (fMRI) allows for the mapping of neural activity during neuromodulation, supporting causal studies of both brain function and tES effects. Methodological aspects of tES-fMRI studies underpin the results, and reporting them in appropriate detail is required for reproducibility and interpretability. Despite the growing number of published reports, there are no consensus-based checklists for disclosing methodological details of concurrent tES-fMRI studies. ObjectiveTo develop a consensus-based checklist of reporting standards for concurrent tES-fMRI studies to support methodological rigor, transparency, and reproducibility (ContES Checklist). MethodsA two-phase Delphi consensus process was conducted by a steering committee (SC) of 13 members and 49 expert panelists (EP) through the International Network of the tES-fMRI (INTF) Consortium. The process began with a circulation of a preliminary checklist of essential items and additional recommendations, developed by the SC based on a systematic review of 57 concurrent tES-fMRI studies. Contributors were then invited to suggest revisions or additions to the initial checklist. After the revision phase, contributors rated the importance of the 17 essential items and 42 additional recommendations in the final checklist. The state of methodological transparency within the 57 reviewed concurrent tES-fMRI studies was then assessed using the checklist. ResultsExperts refined the checklist through the revision and rating phases, leading to a checklist with three categories of essential items and additional recommendations: (1) technological factors, (2) safety and noise tests, and (3) methodological factors. 
The level of reporting of checklist items varied among the 57 concurrent tES-fMRI papers, ranging from 24% to 76%. On average, 53% of checklist items were reported in a given article. ConclusionsUse of the ContES checklist is expected to enhance the methodological reporting quality of future concurrent tES-fMRI studies, and increase methodological transparency and reproducibility.
psychiatry and clinical psychology
10.1101/2020.12.24.20248672
Deep longitudinal phenotyping of wearable sensor data reveals independent markers of longevity, stress, and resilience
Biological age acceleration (BAA) models based on blood tests or DNA methylation have emerged as a de facto standard for quantitative characterization of the aging process. We demonstrate that deep neural networks trained to predict morbidity risk from wearable sensor data can provide a high-quality and cheap alternative for BAA determination. The GeroSense BAA model presented here was tolerant of gaps in the data and exhibited a superior association with life expectancy compared with the average number of steps per day, e.g., in groups stratified by professional occupation. The association of BAA with lifestyle effects and with the prevalence or future incidence of diseases was comparable to that of BAA from models based on blood test results. Wearable sensors allowed sampling of BAA fluctuations at time scales of days and weeks and revealed a divergence of organism-state recovery time (resilience) as a function of chronological age. The number of individuals suffering from a lack of resilience increased exponentially with age at a rate compatible with the Gompertz mortality law. We speculate that, due to the stochastic character of BAA fluctuations, its mean and auto-correlation properties together comprise the minimum set of biomarkers of aging in humans.
public and global health
10.1101/2020.12.25.20248840
The interplay between subcritical fluctuations and import: understanding COVID-19 epidemiology dynamics
The effective reproduction ratio r(t) of an epidemic, defined as the average number of secondary infected cases per infectious case in a population in its current state, including both susceptible and non-susceptible hosts, controls the transition between a subcritical threshold regime (r(t) < 1) and a supercritical threshold regime (r(t) > 1). While in subcritical regimes an index case will cause an outbreak that sooner or later dies out, with large fluctuations observed when approaching the epidemic threshold, the supercritical regime leads to exponential growth of infections. The super- or subcritical regime of an outbreak is often not distinguished when close to the epidemic threshold, but its behaviour is of major importance for understanding the course of an epidemic and the public health management of disease control. In a subcritical parameter regime, undetected infections, here called "imported cases" or imports (i.e., susceptible individuals becoming infected from outside the study area), can either spark recurrent isolated outbreaks or maintain ongoing levels of infection, but cannot cause exponential growth of infections. However, when community transmission becomes supercritical, any index case or a few imported cases will lead the epidemic to exponential growth of infections; the supercritical dynamics are thus distinguished from the subcritical dynamics by a critical epidemic threshold around which large fluctuations occur in stochastic versions of the considered processes. As a continuation of the COVID-19 Basque Modeling Task Force, we now investigate the role of critical fluctuations and import in basic Susceptible-Infected-Susceptible (SIS) and Susceptible-Infected-Recovered (SIR) epidemiological models of disease spreading dynamics.
Without loss of generality, these simple models can be treated analytically and, when considered as the mean-field approximation of more complex underlying stochastic and eventually spatially extended or generalized network processes, their results can be applied to more complex models used to describe the COVID-19 epidemics. In this paper, we explore possible features of the course of an epidemic, showing that the subcritical regime can explain the dynamic behaviour of COVID-19 spreading in the Basque Country, a theory supported by empirical data.
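The import-sustained subcritical dynamics described above can be illustrated with a simple discrete-time stochastic SIS chain; this is a minimal sketch (not the authors' model), with all parameter values chosen for illustration:

```python
import math
import random

def stochastic_sis_with_import(beta, gamma, rho, N, I0, steps, seed=42):
    """Discrete-time stochastic SIS chain with a per-susceptible import
    hazard rho. Community transmission alone is subcritical when
    beta/gamma < 1; imports then sustain low, fluctuating infection
    levels without exponential growth."""
    rng = random.Random(seed)
    I = I0
    trajectory = [I]
    for _ in range(steps):
        S = N - I
        # per-susceptible infection probability (community + import hazard)
        p_inf = 1.0 - math.exp(-(beta * I / N + rho))
        p_rec = 1.0 - math.exp(-gamma)
        new_infections = sum(rng.random() < p_inf for _ in range(S))
        recoveries = sum(rng.random() < p_rec for _ in range(I))
        I = I + new_infections - recoveries
        trajectory.append(I)
    return trajectory

# Subcritical run: r ~ beta/gamma = 0.5 < 1, sustained only by import.
traj = stochastic_sis_with_import(beta=0.5, gamma=1.0, rho=1e-3,
                                  N=1000, I0=10, steps=300)
```

With these parameters the expected infection level settles near a low, import-driven equilibrium and fluctuates around it, rather than growing exponentially, matching the subcritical picture in the abstract.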
epidemiology
10.1101/2020.12.25.20248859
Public adherence to precautionary measures and preventive guidelines against COVID-19 in Sudan: An application of the Health Belief Model
BackgroundCoronavirus disease (COVID-19) is a highly infectious disease caused by the novel coronavirus (SARS-CoV-2). Several public health and social protective measures that may prevent or slow down the transmission of COVID-19 were introduced. However, these measures are unfortunately neglected or deliberately ignored by some individuals. MethodsWe conducted a cross-sectional online survey to identify possible factors influencing intention to adhere to precautionary measures and preventive guidelines against COVID-19 during lockdown periods in Sudan. The questionnaire was used to collect socio-demographic data of study participants and their health beliefs and intentions regarding adherence to precautionary measures against COVID-19, based on the constructs of the Health Belief Model. ResultsA total of 680 respondents completed and returned the online questionnaire. Significant predictors of intention to adhere to the precautionary measures against COVID-19 were gender (β=3.34, P<0.001), self-efficacy (β=0.476, P<0.001), perceived benefits (β=0.349, P<0.001) and perceived severity (β=0.113, P=0.005). These factors explained 43% of the variance in respondents' intention to adhere to COVID-19 precautionary measures. Participants who were female, were confident in their ability to adhere to the protective measures when available, believed in the benefits of the protective measures against COVID-19, and perceived that the disease could have serious consequences were more likely to be willing to adhere to the protective measures. ConclusionFemale respondents and respondents with higher self-efficacy, higher perceived benefits and higher perceived severity were more likely to be willing to adhere to the protective measures against COVID-19 in Sudan.
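The "43% of the variance" figure above is an R²-type statistic; a minimal, generic sketch of how such a value is computed from observed and model-predicted scores (hypothetical numbers, not the study's data):

```python
def r_squared(observed, predicted):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean_obs = sum(observed) / len(observed)
    ss_tot = sum((y - mean_obs) ** 2 for y in observed)
    ss_res = sum((y - yhat) ** 2 for y, yhat in zip(observed, predicted))
    return 1.0 - ss_res / ss_tot

# Hypothetical intention scores and model predictions:
y = [3, 5, 4, 2, 5, 1]
y_hat = [3.2, 4.5, 4.1, 2.4, 4.6, 1.5]
r2 = r_squared(y, y_hat)
```

A perfect model gives R² = 1, while a model that always predicts the mean gives R² = 0; the study's regression falls between, explaining 43% of the variance in intention.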
public and global health
10.1101/2020.12.30.20248603
SARS-CoV-2 positivity in asymptomatic-screened dental patients
Enhanced community surveillance is a key pillar of the public health response to COVID-19. Asymptomatic carriage of SARS-CoV-2 is a potentially significant source of transmission, yet remains relatively poorly understood. Disruption of dental services continues, with significantly reduced capacity. Ongoing precautions include pre-appointment and/or at-appointment COVID-19 symptom screening and the use of enhanced personal protective equipment (PPE). This study aimed to investigate SARS-CoV-2 infection in dental patients to inform community surveillance and improve understanding of risks in the dental setting. Thirty-one dental care centres across Scotland invited asymptomatic screened patients over 5 years old to participate. Following verbal consent and completion of a sociodemographic and symptom-history questionnaire, trained dental teams took a combined oropharyngeal and nasal swab sample using standardised VTM-containing test kits. Samples were processed by the Lighthouse Lab and patients were informed of their results by SMS/e-mail, with appropriate self-isolation guidance in the event of a positive test. Over a 13-week period (from 3 August to 31 October 2020), n=4,032 patients, largely representative of the population, were tested. Of these, n=22 (0.5%; 95% CI 0.5%, 0.8%) tested positive for SARS-CoV-2. The positivity rate increased over the period, commensurate with the uptick in community prevalence identified across all national testing monitoring data streams. All positive cases were successfully followed up by the national contact tracing program. To the best of our knowledge, this is the first report of a COVID-19 testing survey in asymptomatic-screened patients presenting in a dental setting. The positivity rate in this patient group reflects the underlying community prevalence at the time.
These data are a salient reminder, particularly when community infection levels are rising, of the importance of appropriate ongoing infection prevention and control and PPE vigilance, which remains relevant as healthcare-team fatigue increases while the pandemic continues. Dental settings are a valuable location for public health surveillance.
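A positivity proportion like the 22/4,032 reported above comes with a binomial confidence interval; a minimal sketch using the standard Wilson score interval (the abstract does not state which interval method the authors used, so this is illustrative only):

```python
import math

def wilson_ci(k, n, z=1.96):
    """Wilson score 95% confidence interval for a binomial proportion
    with k successes in n trials."""
    p = k / n
    denom = 1.0 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return centre - half, centre + half

# 22 positives among 4,032 asymptomatic-screened patients:
p = 22 / 4032            # ~0.55%
lo, hi = wilson_ci(22, 4032)
```

The Wilson interval is preferred over the simple Wald interval for proportions this close to zero, since it never produces a negative lower bound.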
dentistry and oral medicine