id | title | abstract | category
---|---|---|---|
10.1101/2021.01.30.21250314 | Diagnostic accuracy of Panbio™ rapid antigen tests on oropharyngeal swabs for detection of SARS-CoV-2 | Background: Antigen-detecting rapid diagnostic tests (Ag-RDTs) for the detection of SARS-CoV-2 offer new opportunities for testing in the context of the COVID-19 pandemic. Nasopharyngeal swabs (NPS) are the reference sample type, but oropharyngeal swabs (OPS) may be a more acceptable sample type in some patients.
Methods: We conducted a prospective study in a single screening center to assess the diagnostic performance of the Panbio COVID-19 Ag Rapid Test (Abbott) on OPS compared with reverse-transcription quantitative PCR (RT-qPCR) using NPS.
Results: 402 outpatients were enrolled in a COVID-19 screening center, of whom 168 (41.8%) had a positive RT-qPCR test. The oropharyngeal Ag-RDT sensitivity compared to nasopharyngeal RT-qPCR was 81% (95%CI: 74.2-86.6). Two false positives were noted out of the 234 RT-qPCR negative individuals, which resulted in a specificity of 99.1% (95%CI: 96.9-99.9) for the Ag-RDT.
For cycle threshold values ≤26.7 (≥1E6 SARS-CoV-2 genome copies/mL, a presumed cut-off for infectious virus), 96.3% sensitivity (95%CI: 90.7-99.0%) was obtained with the Ag-RDT using OPS.
Interpretation: Based on our findings, the diagnostic performance of the Panbio COVID-19 RDT with OPS samples meets the criteria required by the WHO for Ag-RDTs (sensitivity ≥80% and specificity ≥97%). | infectious diseases |
10.1101/2021.01.29.21250710 | Under what circumstances could vaccination offset the harm from a more transmissible variant of SARS-CoV-2 in NYC? Trade-offs regarding prioritization and speed of vaccination. | Introduction: New York City (NYC) was a global epicenter of COVID-19. Vaccines against COVID-19 became available in December 2020 with limited supply, resulting in the need for policies regarding prioritization. The next month, SARS-CoV-2 variants were detected that were more transmissible but still vaccine-susceptible, raising scrutiny of these policies. In particular, prioritization of higher-risk people could prevent more deaths per dose of vaccine administered but could also delay herd immunity if the prioritization introduced bottlenecks that lowered vaccination speed (the number of doses that could be delivered per day). We used mathematical modeling to examine the trade-off between prioritization and vaccination speed.
Methods: A stochastic, discrete-time susceptible-exposed-infected-recovered (SEIR) model with age- and comorbidity-adjusted COVID-19 outcomes (infections, hospitalizations, and deaths by July 1, 2021) was used to examine the trade-off between vaccination speed and whether or not vaccination was prioritized to individuals age 65+ and "essential workers," defined as including first responders and healthcare, transit, education, and public safety workers. The model was calibrated to COVID-19 hospital admissions, hospital census, ICU census, and deaths in NYC. Vaccination speed was assumed to be 10,000 doses per day starting December 15th, 2020 targeting healthcare workers and nursing home populations, and to subsequently expand at alternative starting times and speeds. We compared COVID-19 outcomes across alternative expansion starting times (January 15th, January 21st, or February 1st) and speeds (20,000, 30,000, 50,000, 100,000, 150,000, or 200,000 doses per day for the first dose), as well as alternative prioritization options ("yes" versus "no" prioritization of essential workers and people age 65+). Model projections were produced with and without considering the emergence of a SARS-CoV-2 variant with 56% greater transmissibility over January and February, 2021.
Results: In the absence of a COVID-19 vaccine, the emergence of the more transmissible variant would triple the peak in infections, hospitalizations, and deaths and more than double cumulative infections, hospitalizations, and deaths. To offset the harm from the more transmissible variant would require reaching a vaccination speed of at least 100,000 doses per day by January 15th or 150,000 per day by January 21st. Prioritizing people ages 65+ and essential workers increased the number of lives saved per vaccine dose delivered: with the emergence of a more transmissible variant, 8,000 deaths could be averted by delivering 115,000 doses per day without prioritization or 71,000 doses per day with prioritization. If prioritization were to cause a bottleneck in vaccination speed, more lives would be saved with prioritization only if the bottleneck reduced vaccination speed by less than one-third of the maximum vaccine delivery capacity. These trade-offs between vaccination speed and prioritization were robust over a wide range of delivery capacities.
Conclusions: The emergence of a more transmissible variant of SARS-CoV-2 has the potential to triple the 2021 epidemic peak and more than double the 2021 COVID-19 burden in NYC. Vaccination could only offset the harm of the more transmissible variant if high speed were achieved in mid- to late January. Prioritization of COVID-19 vaccines to higher-risk populations saves more lives only if it does not create an excessive vaccine delivery bottleneck. | infectious diseases |
10.1101/2021.01.29.21250759 | Genetically predicted serum vitamin D and COVID-19: a Mendelian randomization study | Objectives: To investigate causality of the association of serum vitamin D with the risk and severity of COVID-19 infection.
Design: Two-sample Mendelian randomization study.
Setting: Summary data from genome-wide analyses in the population-based UK Biobank and SUNLIGHT Consortium, applied to meta-analyzed results of genome-wide analyses in the COVID-19 Host Genetics Initiative.
Participants: 17,965 COVID-19 cases including 11,085 laboratory or physician confirmed cases, 7,885 hospitalized cases, and 4,336 severe respiratory cases, and 1,370,547 controls, primarily of European ancestry.
Exposures: Genetically predicted variation in serum vitamin D status, based on genome-wide significant single nucleotide polymorphisms (SNPs) associated with serum vitamin D or risk of vitamin D deficiency/insufficiency.
Main outcome measures: Susceptibility to and severity of COVID-19 infection, including severe respiratory infection and hospitalization.
Results: Mendelian randomization analysis, powered to detect moderate effects comparable to those seen in observational studies, provided little to no evidence for an effect of genetically predicted serum vitamin D on susceptibility to or severity of COVID-19 infection. Using SNPs in loci related to vitamin D metabolism as proxies for serum vitamin D concentration, the odds ratio for a standard deviation increase in serum vitamin D was 1.04 (95% confidence interval 0.92 to 1.18) for any COVID-19 infection versus population controls, 1.05 (0.84 to 1.31) for hospitalized COVID-19 versus population controls, 0.96 (0.64 to 1.43) for severe respiratory COVID-19 versus population controls, 1.15 (0.99 to 1.35) for COVID-19 positive versus COVID-19 negative, and 1.44 (0.75 to 2.78) for hospitalized COVID-19 versus non-hospitalized COVID-19. Results were similar in analyses that used all SNPs with genome-wide significant associations with serum vitamin D (i.e., including SNPs in loci with no known relationship to vitamin D metabolism) and in analyses using SNPs with genome-wide significant associations with risk of vitamin D deficiency or insufficiency.
Conclusions: These findings suggest that genetically predicted differences in long-term vitamin D nutritional status do not causally affect susceptibility to and severity of COVID-19 infection, and that associations observed in previous studies may have been driven by confounding. These results do not exclude the possibility of low-magnitude causal effects, nor do they preclude potential causal effects of acute responses to therapeutic doses of vitamin D. Future directions include extension of this work to non-European ancestry populations, and high-risk populations, for example persons with comorbid disease. | infectious diseases |
10.1101/2021.01.29.21250798 | Identification of B.1.346 lineage of SARS-CoV-2 in Japan: Genomic evidence of re-entry of Clade 20C | Objectives: Whole SARS-CoV-2 genome sequencing from COVID-19 patients is useful for infection control and evaluation of regional trends. We report lineage data collected from hospitals in the Kanto region of Japan.
Methods: We performed whole genome sequencing on specimens from 198 COVID-19 patients at 13 collaborating hospitals in the Kanto region. Phylogenetic analysis and fingerprinting of nucleotide substitutions were performed to differentiate and classify the viral lineages.
Results: More than 90% of the strains belonged to Clade 20B, and two lineages (B.1.1.284 and B.1.1.214) were detected predominantly in the Kanto region. However, one sample from a COVID-19 patient in November 2020 belonged to the B.1.346 lineage of Clade 20C, which has been prevalent in the western United States. The patient had no history of overseas travel and no contact with anyone who had travelled abroad, suggesting that this strain was likely imported from the western United States across the strict quarantine barrier.
Conclusion: B.1.1.284 and B.1.1.214 have been identified predominantly in the Kanto region, and the B.1.346 strain of Clade 20C found in one patient was probably imported from the western United States. These results illustrate that a decentralized network of hospitals can be significantly advantageous for monitoring regional molecular epidemiologic trends.
Highlights: · Whole SARS-CoV-2 genome sequencing is useful for infection control
· B.1.1.284 and B.1.1.214 have been identified predominantly in the Kanto region
· B.1.346 of Clade 20C was detected in one COVID-19 patient in November
· Molecular genomic data sharing provides benefits to public health against COVID-19 | infectious diseases |
10.1101/2021.01.29.21250552 | Aerosol emission from the respiratory tract: an analysis of relative risks from oxygen delivery systems | Background: Risk of aerosolisation of SARS-CoV-2 directly informs organisation of acute healthcare and PPE guidance. Continuous positive airways pressure (CPAP) and high-flow nasal oxygen (HFNO) are widely used modes of oxygen delivery and respiratory support for patients with severe COVID-19, with both considered high-risk aerosol-generating procedures. However, there are limited high-quality experimental data characterising aerosolisation during oxygen delivery and respiratory support.
Methods: Healthy volunteers were recruited to breathe, speak, and cough in ultra-clean, laminar flow theatres, first unassisted and then while using oxygen and respiratory support systems. Aerosol emission was measured simultaneously using two discrete methodologies. Hospitalised patients with COVID-19 were also recruited and had aerosol emissions measured during breathing, speaking, and coughing.
Findings: In healthy volunteers (n = 25 subjects; 531 measures), CPAP (with exhalation port filter) produced fewer aerosols than breathing, speaking and coughing (even with large >50 L/min facemask leaks). HFNO did emit aerosols, but the majority of these particles were generated by the HFNO machine, not the patient. HFNO-generated particles were small (<1 μm), passing from the machine through the patient and to the detector without coalescing with respiratory aerosol, and are therefore unlikely to carry viral particles. Coughing was associated with the highest aerosol emissions, with a peak concentration at least 10 times greater than the mean concentration generated by speaking or breathing. Hospitalised patients with COVID-19 (n = 8 subjects; 56 measures) had aerosol size distributions similar to those of healthy volunteers.
Interpretation: In healthy volunteers, CPAP is associated with less aerosol emission than breathing, speaking or coughing. Aerosol emission from the respiratory tract does not appear to be increased by HFNO. Although direct comparisons are complex, cough appears to generate significant aerosols in a size range compatible with airborne transmission of SARS-CoV-2. As a consequence, the risk of SARS-CoV-2 aerosolisation is likely to be high in all areas where patients with COVID-19 are coughing. Guidance on personal protective equipment policy should reflect these updated risks.
Funding: NIHR-UKRI Rapid COVID call (COV003), Wellcome Trust GW4-CAT Doctoral Training Scheme (FH), MRC CARP Fellowship (JD, MR/T005114/1), Natural Environment Research Council grant (BB, NE/P018459/1).
Research in context. Evidence before this study: PubMed was searched from inception until 10/1/21 using the term "aerosol" and variations of "non-invasive positive pressure ventilation" and "high-flow nasal oxygen therapy". Studies were included if they measured aerosol generated from volunteers or patients receiving non-invasive positive pressure ventilation (NIV) or high-flow nasal oxygen therapy (HFNO), or provided experimental evidence in a simulated human setting. One study was identified (Gaeckle et al, 2020), which measured aerosol emission with one methodology (APS) but was limited by a high background concentration of aerosol and a low number of participants (n = 10).
Added value of this study: This study used multiple methodologies to measure aerosol emission from the respiratory tract before and during CPAP and high-flow nasal oxygen, in an ultra-clean, laminar flow theatre with near-zero background aerosol, and recruited patients with COVID-19 to confirm similar aerosol distributions. We conclude that there is negligible aerosol generation with CPAP, that aerosol emission from HFNO comes from the machine and not the patient, that coughing emits aerosols consistent with airborne transmission of SARS-CoV-2, and that healthy volunteers are a reasonable proxy for COVID-19 patients.
Implications of all the available evidence: CPAP and HFNO should not be considered high-risk aerosol-generating procedures, based on our study and that of Gaeckle et al. Recorded aerosol emission from HFNO stems from the machine. Cough remains a significant aerosol risk. PPE guidance should be updated to ensure medical staff are protected with appropriate PPE in situations when patients with suspected or proven COVID-19 are likely to cough. | infectious diseases |
10.1101/2021.01.29.21250655 | Impact of public sentiments on the transmission of COVID-19 across a geographical gradient | COVID-19 is a respiratory disease caused by a recently discovered, novel coronavirus, SARS-CoV-2. The disease has led to over 81 million confirmed cases of COVID-19, with close to 2 million deaths. In the current social climate, the risk of COVID-19 infection is driven by individual and public perception of risk and sentiments. A number of factors influence public perception, including an individual's belief system, prior knowledge about a disease and information about a disease. In this paper, we develop a model for COVID-19 using a system of ordinary differential equations following the natural history of the infection. The model uniquely incorporates social behavioral aspects such as quarantine and quarantine violation. The model is further driven by people's sentiments (positive and negative), which account for the influence of disinformation. People's sentiments were obtained by parsing through and analyzing COVID-19 related tweets from Twitter, a social media platform, across six countries. Our results show that our model incorporating public sentiments is able to capture the trend in the trajectory of the epidemic curve of the reported cases. Furthermore, our results show that positive public sentiments reduce disease burden in the community. Our results also show that quarantine violation and early discharge of the infected population amplify the disease burden on the community. Hence, it is important to account for public sentiment and individual social behavior in epidemic models developed to study diseases like COVID-19. | infectious diseases |
10.1101/2021.01.29.21250764 | Testing out of quarantine | Since SARS-CoV-2 emerged, a 14-day quarantine has been recommended based on COVID-19's incubation period. Using an RT-PCR or rapid antigen test to "test out" of quarantine is a frequently proposed strategy to shorten duration without increasing risk. We calculated the probability that infected individuals test negative for SARS-CoV-2 on a particular day post-infection and remain symptom free for some period of time. We estimate that an infected individual has a 20.1% chance (95% CI 9.8-32.6) of testing RT-PCR negative on day five post-infection and remaining asymptomatic until day seven. We also show that the added information a test provides decreases as we move further from the test date; hence, a less sensitive test that returns rapid results is often preferable to a more sensitive test with a delay. | infectious diseases |
10.1101/2021.01.29.21250762 | Predicting Prognosis in COVID-19 Patients using Machine Learning and Readily Available Clinical Data | Rationale: Prognostic tools for aiding in the treatment of hospitalized COVID-19 patients could help improve outcomes by identifying patients at higher or lower risk of severe disease.
Objectives: The study objective was to develop models to stratify patients by risk of severe outcomes during COVID-19 hospitalization using readily available information at hospital admission.
Methods: Hierarchical ensemble classification models were trained on a set of 229 patients hospitalized with COVID-19 to predict severe outcomes, including ICU admission, development of ARDS, or intubation, using easily attainable attributes including basic patient characteristics, vital signs at admission, and basic lab results collected at time of presentation. Each test stratifies patients into groups of increasing risk. An additional cohort of 330 patients was used for blinded, independent validation. Shapley value analysis evaluated which attributes contributed most to the models' predictions of risk.
Measurements and Main Results: Test performance was assessed using precision (positive predictive value) and recall (sensitivity) of the final risk groups. All test cut-offs were fixed prior to blinded validation. In both development and validation, the tests achieved precision in the lowest risk groups near or above 0.9. The proportion of patients with severe outcomes significantly increased across increasing risk groups. While the importance of attributes varied by test and patient, CRP, LDH, and D-dimer were often found to be important in the assignment of risk label.
Conclusions: Risk of severe outcomes for patients hospitalized with COVID-19 infection can be assessed using machine learning-based models based on attributes routinely collected at hospital admission. | intensive care and critical care medicine |
10.1101/2021.01.29.21250626 | "This is really like waiting for war and this is not good" - Intertwining between pandemic experiences, and the development of professional action of healthcare professionals in critical care at the beginning of the COVID-19 pandemic in Germany: a qualitative study | Healthcare professionals (HCPs) have been facing remarkable challenges in their daily work since the outbreak of the COVID-19 pandemic. Being well prepared is crucial for dealing with such a pandemic. The aim of our study was to explore HCPs' subjective perspectives on their professional action and coping strategies in critical care during the preparation and coping phase after the outbreak of the COVID-19 pandemic in Germany.
Together with HCPs working in critical care, we collaboratively designed an interview study based on an ethnomethodological approach. We performed semi-structured qualitative interviews via telephone or video call and analysed the data based on grounded theory.
Our research interest was focused on HCPs (qualified nurses, physicians, medical students) working in critical care during the first wave of the COVID-19 pandemic in Germany between April and July 2020.
Our sample consisted of 39 HCPs (19 nurses, 17 physicians, three medical students; 18/39 female) from ten German federal states. All participants were involved in the acute care of COVID-19 infected patients in hospitals and had a mean professional experience of 14.8±10.1 years; 15 participants held a management position (e.g. senior physician or head nurse). We recruited participants via personal contacts and snowballing.
Initial and focused coding resulted in seven categories: Creating structural measures, handling operational changes, dealing with personal protective equipment, building up knowledge and skills, managing information, perceiving peer support and experiencing emotions.
Professional action and subjectively perceived preparedness (professional and emotional) interacted with each other. Their interrelation was not static, but rather dynamic and ambiguous according to the situation. The findings of our study can be beneficial in developing guidelines, policy interventions or personnel and work practice strategies. | intensive care and critical care medicine |
10.1101/2021.01.31.21250875 | Performance of Heart Failure Clinical Prediction Models: A Systematic External Validation Study | Background: Most heart failure (HF) clinical prediction models (CPMs) have not been externally validated.
Methods: We performed a systematic review to identify CPMs predicting outcomes in HF, stratified by acute and chronic HF CPMs. External validations were performed using individual patient data from 8 large HF trials (1 acute, 7 chronic). CPM discrimination (c-statistic, % relative change in c-statistic), calibration (calibration slope, Harrell's E, E90), and net benefit were evaluated for each CPM with and without recalibration.
Results: Of 135 HF CPMs screened, 24 (18%) were compatible with the population, predictors and outcomes of the trials, and 42 external validations were performed (14 acute HF, 28 chronic HF). The median derivation c-statistic of acute HF CPMs was 0.76 (IQR, 0.75, 0.8), the validation c-statistic was 0.67 (0.65, 0.68) and the model-based c-statistic was 0.68 (0.66, 0.76). Hence, most of the apparent decrement in model performance was due to narrower case-mix in the validation cohort compared with the development cohort. The median derivation c-statistic for chronic HF CPMs was 0.76 (0.74, 0.8), validation c-statistic 0.61 (0.6, 0.63) and model-based c-statistic 0.68 (0.62, 0.71), suggesting that the decrement in model performance was only partially due to case-mix heterogeneity. Calibration was generally poor: median E (standardized by outcome rate) was 0.5 (0.4, 2.2) for acute HF CPMs and 0.5 (0.3, 0.7) for chronic HF CPMs. Updating the intercept alone led to a significant improvement in calibration in acute HF CPMs, but not in chronic HF CPMs. Net benefit analysis showed potential for harm in using CPMs when the decision threshold was not near the overall outcome rate, but this improved with model recalibration.
Conclusions: Only a small minority of published CPMs contained variables and outcomes that were compatible with the clinical trial datasets. For acute HF CPMs, discrimination is largely preserved after adjusting for case-mix; however, the risk of net harm is substantial without model recalibration for both acute and chronic HF CPMs. | cardiovascular medicine |
10.1101/2021.01.29.21250732 | Paclitaxel-Coated or Uncoated Devices: Significant Differences in Patient Populations and Mortality Led to Study Incomparability | The SWEDEPAD trial reported an unplanned interim analysis showing no difference in the mortality rate between the paclitaxel-coated and uncoated groups (Nordanstig et al., 2020), which contradicts the long-term risk of paclitaxel-coated devices claimed by a meta-analysis (Katsanos et al., 2018). However, there were significant differences in mortality rates between the SWEDEPAD trial and the trials included in the meta-analysis, which were caused by significant differences in the patient populations. As a result, the SWEDEPAD trial and meta-analysis results are not directly comparable. An updated meta-analysis including the SWEDEPAD trial and all studies in the meta-analysis (Katsanos et al., 2018) shows marginal differences in mortality rates between the paclitaxel-coated and control groups at two years with Bayesian relative risk (RR) 1.39 (95% credible interval (CrI) [1.01, 2.39]) and frequentist RR 1.16 (95% confidence interval (CI) [0.99, 1.36]) and differences in mortality rates during the entire follow-up period with Bayesian RR 1.29 (95% CrI [1.01, 1.72]) and frequentist RR 1.13 (95% CI [0.99, 1.28]) under random-effects models. Given the relatively short follow-up thus far in the SWEDEPAD trial (with a mean follow-up of 2.49 years) and the paclitaxel-coated risk being long-term (e.g., 4 or 5 years), the interim results on the risk of paclitaxel-coated devices reported by the SWEDEPAD trial warrant further investigation. | cardiovascular medicine |
10.1101/2021.01.30.21250777 | Accuracy of four lateral flow immunoassays for anti SARS-CoV-2 antibodies: a head-to-head comparative study | Background: SARS-CoV-2 antibody tests are used for population surveillance and might have a future role in individual risk assessment. Lateral flow immunoassays (LFIAs) can deliver results rapidly and at scale, but have widely varying accuracy.
Methods: In a laboratory setting, we performed head-to-head comparisons of four LFIAs: the Rapid Test Consortium's AbC-19 Rapid Test, OrientGene COVID IgG/IgM Rapid Test Cassette, SureScreen COVID-19 Rapid Test Cassette, and Biomerica COVID-19 IgG/IgM Rapid Test. We analysed blood samples from 2,847 key workers and 1,995 pre-pandemic blood donors with all four devices.
Findings: We observed a clear trade-off between sensitivity and specificity: the IgG band of the SureScreen device and the AbC-19 device had higher specificities but OrientGene and Biomerica higher sensitivities. Based on analysis of pre-pandemic samples, SureScreen IgG band had the highest specificity (98.9%, 95% confidence interval 98.3 to 99.3%), which translated to the highest positive predictive value across any pre-test probability: for example, 95.1% (95%CI 92.6, 96.8%) at 20% pre-test probability. All four devices showed higher sensitivity at higher antibody concentrations ("spectrum effects"), but the extent of this varied by device.
Interpretation: The estimates of sensitivity and specificity can be used to adjust for test error rates when using these devices to estimate the prevalence of antibody. If tests were used to determine whether an individual has SARS-CoV-2 antibodies, in an example scenario in which 20% of individuals have antibodies we estimate around 5% of positive results on the most specific device would be false positives.
Funding: Public Health England.
Research in context. Evidence before this study: We searched for evidence on the accuracy of the four devices compared in this study: OrientGene COVID IgG/IgM Rapid Test Cassette, SureScreen COVID-19 Rapid Test Cassette, Biomerica COVID-19 IgG/IgM Rapid Test and the UK Rapid Test Consortium's AbC-19 Rapid Test. We searched Ovid MEDLINE (In-Process & Other Non-Indexed Citations and Daily), PubMed, MedRxiv/BioRxiv and Google Scholar from January 2020 to 16th January 2021. Search terms included device names AND ((SARS-CoV-2) OR (covid)). Of 303 records assessed, data were extracted from 24 studies: 18 reporting on the accuracy of the OrientGene device, 7 SureScreen, 2 AbC-19 and 1 Biomerica. Only three studies compared the accuracy of two or more of the four devices. With the exception of our previous report on the accuracy of the AbC-19 device, which the current manuscript builds upon, sample size ranged from 7 to 684. For details, see Supplementary Materials.
The largest study compared OrientGene, SureScreen and Biomerica. SureScreen was estimated to have the highest specificity (99.8%, 95% CI 98.9 to 100%) and OrientGene the highest sensitivity (92.6%), but with uncertainty about the latter result due to small sample sizes. The other two comparative studies were small (n = 65, n = 67) and therefore provide very uncertain results.
We previously observed spectrum effects for the AbC-19 device, such that sensitivity is upwardly biased if estimated only from PCR-confirmed cases. The vast majority of previous studies estimated sensitivity in this way.
Added value of this studyWe performed a large-scale (n = 4,842), head-to-head laboratory-based evaluation and comparison of four lateral flow devices, which were selected for evaluation by the UK Department of Health and Social Care's New Tests Advisory Group, on the basis of a survey of test and performance data available. We evaluated the performance of diagnosis based on both IgG and IgM bands, and the IgG band alone. We found a clear trade-off between sensitivity and specificity across devices, with the SureScreen and AbC-19 devices being more specific and OrientGene and Biomerica more sensitive. Based on analysis of 1,995 pre-pandemic blood samples, we are 99% confident that SureScreen (IgG band reading) has the highest specificity of the four devices (98.9%, 95% CI 98.3, 99.3%).
We found evidence that all four devices have reduced sensitivity at lower antibody indices, i.e. spectrum effects. However, the extent of this varies by device and appears to be less for other devices than for AbC-19.
Our estimates of sensitivity and specificity are likely to be higher than would be observed in real use of these devices, as they were based on majority readings of three trained laboratory personnel.
Implications of all the available evidenceWhen used in epidemiological studies of antibody prevalence, the estimates of sensitivity and specificity provided in this study can be used to adjust for test errors. Increased precision in error rates will translate to increased precision in seroprevalence estimates. If lateral flow devices were used for individual risk assessment, devices with maximum specificity would be preferable. However, if, for example, 20% of the tested population had antibodies, we estimate that around 1 in 20 positive results on the most specific device would be incorrect. | epidemiology
10.1101/2021.01.29.21250788 | COVID-19 Hospitalizations in Five California Hospitals | STRUCTURED ABSTRACTO_ST_ABSImportanceC_ST_ABSCharacterization of a diverse cohort hospitalized with COVID-19 in a health care system in California is needed to further understand the impact of SARS-CoV-2 and improve patient outcomes.
ObjectivesTo investigate the characteristics of patients hospitalized with COVID-19 and assess factors associated with poor outcomes.
DesignPatient-level retrospective cohort study
SettingUniversity of California five academic hospitals.
ParticipantsPatients [≥]18 years old with a confirmed test result for the SARS-CoV-2 virus hospitalized at five UC hospitals.
ExposureConfirmed severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection by positive results on polymerase chain reaction testing of a nasopharyngeal sample among patients requiring hospital admission.
Main Outcomes and MeasuresAdmission to the intensive care unit, death during hospitalization, and the composite of both outcomes.
ResultsOutcomes were assessed for 4,730 patients who were discharged or died during a hospitalization. A total of 846 patients were treated at UC Davis, 1,564 UC Irvine, 1,283 UC Los Angeles, 471 UC San Diego, and 566 UC San Francisco. More than 20% of patients were [≥]75 years of age (75-84: 12.3%, [≥]85: 10.5%); 56.5% were male, 45.7% Hispanic/Latino, and 10.3% Asian. The most common comorbidities were hypertension (35.2%), cardiac disease (33.3%), and diabetes (24.0%). The ICU admission rate was 25.2% (1194/4730), with 7.0% (329/4730) in-hospital mortality. Among patients admitted to the ICU, 18.8% (225/1194) died; 2.9% (104/3536) died without ICU admission. The rate of the composite outcome (ICU admission and/or death) was 27.4% (1,298/4,730). While controlling for comorbidities, patients of age 75-84 (OR 1.47, 95% CI: 1.11-1.93) and [≥]85 (OR 1.39, 95% CI: 1.04-1.87) were more likely to experience a composite outcome than 18-34 year-olds. Males (OR 1.39, 95% CI: 1.21-1.59) were also more likely to experience a composite outcome, as were patients identifying as Hispanic/Latino (OR 1.35, 95% CI: 1.14-1.61) or Asian (OR 1.43, 95% CI: 1.23-1.82) compared with White patients. Patients with 5 or more comorbidities were exceedingly likely to experience a composite outcome (OR 2.74, 95% CI: 2.32-3.25).
ConclusionsMales, older patients, those with pre-existing comorbidities, and those identifying as Hispanic/Latino or Asian experienced an increased risk of ICU admission and/or death.
KEY POINTSO_ST_ABSQuestionC_ST_ABSWhat are the characteristics and outcomes of patients with SARS-CoV-2 infection hospitalized at five UC Health medical centers in California?
FindingsIn this retrospective case series of 4,730 patients requiring hospitalization for COVID-19 in UC Health's five medical centers, male patients (OR 1.41, 95% CI: 1.23-1.61) and patients identifying as Hispanic/Latino (OR 1.35, 95% CI: 1.14-1.61) or Asian (OR 1.43, 95% CI: 1.12-1.82) were more likely to be admitted to the ICU and/or die after adjustment for age and comorbidity. ICU admission and/or death was also more likely among older individuals and those with greater numbers of pre-existing conditions.
MeaningThis study describes the experience of a large, diverse cohort of patients with COVID-19 hospitalized in five hospitals in California between December 14, 2019 and January 6, 2021. | epidemiology |
10.1101/2021.01.30.21250828 | Strategies for vaccination against SARS-CoV-2 to efficiently bring R<1 | With limited availability of vaccines, efficient use of the limited vaccine supply to achieve herd immunity will be an important tool to combat the wide-spread prevalence of COVID-19. Here, we compare a selection of strategies for vaccine distribution, including a novel targeted vaccination approach (EHR) that provides a noticeable increase in vaccine impact on disease spread compared to age-prioritized and random selection vaccination schemes. Using high-fidelity individual-based computer simulations with Oslo, Norway as an example, we find that in a setting where the base pre-vaccination reproduction number is R = 2.1 without population immunity, the EHR method reaches herd immunity at 48% of the population vaccinated with 90% efficacy, whereas the common age-prioritized approach needs 89%, and a population-wide random selection approach requires 61%. We find that age-based strategies have a substantially weaker impact on epidemic spread and struggle to achieve herd immunity under the majority of conditions. Furthermore, the vaccination of minors is essential to achieving herd immunity, even for ideal vaccines providing 100% protection. | epidemiology
10.1101/2021.01.29.21250793 | COVID-19 spread and Weather in U.S. states: a cross-correlative study on summer-autumn 2020. | An effect of weather on SARS-CoV-2 transmission is regularly proposed as a putative cause of unexplained fluctuations of COVID-19 new cases, but clear data supporting this hypothesis remain to be presented. Here I measured longitudinal time-series correlations between outdoor temperature, humidity and the COVID-19 reproduction number (Rt) in the 50 U.S. states (+DC). In order to mitigate the confounding influence of varying social restriction measures, the analysis spans a 5-month period during summer and autumn 2020 when restrictions were comparatively lower and more stable. I used a cross-covariance approach to account for a variable delay between infection and case report. For a delay near 11 days, most U.S. states exhibited a negative correlation between outdoor temperature and Rt, as well as between absolute humidity and Rt (mean r = -0.35). In 21 states, the correlation was strong (r < -0.5). Individual state data are presented, and associations between cold and/or dry weather episodes and short-term new case surges are proposed. After identifying potential confounding factors, I discuss 3 possible causal mechanisms that could explain a correlation between outdoor weather and indoor disease transmission: behavioral adaptations to cold weather, respiratory tract temperature, and the importing of outdoor absolute humidity to indoor spaces. | epidemiology
10.1101/2021.01.30.21250830 | Estimated SARS-CoV-2 Seroprevalence in Healthy Children and Those with Chronic Illnesses in The Washington Metropolitan Area as of October 2020 | The estimated SARS-CoV-2 seroprevalence in children was found to be 9.46% for the Washington Metropolitan area. Hispanic/Latinx individuals were found to have higher odds of seropositivity. While chronic medical conditions were not associated with having antibodies, previous fever and body aches were predictive symptoms. | epidemiology |
10.1101/2021.01.30.21250822 | Understanding soaring coronavirus cases and the effect of contagion policies in the UK | The number of new daily SARS-CoV-2 infections is frantically rising in almost every country of the EU. The phenomenological explanation offered is a new mutation of the virus, first identified in the UK. We use publicly available data in combination with a controlled SIR model, which captures the effects of preventive measures on the active cases, to show that the current wave of infections is consistent with a single transmission rate. This suggests that the new SARS-CoV-2 variant is as transmissible as previous strains. Our findings indicate that the relaxation of preventive measures is closely related with the ongoing surge in cases. We simulate the effects of new restrictions and vaccination campaigns in 2021, demonstrating that lockdown policies are not fully effective to flatten the curve. For effective mitigation, it is critical that the public keeps on high alert until vaccination reaches a critical threshold. | epidemiology |
10.1101/2021.01.29.21250729 | Impact of the Clinical Trials Act on noncommercial clinical research in Japan: An interrupted time-series analysis | BackgroundThe number of new noncommercial clinical studies conducted in Japan declined within the first year of the implementation of the Clinical Trials Act (CTA) on April 1, 2018. This study aimed to examine the impact of the CTA's enforcement on the number of new noncommercial clinical studies registered in the Japanese Clinical Trial Registry.
MethodsAn interrupted time-series design was used in the analysis, which was conducted for the period of April 2015 to March 2019. We collected data for trials registered in the Clinical Trial Registry, managed by the University Hospital Medical Information Network.
ResultsIn total, 35,811 studies were registered in the registry; of these, 16,455 fulfilled the eligibility criteria. The difference in the trend of monthly number of new trials after CTA enforcement decreased significantly by 15.0 trials (95% CI, -18.7 to -11.3), and the level decreased by 40.8 (95% CI, -68.2 to -13.3) from the pre-enforcement to the post-enforcement period. Multigroup analyses indicated that the act exerted a significant effect on the trend of new clinical trials, particularly those with smaller sample sizes, interventional study designs, and nonprofit funding sponsors.
ConclusionsThe number of Japanese noncommercial clinical studies declined significantly following implementation of the CTA. It is necessary to establish a system to promote clinical studies in Japan while ensuring transparency and safety. | epidemiology |
10.1101/2021.01.30.21250563 | Integrating primary care and social services for older adults with multimorbidity: A qualitative study | BackgroundGrowing demand from an ageing population, chronic preventable disease and multimorbidity has resulted in complex health and social care needs requiring more integrated services. Integrating primary care with social services could more efficiently utilise resources, and improve experiences for patients, their families and carers. There is limited evidence on progress including key barriers and drivers of integration to inform large-scale national change.
AimTo elicit stakeholder views on drivers and barriers of integrated primary care and social services, and to highlight opportunities for successful implementation.
Design and settingA qualitative interview study.
MethodSemi-structured interviews with maximum variation sampling to capture stakeholder views across services and professions.
ResultsThirty-seven interviews were conducted across England including GPs, nurses, social care staff, commissioners, local government, voluntary and private sectors, patients and carers. Drivers of integration included groups of like-minded individuals supported by good leadership, expanded interface roles to bridge gaps between systems and co-location of services. Barriers included structural and interdisciplinary tension between professions, organisational self-interest and challenges in record-sharing.
ConclusionsDrivers and barriers to integration identified in other contexts are also present in primary care and social services. Benefits of integration are unlikely to be realised if these are not addressed in the design and execution of new initiatives. Efforts should go beyond local and professional level change to include wider systems and policy-level initiatives. This will support a more systems-wide approach to integrated care reform, which is necessary to meet the complex and growing needs of an ageing multimorbid population. | primary care research |
10.1101/2021.01.29.21250544 | Developing Functional Network Connectivity of the Dorsal Anterior Cingulate Cortex Mediates Externalizing Psychopathology in Adolescents with Child Neglect | Childhood adversity in the form of child abuse and neglect is associated with elevated risk for psychopathologies. We investigated whether development of functional brain networks important for executive function (EF) could serve as potential mediators of this association. We analyzed data of 475 adolescents, a subsample of the multisite longitudinal NCANDA (National Consortium on Alcohol and Neurodevelopment in Adolescence) cohort with completed measures of childhood trauma, resting-state functional brain connectivity data, and internalizing/externalizing psychopathological syndrome data at baseline and follow-up years 1-4. Using parallel process latent growth models, we found that childhood adversity was associated with increased risk for externalizing/internalizing behaviors. We specifically investigated whether functional connectivity of the dorsal anterior cingulate cortex (dACC) to brain regions within the cingulo-opercular (CO) network, a well-known EF network that underlies control of attention and self-regulation, mediates the association between adversity and psychopathological behaviors. We found that childhood adversity, specifically neglect, was negatively associated with functional connectivity of the dACC within the CO network, and that this connectivity mediated the association between child neglect and externalizing behaviors. Our study advances a mechanistic understanding of how childhood adversity may impact the development of psychopathology, highlighting the relevance of dACC functional networks particularly for externalizing psychopathology. | psychiatry and clinical psychology
10.1101/2021.01.30.21250555 | Impact of COVID-19 restrictions on the postpartum experience of women living in Eastern Canada: A mixed method cohort study | ObjectivesTo (1) compare changes in self-efficacy, social support, postpartum anxiety and postpartum depression in Canadian women collected before (Cohort 1) and during the COVID-19 pandemic (Cohort 2); (2) explore the difficulties women felt related to having a newborn during the pandemic; and (3) explore ways that women coped.
MethodsPrior to the pandemic (October 1, 2019-January 1, 2020), an online survey was conducted with women who had given birth within the past six months in one of the three Eastern Canadian Maritime provinces (Cohort 1). A second, similar survey was conducted between August 1, 2020 and October 31, 2020 (Cohort 2) during a period of provincial pandemic response to COVID-19.
ResultsIn Cohort 1, 561 women completed the survey, compared with 331 women in Cohort 2. Cohorts were similar in terms of age of women, parity, and age of newborn. There were no significant differences for self-efficacy, social support, postpartum anxiety, and depression between the cohorts. Difficulties that women reported as a result of COVID-19 restrictions included lack of support from family and friends, fear of COVID-19 exposure, feeling isolated and uncertain, negative impact on perinatal care experience, and hospital restrictions. Having support from partners and families, in-person/virtual support, as well as self-care and the low prevalence of COVID-19 during the summer of 2020 helped women cope.
ConclusionWhile there was no significant difference in pre-pandemic and during pandemic psychosocial outcomes, there were still challenges and negative impacts that women identified. Consideration of vulnerable populations is important when making public health recommendations.
What is already known on this subject?Previous work has shown the importance of social support in the postpartum transition in developing parenting self-efficacy and decreasing postpartum anxiety and depression. However, during the COVID-19 pandemic, women's mental health, particularly during the perinatal period, has seen increases in rates of postpartum anxiety and depression.
What this study adds?This study compares self-efficacy, social support, postpartum anxiety and depression between two cohorts of postpartum women living in Eastern Canada, one surveyed before the COVID-19 pandemic and one during it. While there was no significant difference in pre-pandemic and during-pandemic psychosocial outcomes, women still identified challenges and negative impacts. | public and global health
10.1101/2021.01.29.21250803 | Radiomics Analysis of Clinical Myocardial Perfusion Stress SPECT Images to Identify Coronary Artery Calcification | PurposeMyocardial perfusion stress SPECT (MPSS) is an established diagnostic test for patients with suspected coronary artery disease (CAD). Meanwhile, coronary artery calcification (CAC) scoring obtained from diagnostic CT is a highly specific test, offering incremental diagnostic information in identifying patients with significant CAD yet normal MPSS scans. However, after decades of wide utilization of MPSS, CAC is not commonly reimbursed (e.g. by the CMS), nor widely deployed in community settings. We aimed to perform radiomics analysis of normal MPSS scans to investigate the potential to predict the CAC score.
MethodsWe collected data from 428 patients with normal (non-ischemic) MPSS (99mTc-Sestamibi; consensus reading). A nuclear medicine physician verified iteratively reconstructed images (attenuation-corrected) to be free from fixed perfusion defects and artifactual attenuation. 3D images were automatically segmented into 4 regions of interest (ROIs), including myocardium and 3 vascular segments (LAD-LCX-RCA). We used our software package, standardized environment for radiomics analysis (SERA), to extract 487 radiomic features in compliance with the image biomarker standardization initiative (IBSI). Isotropic cubic voxels were discretized using fixed bin-number discretization (8 schemes). We first performed blind-to-outcome feature selection focusing on a priori usefulness, dynamic range, and redundancy of features. Subsequently, we performed univariate and multivariate machine learning analyses to predict CAC scores from i) selected radiomic features, ii) 10 clinical features, iii) combined radiomics + clinical features. Univariate analysis invoked Spearman correlation with Benjamini-Hochberg false-discovery correction. The multivariate analysis incorporated stepwise linear regression, where we randomly selected a 15% test set and divided the other 85% of data into 70% training and 30% validation sets. Training started from a constant (intercept) model, iteratively adding/removing features (stepwise regression), invoking Akaike information criterion (AIC) to discourage overfitting. Validation was run similarly, except that the training output model was used as the initial model. We randomized training/validation sets 20 times, selecting the best model using log-likelihood for evaluation in the test set. Assessment in the test set was performed thoroughly by running the entire operation 50 times, subsequently employing Fisher's method to verify the significance of independent tests.
ResultsUnsupervised feature selection significantly reduced 8x487 features to 56. In univariate analysis, no feature survived FDR to directly correlate with CAC scores. Applying Fisher's method to the multivariate regression results demonstrated combining radiomics with the clinical features to enhance the significance of the prediction model across all cardiac segments. The median absolute Pearson's coefficient values / p-values for the three feature-pools (radiomics, clinical, combined) were: (0.15, 0.38, 0.41)/(0.1, 0.001, 0.0006) for myocardium, (0.24, 0.35, 0.41)/(0.05, 0.004, 0.0007) for LAD, (0.07, 0.24, 0.28)/(0.4, 0.06, 0.02), for LCX, and (0.06, 0.16, 0.24)/(0.4, 0.2, 0.05) for RCA, demonstrating consistently enhanced correlation and significance for combined radiomics and clinical features across all cardiac segments.
ConclusionsOur standardized and statistically robust multivariate analysis demonstrated significant prediction of the CAC score for all cardiac segments when combining MPSS radiomic features with clinical features, suggesting radiomics analysis can add diagnostic or prognostic value to standard MPSS for wide clinical usage. | radiology and imaging |
10.1101/2021.01.29.21250485 | IgG1 pan-neurofascin antibodies identify a severe yet treatable neuropathy with a high mortality | Guillain-Barre syndrome and chronic inflammatory demyelinating polyneuropathy are umbrella terms for a number of pathologically distinct diseases involving disabling, immune-mediated injury to peripheral nerves. Current treatments involve non-specific immunomodulation. Prospective identification of patients with specific disease mechanisms should increasingly inform the use of more targeted disease modifying therapies and may lead to improved outcomes. In this cohort study, we tested serum for antibodies directed against nodal/paranodal protein antigens. The clinical characteristics of antibody positive and seronegative patients were then compared. Eight patients with IgG1-subclass antibodies directed against both isoforms of the nodal/paranodal cell-adhesion molecule neurofascin were identified. All developed rapidly progressive tetraplegia. Cranial nerve deficits (100% versus 26%), autonomic dysfunction (75% versus 13%) and respiratory involvement (88% versus 14%) were more common than in seronegative patients. The response to intravenous immunoglobulin, steroids and/or plasmapheresis was poor. Four patients received the B-cell-depleting therapy rituximab. After several weeks, these patients began to show progressive functional improvements, became seronegative, and were ultimately discharged home. Four patients who did not receive rituximab did not improve and died following the development of infectious complications and/or withdrawal of intensive care support. IgG1 pan-neurofascin antibodies define a very severe autoimmune neuropathy. We urgently recommend trials of targeted immunotherapy for this serologically-classified patient group. | neurology
10.1101/2021.01.31.21250861 | Dysgraphic features in motor neuron disease: a review | BackgroundMotor neuron disease (MND) patients can show oral language deficits mimicking those of frontotemporal degenerations (FTD). Although dysgraphic features have been also reported within the MND-FTD continuum, their characteristics and clinical relevance are still largely unexplored.
AimsProfiling writing disorders in MND patients can help further define their cognitive semiology and thus convey relevant clinical entailments. Therefore, this study aimed at reviewing evidence of writing impairment in MND patients. This review was implemented and reported by consulting the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines. Qualitative/quantitative measures of writing abilities in MND patients were the primary outcome. Both group studies and case reports/series were taken into consideration. Twenty-four contributions were included out of an initial N=83. Potential biases in generalizing results were qualitatively controlled for by extracting background, disease-related, neuropsychological and neuroanatomofunctional secondary outcomes.
Main ContributionFifteen studies assessed writing abilities in Japanese patients, whereas the remaining eight did so in Western patients. Central dysgraphic features were reported in both neuropsychologically-impaired and -unimpaired MND patients. Phonetic/phonological paragraphias and morpho-syntactic errors were frequently reported. Although FTD was frequently co-occurring, neither cognitive nor language impairment fully accounted for writing impairment in some patients. By contrast, evidence of peripheral dysgraphia was scarce. Patients displaying writing deficits often presented with bulbar signs and perisylvian cortices involvement (including Exner's area and the left angular gyrus). Writing deficits proved to be associated with abnormalities in executive functioning and its neural substrates. Writing-to-dictation tasks as well as writing samples assessment proved to be useful to detect writing errors.
ConclusionsDysgraphic features in MND patients might be due to dysfunctions of the graphemic buffer - and possibly the phonological route. The lexico-semantic route appeared to be less involved. However, a mixed peripheral/central involvement cannot be ruled out. In this population, executive/attentive deficits are likely to contribute to writing errors as well. Writing deficits might thus be specific to MND patients' cognitive/language impairment profile. The evaluation of writing abilities via writing-to-dictation/narrative writing tasks may be useful when assessing cognition/language in both neuropsychologically impaired and unimpaired MND patients - especially when severe dysarthria/anarthria is present and prevents clinicians from assessing oral language. | neurology
10.1101/2021.01.31.21250860 | Language impairment in motor neuron disease phenotypes different from classical amyotrophic lateral sclerosis: a review | BackgroundUp to 35-40% of patients with amyotrophic lateral sclerosis (ALS) present with language deficits falling within the spectrum of frontotemporal degeneration (FTD). It is currently debated whether frontotemporal involvement occurs or not in motor neuron disease (MND) phenotypes that differ from classical ALS (i.e., both non-ALS MNDs and non-classical ALS endo-phenotypes) - this stance being supported by the notion of a common pathology underlying MNDs. Investigating whether language dysfunctions also occur in patients with different-from-classical-ALS MNDs can: a) help determine whether the MND-FTD continuum could be broadened at a neuropsychological level; b) convey relevant entailments to cognitive diagnostics in these populations.
AimsThe present study thus aimed at reviewing evidence regarding language impairment in different-from-classical-ALS MND patients. Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines were consulted to implement and report the present review. Studies were included if a) language was quantitatively assessed b) in patients diagnosed with different-from-classical-ALS MND phenotypes. Studies assessing demented patients only were excluded. From an original N=1117 contributions, N=20 group studies were finally included. Secondary outcomes were taken into account for qualitatively assessing potential biases in generalizing results.
Main contributionStudies were divided into those assessing predominant-upper vs. predominant-lower MND patients (UMND/LMND). Language dysfunctions appeared to be more prevalent and severe in UMND patients. Language screeners were able to detect language deficits in both groups. Lexical-semantic deficits appeared to be highly prevalent in both groups and a selective difficulty in action- vs. object-naming was systematically detected. Morpho-syntactic deficits were seldom reported in both groups. Phonological deficits and central dysgraphic features were found in UMND patients only.
ConclusionPatients with different-from-classical-ALS MND phenotypes display language deficits similar to those of classical ALS patients (as far as both prevalence and type are concerned) and thus could be validly included in the MND-FTD continuum at a neuropsychological level. A greater cortical involvement might account for language deficits being more severe in UMND patients. Consistently with guidelines for cognitive assessment in ALS patients, action-naming tasks might represent a valid and sensitive tool for assessing language in UMND/LMND patients too. | neurology |
10.1101/2021.01.31.21250702 | CSF Aβ38 levels are associated with Alzheimer-related decline: implications for γ-secretase modulators | ObjectiveShorter A{beta} species might modulate disease progression in Alzheimer's disease (AD). Here we studied whether A{beta}38 levels in cerebrospinal fluid (CSF) are associated with risk of developing AD dementia and cognitive decline.
MethodsCSF A{beta}38 levels were measured in 656 individuals across two clinical cohorts - the Swedish BioFINDER study and the Alzheimer's Disease Neuroimaging Initiative (ADNI). Cox regression models were used to evaluate the association between baseline A{beta}38 levels and risk of AD dementia in AD-biomarker positive individuals (AD+; determined by CSF P-tau/A{beta}42 ratio) with subjective cognitive decline (SCD) or mild cognitive impairment (MCI). Linear mixed effects models were used to evaluate the association between baseline A{beta}38 levels and cognitive decline as measured by MMSE in AD+ participants with SCD, MCI or AD dementia.
ResultsIn the BioFINDER cohort, high A{beta}38 levels were associated with slower decline in MMSE ({beta} = 0.30 points / sd., P = 0.001) and with lower risk of conversion to AD dementia (HR = 0.83 per sd., P = 0.03). In the ADNI cohort, higher A{beta}38 levels were associated with less decline in MMSE ({beta} = 0.27, P = 0.01), but not risk of conversion to AD dementia (P = 0.66). A{beta}38 levels in both cohorts remained significantly associated with both outcomes when adjusted for CSF P-tau levels and remained associated with cognition when adjusted for CSF A{beta}42 levels.
ConclusionsHigher CSF A{beta}38 levels are associated with lower risk of AD-related changes in two independent clinical cohorts. These findings may have implications for {gamma}-secretase modulators as potential disease-altering therapy. | neurology |
10.1101/2021.01.31.21250879 | A gene expression platform to predict benefit from adjuvant external beam radiation in resected non-small cell lung cancer | BackgroundWe hypothesized that the radiosensitivity index (RSI) would classify non-small cell lung cancer (NSCLC) patients as radioresistant (RR) or radiosensitive (RS).
MethodsWe identified patients with resected pathologic stage III NSCLC. For the radiation therapy (RT) group, at least 45 Gy of external beam radiation was required. mRNA was extracted from the primary tumor. The predefined cut-point was the median RSI, with a primary endpoint of local control. Similar criteria were then applied to two extramural datasets (E1; E2), with progression-free survival as the primary endpoint.
ResultsMedian follow-up from diagnosis was 23.5 months (range: 4.8-169.6 months). RSI was associated with time to local failure in the RT group, with two-year local control rates of 80% and 56% in the RS and RR groups, respectively (p=0.02). RSI was the only variable found to be significant on Cox local control analysis (HR 2.9; 95% CI: 1.2-8.2; p=0.02). RSI was not significant in predicting local control in patients not receiving RT (p=0.48). A Cox regression model between receipt of radiotherapy and RSI combining E1 and E2 showed that the interaction term was significant for PFS (3.7; 95% CI 1.4-10; p=0.009). A summary measure combining E1 and E2 showed statistical significance for PFS between RR and RS patients treated with radiotherapy (HR 2.7; 95% CI 1.3-5.6; p=0.007) but not in patients not treated with radiotherapy (HR 0.94; 95% CI 0.5-1.78; p=0.86).
ConclusionsRSI appears to be predictive for benefit from adjuvant radiation. Prospective validation is required. | oncology |
10.1101/2021.01.30.21250679 | Cohort-based association study of germline genetic variants with acute and chronic health complications of childhood cancer and its treatment: Genetic risks for childhood cancer complications Switzerland (GECCOS) study protocol | BackgroundChildhood cancer and its treatment may lead to many acute and chronic health complications. Related impairment in quality of life, excess in deaths, and accumulated health care costs are relevant. There is a wide inter-individual variability in the type and severity of health complications. Genetic variations are suggested to contribute to individual susceptibility. So far, only few genetic variants have been used to risk-stratify treatment and follow-up care. This study platform aims to identify germline genetic variants associated with acute and late complications of childhood cancer.
MethodsThe Genetic Risks for Childhood Cancer Complications Switzerland (GECCOS) study is a nationwide cohort study. It includes patients and survivors who were diagnosed with childhood cancers or Langerhans cell histiocytosis before age 21 years, were registered in the Swiss Childhood Cancer Registry (SCCR) since 1976 and have consented to the Pediatric Biobank for Research in Hematology and Oncology (BaHOP), Geneva, host of the Germline DNA Biobank Switzerland for Childhood Cancer and Blood Disorders (BISKIDS). BISKIDS is a national biobank for the collection of germline DNA in childhood cancer patients and survivors.
GECCOS uses demographic and clinical data from the SCCR and the associated Swiss Childhood Cancer Survivor Study (SCCSS), which contains health-related data of survivors. Phenotypic data consist of objective measurements, health conditions diagnosed by physicians, second primary neoplasms, self-reported and health-related information from participants. Germline genetic samples and sequencing data have been collected in BISKIDS. We will perform gene panel sequencing, whole-exome sequencing, or whole-genome sequencing depending on the research questions. We will perform association analyses to identify genetic variants associated with specified health conditions. We will use clustering and machine-learning techniques and assess multiple health conditions in different models.
DiscussionGECCOS will serve as an overarching platform to enable genotype-phenotype association analyses on complications associated with childhood cancer and its treatments. Knowledge of germline genetic variants associated with childhood cancer-associated health conditions will help to further individualize cancer treatment and follow-up care, potentially resulting in improved efficacy and reduced side effects, for personalized cancer care.
Trial registrationClinicaltrials.gov: NCT04702321 | genetic and genomic medicine |
10.1101/2021.01.31.21250859 | Prognosis and hematological findings in patients with COVID-19 in an Amazonian population of Peru | ObjectiveThis study examined the laboratory results of COVID-19 patients from a hospital in the Peruvian Amazon and their clinical prognosis.
MethodsAn analytical cross-sectional study was carried out to identify the laboratory findings of patients with COVID-19 and mortality in a hospital in Ucayali, Peru, during the period from March 13 to May 9, 2020, selecting a total of 127 patients with COVID-19. Mean and standard deviation were described for age, leukocytes, neutrophils, platelets, and RDW-SD; median and interquartile range for lymphocytes, RN/L, fibrinogen, CRP, D-dimer, DHL, hematocrit, monocytes, and eosinophils.
ResultsNo differences were observed in this population regarding death and sex (OR: 1.31; 95% CI 0.92 to 1.87); however, it was observed that, for each one-year increase in age, the probability of death increased by 4% (PR: 1.04, 95% CI 1.03 to 1.05). The IRR (Incidence Risk Ratio) analysis for the numerical variables showed results strongly associated with hematological values such as leukocytes (scaled by 2500 units) (IRR: 1.08, 95% CI 1.03 to 1.13) and neutrophils (scaled by 2500 units) (IRR: 1.08; 95% CI 1.03 to 1.13); on the contrary, with an increase of 1000 units in lymphocytes, the probability of dying decreased by 48% (IRR: 0.52; 95% CI 0.38 to 0.71).
ConclusionParameters such as leukocytes and neutrophils were statistically much higher in patients who died. | hematology |
10.1101/2021.01.31.21250870 | Immuno-fibrotic drivers of impaired lung function in post-COVID-19 syndrome | IntroductionSubjects recovering from COVID-19 frequently experience persistent respiratory ailments; however, little is known about the underlying biological factors that may direct lung recovery and the extent to which these are affected by COVID-19 severity.
MethodsWe performed a prospective cohort study of subjects with persistent symptoms after acute COVID-19, collecting clinical data, pulmonary function tests, and plasma samples used for multiplex profiling of inflammatory, metabolic, angiogenic, and fibrotic factors.
ResultsSixty-one subjects were enrolled across two academic medical centers at a median of 9 weeks (interquartile range 6-10) after COVID-19 illness: n=13 subjects (21%) mild/non-hospitalized, n=30 (49%) hospitalized/non-critical, and n=18 subjects (30%) hospitalized/intensive care ("ICU"). Fifty-three subjects (85%) had lingering symptoms, most commonly dyspnea (69%) and cough (58%). Forced vital capacity (FVC), forced expiratory volume in 1 second (FEV1), and diffusing capacity for carbon monoxide (DLCO) declined as COVID-19 severity increased (P<0.05), but did not correlate with respiratory symptoms. Partial least-squares discriminant analysis of plasma biomarker profiles clustered subjects by past COVID-19 severity. Lipocalin 2 (LCN2), matrix metalloproteinase-7 (MMP-7), and hepatocyte growth factor (HGF) identified by the model were significantly higher in the ICU group (P<0.05) and inversely correlated with FVC and DLCO (P<0.05), and were confirmed in a separate validation cohort (n=53).
ConclusionsSubjective respiratory symptoms are common after acute COVID-19 illness but do not correlate with COVID-19 severity or pulmonary function. Host response profiles reflecting neutrophil activation (LCN2), fibrosis signaling (MMP-7), and alveolar repair (HGF) track with lung impairment and may be novel therapeutic or prognostic targets.
FundingThe study was funded in part by the NHLBI (K08HL130557 to BDK and R01HL142818 to HJC), the DeLuca Foundation Award (AP), a donation from Jack Levin to the Benign Hematology Program at Yale, and Divisional/Departmental funds from Duke University. | infectious diseases |
10.1101/2021.01.31.21250870 | Immuno-fibrotic drivers of impaired lung function in post-acute sequelae of SARS-CoV-2 infection (PASC) | IntroductionSubjects recovering from COVID-19 frequently experience persistent respiratory ailments; however, little is known about the underlying biological factors that may direct lung recovery and the extent to which these are affected by COVID-19 severity.
MethodsWe performed a prospective cohort study of subjects with persistent symptoms after acute COVID-19, collecting clinical data, pulmonary function tests, and plasma samples used for multiplex profiling of inflammatory, metabolic, angiogenic, and fibrotic factors.
ResultsSixty-one subjects were enrolled across two academic medical centers at a median of 9 weeks (interquartile range 6-10) after COVID-19 illness: n=13 subjects (21%) mild/non-hospitalized, n=30 (49%) hospitalized/non-critical, and n=18 subjects (30%) hospitalized/intensive care ("ICU"). Fifty-three subjects (85%) had lingering symptoms, most commonly dyspnea (69%) and cough (58%). Forced vital capacity (FVC), forced expiratory volume in 1 second (FEV1), and diffusing capacity for carbon monoxide (DLCO) declined as COVID-19 severity increased (P<0.05), but did not correlate with respiratory symptoms. Partial least-squares discriminant analysis of plasma biomarker profiles clustered subjects by past COVID-19 severity. Lipocalin 2 (LCN2), matrix metalloproteinase-7 (MMP-7), and hepatocyte growth factor (HGF) identified by the model were significantly higher in the ICU group (P<0.05) and inversely correlated with FVC and DLCO (P<0.05), and were confirmed in a separate validation cohort (n=53).
ConclusionsSubjective respiratory symptoms are common after acute COVID-19 illness but do not correlate with COVID-19 severity or pulmonary function. Host response profiles reflecting neutrophil activation (LCN2), fibrosis signaling (MMP-7), and alveolar repair (HGF) track with lung impairment and may be novel therapeutic or prognostic targets.
FundingThe study was funded in part by the NHLBI (K08HL130557 to BDK and R01HL142818 to HJC), the DeLuca Foundation Award (AP), a donation from Jack Levin to the Benign Hematology Program at Yale, and Divisional/Departmental funds from Duke University. | infectious diseases |
10.1101/2021.01.29.21250653 | Robust spike antibody responses and increased reactogenicity in seropositive individuals after a single dose of SARS-CoV-2 mRNA vaccine | An important question is arising as COVID-19 vaccines are rolled out: should individuals who have already had a SARS-CoV-2 infection receive one or two shots of the currently authorized mRNA vaccines? In this short report, we show that the antibody response to the first vaccine dose in individuals with pre-existing immunity equals or even exceeds the titers found in naive individuals after the second dose. We also show that reactogenicity is significantly higher in individuals who have been infected with SARS-CoV-2 in the past. Changing the policy to give these individuals only one dose of vaccine would not negatively impact their antibody titers, would spare them unnecessary pain, and would free up many urgently needed vaccine doses. | allergy and immunology |
10.1101/2021.01.30.21250715 | Mortality Risk and Temporal Patterns of Atrial Fibrillation in the Nationwide Registry | Aims Persistent and permanent atrial fibrillation (AF) often occurs in the presence of multiple comorbidities and is linked to adverse clinical outcomes. It is unclear whether the sustained pattern of AF itself is prognostic or if it is confounded by underlying comorbidities. Here, we tested the association between the temporal patterns of AF and the risks of ischemic stroke and all-cause mortality. Methods and Results In a prospective multicenter cohort, 3046 non-valvular AF patients were consecutively enrolled and followed for adverse outcomes of all-cause mortality and ischemic stroke. The risks of both outcomes were adjusted for underlying comorbidities and compared between the patterns of AF. At baseline, the patients were classified as paroxysmal (N=963, 31.6%), persistent (N=604, 19.8%), and permanent AF (N=1479, 45.6%) according to the standard definition. Anticoagulants were administered in 75% of all patients and 83% of those with CHA2DS2-VASc score [≥]2 in males or [≥]3 in females. During a mean follow-up of 26 (SD 10.5) months, all-cause mortality occurred less frequently in paroxysmal AF (2.5 per 100 patient-years) than in persistent AF (4.4 per 100 patient-years; adjusted hazard ratio [HR] 0.66, 95% CI, 0.46-0.96; P = .029) and permanent AF (4.1 per 100 patient-years; adjusted HR 0.71, 95% CI, 0.52-0.98; P = .036). The risk of ischemic stroke was similar across all patterns of AF. Conclusions In this multicenter nationwide registry of AF patients, persistent and permanent AF was associated with higher all-cause mortality than paroxysmal AF, independent of baseline comorbidities. | cardiovascular medicine |
10.1101/2021.01.29.21250766 | Using time use diaries to track changing behavior across successive stages of COVID-19 social restrictions | How did people change their behavior over the different phases of the UK COVID-19 restrictions, and how did these changes affect their risk of being exposed to infection? Time use diary surveys are unique in providing a complete chronicle of daily behavior; 24-hour continuous records of the population's activities, their social context and their location. We present results from four such surveys, collected in real time from representative UK samples, both before, and at three points over the course of the current pandemic. Comparing across the four waves, we find evidence of substantial changes in the UK population's behavior relating to activities, locations and social context. We assign different levels of risk to combinations of activities, locations and copresence, to compare risk-related behavior across successive lockdowns. We find evidence that during the second lockdown (November 2020) there was an increase in high-risk behaviors relative to the first (starting March 2020). This increase is shown to be associated with more paid work time in the workplace. At a time when capacity is still limited both in respect of immunization and track-trace technology, governments must continue to rely on changes in people's daily behaviors to contain the spread of COVID-19 and similar viruses. Time use diary information of this type, collected in real time across the course of the COVID-19 pandemic, can provide policy-makers with information to assess and quantify changes in daily behaviors, and the impact they are likely to have on overall behavioral-associated risks. | public and global health |
10.1101/2021.01.30.21250834 | Minimizing loss of life in Covid-19 in a 100 day period in the U.S.A. by age-tailored dosing and distribution of a limited vaccine supply | BackgroundWe aimed at minimizing loss of lives in the Covid-19 pandemic in the USA by identifying optimal vaccination strategies during a 100-day period with limited vaccine supplies. While lethality is highest in the elderly, transmission and case numbers are highest in the younger. A strategy of first vaccinating the elderly is widely used, thought to protect the vulnerable, elderly best. Despite lower immunogenicity in the elderly, mRNA vaccines retain high efficacy, implying that in the younger, reduced vaccine doses might suffice, thereby increasing vaccination counts with a given vaccine supply.
MethodsUsing published immunogenicity data of the Moderna mRNA-1273 vaccine, we examined the value of personalized-dose vaccination strategies, using a modeling approach incorporating age-related vaccine immunogenicity, social contact patterns, population structure, Covid-19 case and death rates in the USA in late January 2021. An increase in the number of persons that can be vaccinated and a potential reduction of the individual protective efficacy were accounted for.
ResultsAge-personalized dosing strategies reduced cases faster, shortening the pandemic, reducing the delay to reaching <100000 cases/day from 64 to 30 days and avoiding 25000 deaths within 100 days in the USA. In an "elderly first" vaccination strategy, mortality is higher even in the elderly. Findings were robust with transmission blocking efficacies of reduced dose vaccination between 30% to 90%, and with a vaccine supply from 1 to 3 million full dose vaccinations per day.
ConclusionRapid reduction of Covid-19 case and death rate in the USA in 100 days with a limited vaccine supply is best achieved when personalized, age-tailored dosing for highly effective vaccines is used, according to this vaccination strategy model parameterized to U.S. demographics, Covid-19 transmission and vaccine characteristics. Protecting the vulnerable is most effectively achieved by personalized-dose vaccination of all population segments, while an "elderly first" approach costs more lives, even in the elderly. | public and global health |
10.1101/2021.01.30.21250834 | Minimizing loss of life in Covid-19 in a 100 day period in the U.S.A. by personalized-dose vaccination and distribution of a limited vaccine supply | BackgroundWe aimed at minimizing loss of lives in the Covid-19 pandemic in the USA by identifying optimal vaccination strategies during a 100-day period with limited vaccine supplies. While lethality is highest in the elderly, transmission and case numbers are highest in the younger. A strategy of first vaccinating the elderly is widely used, thought to protect the vulnerable, elderly best. Despite lower immunogenicity in the elderly, mRNA vaccines retain high efficacy, implying that in the younger, reduced vaccine doses might suffice, thereby increasing vaccination counts with a given vaccine supply.
MethodsUsing published immunogenicity data of the Moderna mRNA-1273 vaccine, we examined the value of personalized-dose vaccination strategies, using a modeling approach incorporating age-related vaccine immunogenicity, social contact patterns, population structure, Covid-19 case and death rates in the USA in late January 2021. An increase in the number of persons that can be vaccinated and a potential reduction of the individual protective efficacy were accounted for.
ResultsAge-personalized dosing strategies reduced cases faster, shortening the pandemic, reducing the delay to reaching <100000 cases/day from 64 to 30 days and avoiding 25000 deaths within 100 days in the USA. In an "elderly first" vaccination strategy, mortality is higher even in the elderly. Findings were robust with transmission blocking efficacies of reduced dose vaccination between 30% to 90%, and with a vaccine supply from 1 to 3 million full dose vaccinations per day.
ConclusionRapid reduction of Covid-19 case and death rate in the USA in 100 days with a limited vaccine supply is best achieved when personalized, age-tailored dosing for highly effective vaccines is used, according to this vaccination strategy model parameterized to U.S. demographics, Covid-19 transmission and vaccine characteristics. Protecting the vulnerable is most effectively achieved by personalized-dose vaccination of all population segments, while an "elderly first" approach costs more lives, even in the elderly. | public and global health |
10.1101/2021.01.30.21250834 | Minimizing loss of life in Covid-19 in a 100 day period in the U.S.A. by personalized-dose vaccination and distribution of a limited vaccine supply | BackgroundWe aimed at minimizing loss of lives in the Covid-19 pandemic in the USA by identifying optimal vaccination strategies during a 100-day period with limited vaccine supplies. While lethality is highest in the elderly, transmission and case numbers are highest in the younger. A strategy of first vaccinating the elderly is widely used, thought to protect the vulnerable, elderly best. Despite lower immunogenicity in the elderly, mRNA vaccines retain high efficacy, implying that in the younger, reduced vaccine doses might suffice, thereby increasing vaccination counts with a given vaccine supply.
MethodsUsing published immunogenicity data of the Moderna mRNA-1273 vaccine, we examined the value of personalized-dose vaccination strategies, using a modeling approach incorporating age-related vaccine immunogenicity, social contact patterns, population structure, Covid-19 case and death rates in the USA in late January 2021. An increase in the number of persons that can be vaccinated and a potential reduction of the individual protective efficacy were accounted for.
ResultsAge-personalized dosing strategies reduced cases faster, shortening the pandemic, reducing the delay to reaching <100000 cases/day from 64 to 30 days and avoiding 25000 deaths within 100 days in the USA. In an "elderly first" vaccination strategy, mortality is higher even in the elderly. Findings were robust with transmission blocking efficacies of reduced dose vaccination between 30% to 90%, and with a vaccine supply from 1 to 3 million full dose vaccinations per day.
ConclusionRapid reduction of Covid-19 case and death rate in the USA in 100 days with a limited vaccine supply is best achieved when personalized, age-tailored dosing for highly effective vaccines is used, according to this vaccination strategy model parameterized to U.S. demographics, Covid-19 transmission and vaccine characteristics. Protecting the vulnerable is most effectively achieved by personalized-dose vaccination of all population segments, while an "elderly first" approach costs more lives, even in the elderly. | public and global health |
10.1101/2021.01.30.21250834 | Minimizing loss of life in Covid-19 in a 100 day period in the U.S.A. by personalized-dose vaccination and distribution of a limited vaccine supply - a vaccination strategy model | BackgroundWe aimed at minimizing loss of lives in the Covid-19 pandemic in the USA by identifying optimal vaccination strategies during a 100-day period with limited vaccine supplies. While lethality is highest in the elderly, transmission and case numbers are highest in the younger. A strategy of first vaccinating the elderly is widely used, thought to protect the vulnerable, elderly best. Despite lower immunogenicity in the elderly, mRNA vaccines retain high efficacy, implying that in the younger, reduced vaccine doses might suffice, thereby increasing vaccination counts with a given vaccine supply.
MethodsUsing published immunogenicity data of the Moderna mRNA-1273 vaccine, we examined the value of personalized-dose vaccination strategies, using a modeling approach incorporating age-related vaccine immunogenicity, social contact patterns, population structure, Covid-19 case and death rates in the USA in late January 2021. An increase in the number of persons that can be vaccinated and a potential reduction of the individual protective efficacy were accounted for.
ResultsAge-personalized dosing strategies reduced cases faster, shortening the pandemic, reducing the delay to reaching <100000 cases/day from 64 to 30 days and avoiding 25000 deaths within 100 days in the USA. In an "elderly first" vaccination strategy, mortality is higher even in the elderly. Findings were robust with transmission blocking efficacies of reduced dose vaccination between 30% to 90%, and with a vaccine supply from 1 to 3 million full dose vaccinations per day.
ConclusionRapid reduction of Covid-19 case and death rate in the USA in 100 days with a limited vaccine supply is best achieved when personalized, age-tailored dosing for highly effective vaccines is used, according to this vaccination strategy model parameterized to U.S. demographics, Covid-19 transmission and vaccine characteristics. Protecting the vulnerable is most effectively achieved by personalized-dose vaccination of all population segments, while an "elderly first" approach costs more lives, even in the elderly. | public and global health |
10.1101/2021.01.29.21250643 | Emergence of first strains of SARS-CoV-2 lineage B.1.1.7 in Romania | The United Kingdom reported the emergence of a new and highly transmissible SARS-CoV-2 variant, B.1.1.7, that rapidly spread to other countries. The impact of this new mutation in the S protein on infectivity, virulence and current vaccine effectiveness is still under evaluation. We have identified the first cases of the B.1.1.7 variant in samples collected from Romanian patients, of which one was traced to the UK region where the new variant was originally sequenced. Mutations in the Nsp3 protein, N844S and D455N, and L15F in Orf3a were also detected, indicating common ancestry with UK strains as well as remote connections with strains from Nagasaki, Japan. These results indicate, for the first time, the presence and characteristics of the new variant B.1.1.7 in Romania and underscore the need for increased genomic sequencing in confirmed COVID-19 patients. | infectious diseases
10.1101/2021.01.29.21250750 | Emergence of universality in the transmission dynamics of COVID-19 | The complexities involved in modeling the transmission dynamics of COVID-19 have been a major roadblock in achieving predictability in the spread and containment of the disease. In addition to understanding the modes of transmission, the effectiveness of the mitigation methods also needs to be built into any effective model for making such predictions. We show that such complexities can be circumvented by appealing to scaling principles which lead to the emergence of universality in the transmission dynamics of the disease. The ensuing data collapse renders the transmission dynamics largely independent of geopolitical variations, the effectiveness of various mitigation strategies, population demographics, etc. We propose a simple two-parameter model--the Blue Sky model--and show that one class of transmission dynamics can be explained by a solution that lives at the edge of a blue sky bifurcation. In addition, the data collapse leads to an enhanced degree of predictability in the disease spread for several geographical scales which can also be realized in a model-independent manner as we show using a deep neural network. The methodology adopted in this work can potentially be applied to the transmission of other infectious diseases and new universality classes may be found. The predictability in transmission dynamics and the simplicity of our methodology can help in building policies for exit strategies and mitigation methods during a pandemic. | epidemiology
10.1101/2021.01.29.21250755 | A Molecular network approach reveals shared cellular and molecular signatures between chronic fatigue syndrome and other fatiguing illnesses | IntroThe molecular mechanisms of chronic fatigue syndrome (CFS, or Myalgic encephalomyelitis), a disease defined by extreme, long-term fatigue, remain largely uncharacterized, and presently no molecular diagnostic test and no specific treatments exist to diagnose and treat CFS patients. While CFS has historically had an estimated prevalence of 0.1-0.5% [1], a "long hauler" version of Coronavirus disease 2019 (COVID-19) that symptomatically overlaps CFS to a significant degree (Supplemental Table-1) and appears to occur in 10% of COVID-19 patients [2] has raised concerns of a larger spike in CFS [3]. Here, we established molecular signatures of CFS and a corresponding network-based disease context from RNA-sequencing data generated on whole blood and FACS-sorted specific peripheral blood mononuclear cells (PBMCs) isolated from CFS cases and non-CFS controls. The immune cell type-specific molecular signatures of CFS we identified overlapped molecular signatures from other fatiguing illnesses, demonstrating a common molecular etiology. Further, after constructing a probabilistic causal model of the CFS gene expression data, we identified master regulator genes modulating network states associated with CFS, suggesting potential therapeutic targets for CFS. | genetic and genomic medicine
10.1101/2021.01.31.21250866 | Who should be first in line for the COVID-19 vaccine? Surveys in 13 countries of the public's preferences for prioritisation | How does the public want a COVID-19 vaccine to be allocated? We conducted a conjoint experiment asking 15,536 adults in 13 countries to evaluate 248,576 profiles of potential vaccine recipients that varied randomly on five attributes. Our sample includes diverse countries from all continents. The results suggest that in addition to giving priority to health workers and to those at high risk, the public favours giving priority to a broad range of key workers and to those on lower incomes. These preferences are similar across respondents of different education levels, incomes, and political ideologies, as well as across most surveyed countries. The public favoured COVID-19 vaccines being allocated solely via government programs, but were highly polarized in some developed countries on whether taking a vaccine should be mandatory. There is a consensus among the public on many aspects of COVID-19 vaccination that needs to be taken into account when developing and communicating roll-out strategies. | health economics
10.1101/2021.01.31.21250868 | Molecular epidemiology of SARS-CoV-2 in Greece reveals low rates of onward virus transmission after lifting of travel restrictions based on risk assessment during summer 2020 | Molecular epidemiology has provided added value to traditional public health tools by identifying SARS-CoV-2 clusters, or by providing evidence that clusters based on virus sequences and contact tracing are highly concordant. Our aim was to infer the levels of virus importation and to estimate the impact of public health measures related to travel restrictions on local transmission in Greece. Our phylogenetic and phylogeographic analyses included 389 SARS-CoV-2 sequences collected during the first 7 months of the pandemic in Greece and a random collection in 5 replicates of 3,000 sequences sampled globally, as well as the best hits to our dataset identified by BLAST. Phylogenetic analyses revealed the presence of 70 genetically distinct viruses identified as independent introductions into Greece. The proportion of imported strains was 41%, 11.5%, and 8.8% during the three periods of sampling, namely, March (no travel restrictions), April to June (strict travel restrictions), and July to September (lifting of travel restrictions based on a thorough risk assessment), respectively. These findings reveal low levels of onward transmission from imported cases during summer and underscore the importance of targeted public health measures that can increase the safety of international travel during a pandemic. | infectious diseases
10.1101/2021.01.31.21250167 | Dynamics of SARS-CoV-2-specific antibodies during and after COVID19: Lessons from a biobank in Argentina | BackgroundBiobanks are instrumental for accelerating research. Early in the SARS-CoV-2 pandemic, the Argentinean Biobank of Infectious Diseases (BBEI) initiated the COVID19 collection and started its characterization.
MethodsBlood samples from subjects with confirmed SARS-CoV-2 infection, either admitted to health institutions or outpatients, were enrolled. Highly exposed seronegative individuals were also enrolled. Longitudinal samples were obtained in a subset of donors, including persons who donated plasma for therapeutic purposes (plasma donors). SARS-CoV-2-specific IgM and IgG levels, IgG titers and IgG viral neutralization capacity were determined.
FindingsOut of 825 donors, 57.1% were females and median age was 41 years (IQR 32-53 years). Donors were segregated as acute or convalescent donors, and mild versus moderate/severe disease donors. Seventy-eight percent showed seroconversion to SARS-CoV-2 specific antibodies. Specific IgM and IgG showed comparable positivity rates in acute donors. The IgM detectability rate declined in convalescent donors while IgG detectability remained elevated in early (74.8%) and late (83%) convalescent donors. Among donors with follow-up samples, IgG levels seemed to decline more rapidly in plasma donors. IgG levels were higher with age, disease severity, and number of symptoms, and were more durable in moderate/severe disease donors. Levels and titers of anti-spike/RBD IgG strongly correlated with neutralization activity against WT virus.
InterpretationThe BBEI-COVID19 collection served a dual role in this SARS-CoV-2 global crisis. First, it fed researchers and developers, transferring samples and data to fuel research projects. Second, it generated highly needed local data to understand and frame the regional dynamics of the infection.
FundingThis work was supported by a grant from the Agencia Nacional de Promoción de la Investigación, el Desarrollo Tecnológico y la Innovación (Agencia I+D+i) from Argentina through an extraordinary funding opportunity to improve the national response to COVID19 (Proyecto COVID N° 11, IP 285). | infectious diseases
10.1101/2021.02.01.21250077 | Neonatal-onset autoinflammation and immunodeficiency caused by heterozygous missense mutation of the proteasome subunit β-type 9 | BACKGROUNDDefective proteasome activities due to genetic mutations lead to an autoinflammatory disease, termed as proteasome-associated autoinflammatory syndromes (PRAAS). In PRAAS relapsing inflammations and progressive wasting are common, but immunodeficiency has not been reported.
METHODSWe studied two unrelated Japanese infants with PRAAS-like manifestations. We have also generated and analyzed the mice carrying the candidate mutation found in the patients.
RESULTSBoth patients showed neonatal-onset skin rash, myositis and basal ganglia calcification, similar to PRAAS patients. Meanwhile, they manifested distinct phenotypes, including pulmonary hypertension and immunodeficiency without lipoatrophy. We identified a novel de novo heterozygous missense mutation, G156D, in a proteasome subunit gene, PSMB9, encoding β1i, in the two patients. Maturation and activity of the immunoproteasome were impaired, but ubiquitin accumulation was hardly detected in patient-derived cells and samples or in Psmb9G156D/+ mice. As an immunodeficient phenotype, one patient showed a decrease of B cells and an increase of monocytes, while the other patient showed a decrease of CD8 T cells. The proteasome defects and immunodeficient phenotypes were recapitulated in Psmb9G156D/+ mice.
CONCLUSIONSThe PSMB9 G156D is a unique mutation among proteasome subunit genes in that it causes defects in the heterozygous state, affecting the interaction of the two β rings and leading to immunodeficiency. The mutant mice are the first mouse model for analyzing proteasome dysfunction in PRAAS. We here propose the term proteasome-associated autoinflammation and immunodeficiency disease (PRAID) as an umbrella name for our cases, PRAAS with immunodeficiency, as well as PRAAS described so far. | allergy and immunology
10.1101/2021.01.30.21250708 | Radius of Gyration as predictor of COVID-19 deaths trend with three-weeks offset | Total and perimetral lockdowns were the strongest nonpharmaceutical interventions to fight against Covid-19, as well as those with the strongest socioeconomic collateral effects. Lacking a metric to predict the effect of lockdowns on the spreading of COVID-19, authorities and decision-makers opted for preventive measures that proved either too strong or not strong enough after a period of two to three weeks, once data about hospitalizations and deaths was available. We present here the radius of gyration as a candidate predictor of the trend in deaths by COVID-19 with an offset of three weeks. Indeed, the radius of gyration aggregates the most relevant microscopic aspects of human mobility into a macroscopic value that is very sensitive to temporary trends and local effects, such as lockdowns and mobility restrictions. We use mobile phone data of more than 13 million users in Spain during a period of one year (from January 6th 2020 to January 10th 2021) to compute the users' daily radius of gyration and compare the median value of the population with the evolution of COVID-19 deaths: we find that for all weeks where the radius of gyration is above a critical value (70% of its pre-pandemic score), the number of weekly deaths increases three weeks after. The reverse also stands: for all weeks where the radius of gyration is below the critical value, the number of weekly deaths decreases after three weeks. This observation leads to two conclusions: i) the radius of gyration can be used as a predictor of COVID-19-related deaths; and ii) partial mobility restrictions are as effective as a total lockdown as far as the radius of gyration is below this critical value.
BackgroundAuthorities around the World have used lockdowns and partial mobility restrictions as major nonpharmaceutical interventions to control the expansion of COVID-19. While effective, the efficiency of these measures on the number of COVID-19 cases and deaths is difficult to quantify, severely limiting the feedback that can be used to tune the intensity of these measures. In addition, collateral socioeconomic effects challenge the overall effectiveness of lockdowns in the long term, and the degree by which they are followed can be difficult to estimate. It is desirable to find both a metric to accurately monitor the mobility restrictions and a predictor of their effectiveness.
MethodsWe correlate the median of the daily radius of gyration of more than 13M users in Spain during all of 2020 with the evolution of COVID-19 deaths for the same period. Mobility data is obtained from mobile phone metadata from one of the major operators in the country.
ResultsThe radius of gyration is a predictor of the trend in the number of COVID-19 deaths with 3 weeks offset. When the radius is above/below a critical threshold (70% of the pre-pandemic score), the number of deaths increases/decreases three weeks later.
ConclusionsThe radius of gyration can be used to monitor in real time the effectiveness of the mobility restrictions. The existence of a critical threshold suggests that partial lockdowns can be as efficient as total lockdowns, while reducing their socioeconomic impact. The mechanism behind the critical value is still unknown, and more research is needed. | epidemiology
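The radius of gyration used in the record above is the root-mean-square distance of a user's visited positions from their centroid. A minimal sketch of that computation follows; the coordinates and the `radius_of_gyration` helper are illustrative assumptions, not the study's actual pipeline:

```python
import math

def radius_of_gyration(points):
    """Root-mean-square distance of (x, y) positions (in km) from their centroid."""
    n = len(points)
    cx = sum(x for x, _ in points) / n
    cy = sum(y for _, y in points) / n
    return math.sqrt(sum((x - cx) ** 2 + (y - cy) ** 2 for x, y in points) / n)

# Invented daily position samples (km) for one user:
# a home-bound day versus a commuting day.
home_bound = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1), (0.05, 0.05)]
commuting = [(0.0, 0.0), (10.0, 0.0), (10.0, 0.2), (0.0, 0.1)]

print(radius_of_gyration(home_bound) < radius_of_gyration(commuting))  # True
```

A population-level indicator of the kind monitored in the study would then be the daily median of this value across users, compared against a pre-pandemic baseline.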
10.1101/2021.01.29.21250784 | COVID-19 vaccine acceptability and inequity in the United States: Results from a nationally representative survey | BackgroundAt the time of this survey, September 1st, there were roughly 6 million COVID-19 cases and 176,771 deaths in the United States and no federally approved vaccine. The objective of this study was to explore the willingness to accept a COVID-19 vaccine in the United States and describe variability in this acceptability by key racial, ethnic and socio-demographic characteristics.
MethodsThis was a cross-sectional digital survey that sampled participants from a nationally-representative panel maintained by a third party, Dynata. Dynata randomly sampled their database and emailed web-based surveys to United States residents ensuring the sample was matched to US Census estimates for age, race, gender, income, and Census region. Participants were asked how willing or unwilling they would be to: 1) receive a COVID-19 vaccine as soon as it was made publicly available, and 2) receive the influenza vaccine for the upcoming influenza season. Participants could respond with extremely willing, willing, unwilling, or extremely unwilling. For those who reported being unwilling to receive a COVID-19 vaccine, reasons for this hesitancy were captured. All participants were asked about where they obtain vaccine-related information, and which sources they trust most. Univariable and multivariable logistic regressions were conducted to examine the association of all demographic characteristics with willingness to receive COVID-19 vaccine.
FindingsFrom September 1st to September 7, 2020, 1592 respondents completed the online survey. Overall, weighted analyses found that only 58.9% of the sample population were either willing or extremely willing to receive a COVID-19 vaccine as soon as it was made publicly available. In comparison, 67.7% of the respondents were willing or extremely willing to take the influenza vaccine. By gender, 66.1% of males and 51.5% of females were willing to receive a COVID-19 vaccine. Males were significantly more willing than females to receive a COVID-19 vaccine (adjusted odds ratio (aOR)=1.98, 95% CI: 1.56, 2.53; p<0.001). Blacks were the least willing racial/ethnic group (48.8%) and were significantly less willing than Whites to receive a COVID-19 vaccine (aOR=0.59, 95% CI: 0.43, 0.80; p<0.001). There were numerous reasons provided for being unwilling to receive a COVID-19 vaccine. The most common reason was concern about the vaccine's safety (36.9%), followed by concerns over its efficacy (19.1%).
InterpretationIn conclusion, we found that a substantial proportion (41%) of United States residents are unwilling to receive a COVID-19 vaccine as soon as one is made publicly available. We found that vaccine acceptance differs by sub-populations. In addition to sub-group differences in willingness to receive the vaccine, respondents provided a variety of reasons for being unwilling to receive the vaccine, driven by various sources of vaccine information (and misinformation). This compounds the challenge of delivering a safe and efficacious COVID-19 vaccine at a population level to achieve herd immunity. A multi-pronged and targeted communications and outreach effort is likely needed to achieve a high level of immunization coverage. | epidemiology |
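The adjusted odds ratios in the survey record above come from multivariable logistic regression. As a minimal sketch of the underlying quantity only, an unadjusted odds ratio with a Wald 95% confidence interval can be computed from a 2x2 table; the counts and the `odds_ratio_ci` helper below are invented for illustration and are not the survey's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI for a 2x2 table:
    group 1: a willing / b unwilling; group 2: c willing / d unwilling."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Invented counts: 200/100 willing/unwilling in group 1, 150/150 in group 2.
or_, lo, hi = odds_ratio_ci(200, 100, 150, 150)
print(round(or_, 2))  # 2.0
```

An adjusted odds ratio additionally conditions on covariates (age, income, etc.) via the regression model, so it will generally differ from this crude ratio.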
10.1101/2021.01.29.21250786 | Sustainable targeted interventions to mitigate the COVID-19 pandemic: A big data-driven modeling study in Hong Kong | Nonpharmaceutical interventions (NPIs) for contact suppression have been widely used worldwide, imposing harmful burdens on the population and the local economy. The evaluation of alternative NPIs is needed to confront the pandemic with less disruption. By harnessing human mobility data, we develop an agent-based model that can evaluate the efficacies of NPIs with individualized mobility simulations. Based on the model, we propose data-driven targeted interventions to mitigate the COVID-19 pandemic in Hong Kong without city-wide NPIs. We develop a data-driven agent-based model for 7.55 million Hong Kong residents to evaluate the efficacies of various NPIs in the first 80 days of the initial outbreak. The entire territory of Hong Kong is split into 4,905 500m x 500m grids. The model can simulate detailed agent interactions based on demographics data, public facilities and functional buildings, transportation systems, and travel patterns. The general daily human mobility patterns are adopted from Google's Community Mobility Report. The scenario without any NPIs is set as the baseline. By simulating the epidemic progression and human movement at the individual level, we propose model-driven targeted interventions, which focus on the surgical testing and quarantine of only a small portion of regions instead of enforcing NPIs in the whole city. The efficacies of common NPIs and the proposed targeted interventions are evaluated through 100 extensive simulations. The proposed model can inform targeted interventions, which are able to effectively contain the COVID-19 outbreak with much lower disruption of the city. It represents a promising approach to sustainable NPIs that can help revive the economy of the city and the world. | epidemiology
10.1101/2021.01.30.21250827 | A Logistic Formula in Biology and Its Application to Deaths by the Third Wave of COVID-19 in Japan | A logistic formulation in biology is applied to analyze deaths by the third wave of COVID-19 in Japan. | epidemiology |
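The one-line abstract above does not state the exact formula; a common logistic form for cumulative deaths is D(t) = K / (1 + exp(-r(t - t0))), with carrying capacity K, growth rate r and inflection time t0. A sketch under that assumption, with invented parameters not fitted to any real wave:

```python
import math

def logistic(t, K, r, t0):
    """Cumulative count under logistic growth:
    carrying capacity K, growth rate r, inflection time t0."""
    return K / (1.0 + math.exp(-r * (t - t0)))

# Invented parameters for illustration only.
K, r, t0 = 3000.0, 0.1, 40.0

print(logistic(t0, K, r, t0))          # 1500.0 (half of K at the inflection)
print(round(logistic(200, K, r, t0)))  # 3000 (saturation near K)
```

In practice the three parameters would be fitted to the observed death counts of a wave, and the fitted K read off as the projected final toll of that wave.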
10.1101/2021.01.30.21250705 | COVID-19 risk perceptions of social interaction and essential activities and inequity in the United States: Results from a nationally representative survey | IntroductionSevere acute respiratory syndrome coronavirus 2 (SARS-CoV-2) related diagnoses, hospitalizations, and deaths have disproportionately affected disadvantaged communities across the United States. Few studies have sought to understand how risk perceptions related to social interaction and essential activities during the COVID-19 pandemic vary by sociodemographic factors, information that could inform targeted interventions to reduce inequities in access to care and information.
MethodsWe conducted a nationally representative online survey of 1,592 adults in the United States to understand risk perceptions related to transmission of COVID-19 for various social and essential activities. We assessed relationships for each activity, after weighting to adjust for the survey design, using bivariate comparisons and multivariable logistic regression modeling, between responses of safe and unsafe, and participant characteristics, including age, gender, race, education, income, and political affiliation.
ResultsHalf of participants were younger than 45 years (n=844, 53.0%), female (n=800, 50.3%), and White/Caucasian (n=685, 43.0%), Black/African American (n=410, 25.8%), or Hispanic/Latino (n=382, 24.0%). Risk perceptions of unsafe for 13 activities ranged from 29.2% to 73.5%. Large gatherings, indoor dining, and visits with elderly relatives had the highest proportion of unsafe responses (>58%), while outdoor activities, visiting the doctor or dentist, and going to the grocery store had the lowest (<36%). Older respondents were more likely to view social gatherings and indoor activities as unsafe, yet more likely to view activities such as going to the grocery store, participating in outdoor activities, visiting elderly relatives, and visiting the doctor or emergency room as safe. Compared to White/Caucasian respondents, Black/African American and Hispanic/Latino respondents were more likely to view activities such as dining and visiting friends outdoors as unsafe. Generally, men vs. women, Republicans vs. Democrats and independents, and individuals with higher vs. lower income were more likely to view activities as safe.
ConclusionsThese findings suggest the importance of sociodemographic differences in risk perception, health behaviors, and access to information and health care when implementing efforts to control the COVID-19 pandemic. Further research should address how evidence-based interventions can be tailored considering these differences with a goal of increased health equity in the pandemic response. | public and global health |
10.1101/2021.01.29.21250745 | Successful reboot of high-performance sporting activities by Japanese national women's handball team in Tokyo, 2020 during the COVID-19 pandemic: An initiative by Japan Sports-Cyber Physical System (JS-CPS) of Sports Research Innovation Project (SRIP) | BackgroundThe COVID-19 pandemic has negatively impacted sporting activities across the world. However, practical training strategies for athletes to reduce the risk of infection during the pandemic have not been definitively studied.
ObjectiveThe purpose of this report was to provide an overview of our challenges encountered during the reboot of high-performance sporting activities of the Japanese national handball team during the 3rd wave of the COVID-19 pandemic in Tokyo, Japan.
MethodsTwenty-nine Japanese national women's handball players and 24 staff participated in the study. To initiate the reboot of their first training camp after the COVID-19 stay-home social policy, we conducted: web-based health-monitoring, SARS-CoV-2 screening with polymerase chain reaction (PCR) testing, real-time automated quantitative monitoring of on-court social distancing using a video-based artificial intelligence (AI) algorithm, physical intensity evaluation with wearable heart rate (HR) and acceleration sensors, and a self-reported online questionnaire.
ResultsThe training camp was conducted successfully with no COVID-19 infections. The web-based health monitoring and the frequent PCR testing with short turnaround times contributed remarkably to the early detection of athletes' health problems and risk screening. During handball, the AI-based on-court social-distancing monitoring revealed key time-dependent spatial metrics to define player-to-player proximity. This information facilitated positive on- and off-game distancing behavior among team members. Athletes regularly achieved around 80% of maximum HR during training, indicating anticipated improvements in achieving their physical intensities. Self-reported questionnaires related to the COVID management in the training camp revealed a sense of security among the athletes, allowing them to focus singularly on their training.
ConclusionThe current challenge provided us with considerable know-how for creating and managing a safe environment for high-performing athletes during the COVID-19 pandemic via the Japan Sports-Cyber Physical System (JS-CPS) of SRIP (Japan Sports Agency, Tokyo, Japan). This report is envisioned to inform decisions by coaches, trainers, and policymakers from the sports federations in creating targeted, infection-free sporting and training environments. | sports medicine
10.1101/2021.01.30.21250785 | Genetic determination of regional connectivity in modelling the spread of COVID-19 outbreak for improved mitigation strategies | Covid-19 has resulted in the death of more than 1,500,000 individuals. Due to the pandemic's severity, thousands of genomes have been sequenced and publicly stored with extensive records, an unprecedented amount of data for an outbreak in a single year. Simultaneously, prediction models offered region-specific and often contradictory results, while states or countries implemented mitigation strategies with little information on success, precision, or agreement with neighboring regions. Even though viral transmissions have already been documented in a historical and geographical context, few studies have aimed to model geographic and temporal flow from viral sequence information. Here, using a case study of 7 states, we model the flow of the Covid-19 outbreak with respect to phylogenetic information, viral migration, inter- and intra-regional connectivity, and epidemiologic and demographic characteristics. By assessing regional connectivity from genomic variants, we can significantly improve predictions in modeling the viral spread and intensity.
Contrary to previous results, our study shows that the vast majority of the first outbreak can be traced to very few lineages, despite the existence of multiple worldwide transmissions. Moreover, our results show that while distance from hotspots is initially important, connectivity becomes increasingly significant as the virus establishes itself. Similarly, isolated local strategies, such as relying on herd immunity, can negatively impact neighboring states. Our work suggests that more efficient, unified mitigation strategies can be achieved with selective interventions. | health informatics |
10.1101/2021.01.22.21250265 | High proportion of genome-wide homology and increased basal pvcrt levels in Plasmodium vivax late recurrences: a chloroquine therapeutic efficacy study | Chloroquine (CQ) is the first-line treatment for Plasmodium vivax malaria in most endemic countries. Monitoring P. vivax CQ resistance (CQR) is critical but remains challenging because of the difficulty of distinguishing real treatment failure from reinfection or liver relapse. The therapeutic efficacy of CQ against uncomplicated P. vivax malaria was evaluated in Gia Lai province, Vietnam. Sixty-seven patients were enrolled and followed up for 42 days using microscopy and (RT)qPCR. Adequate clinical and parasitological response (ACPR) was 100% (66/66) on Day 28, but 75.4% (49/65) on Day 42. Eighteen recurrences (27.7%) were detected, with a median time-to-recurrence of 42 days (IQR 35, 42) and blood CQ concentrations <100ng/ml. Parasite genotyping by microsatellites, SNP-barcoding and whole-genome sequencing (WGS) identified a majority of homologous recurrences, with 80% (8/10) showing >98% identity-by-descent to paired Day 0 samples. Primary infections leading to recurrence occurred in younger individuals (median age for ACPR=25 years [IQR 20, 28]; recurrences=18 [16, 21]; p=0.002), had a longer parasite clearance time (PCT for ACPR=47.5h [IQR 36.2, 59.8]; recurrences=54.2h [48.4, 62.0]; p=0.035) and higher pvcrt gene expression (median relative expression ratio for ACPR=0.09 [IQR 0.05, 0.22]; recurrences=0.20 [0.15, 0.56]; p=0.002), but there was no difference in ex vivo CQ sensitivity. This study shows that CQ remained largely efficacious for treating P. vivax in Gia Lai, i.e. recurrences occurred late (>Day 28) and in the presence of low blood CQ concentrations. However, combining WGS and gene expression analysis (pvcrt) with clinical data (PCT) allowed the identification of a potential emergence of low-grade CQR that should be closely monitored. | infectious diseases |
10.1101/2021.02.01.21250853 | IN-HOSPITAL CONTINUATION WITH ANGIOTENSIN RECEPTOR BLOCKERS IS ASSOCIATED WITH A LOWER MORTALITY RATE THAN CONTINUATION WITH ANGIOTENSIN CONVERTING ENZYME INHIBITORS IN COVID-19 PATIENTS: A RETROSPECTIVE COHORT STUDY | BackgroundSeveral studies have reported a reduced risk of death associated with the inpatient use of angiotensin receptor blockers (ARBs) and angiotensin converting enzyme inhibitors (ACEIs) in COVID-19 patients, but have been criticized for incurring several types of bias. Also, most studies have pooled ACEIs and ARBs as if they were a single group, overlooking their pharmacological differences. We aimed to assess whether the in-hospital continuation of ARBs and ACEIs in regular users of these drugs was associated with a reduced risk of death compared to their discontinuation, and also to compare ARBs with ACEIs head-to-head.
MethodsAdult patients with a PCR-confirmed diagnosis of COVID-19 requiring admission during March 2020 were consecutively selected from 7 hospitals in Madrid, Spain. Among them, we identified outpatient users of ACEIs/ARBs and divided them into two cohorts depending on treatment discontinuation or continuation at admission. They were then followed up until discharge or in-hospital death. An intention-to-treat survival analysis was carried out, and hazard ratios (HRs) and their 95%CIs were computed through a Cox regression model adjusted for propensity scores of discontinuation and controlled for potential mediators.
ResultsOut of 625 ACEI/ARB users, 340 (54.4%) discontinued treatment. The in-hospital mortality rates were 27.6% and 27.7% in the discontinuation and continuation cohorts, respectively (HR=1.01; 95%CI:0.70-1.46). No difference in mortality was observed between ARB and ACEI discontinuation (28.6% vs. 27.1%, respectively), while a significantly lower mortality rate was found among patients who continued with ARBs (20.8%, N=125) compared to those who continued with ACEIs (33.1%, N=136; p=0.03). The head-to-head comparison (ARB vs. ACEI continuation) yielded an adjusted HR of 0.52 (95%CI:0.29-0.93), which was especially notable among males (HR=0.34; 95%CI:0.12-0.93), subjects older than 74 years (HR=0.46; 95%CI:0.25-0.85), and patients with obesity (HR=0.22; 95%CI:0.05-0.94), diabetes (HR=0.36; 95%CI:0.13-0.97) and heart failure (HR=0.12; 95%CI:0.03-0.97).
ConclusionsAmong regular users of ARBs admitted for COVID-19, in-hospital continuation of treatment was associated with improved survival, while this was not observed with ACEIs. Regular users of ARBs should continue this treatment if admitted for COVID-19, unless medically contraindicated. In admitted ACEI users, switching to ARBs should be considered, especially among high-risk patients.
GRAPHICAL ABSTRACT | pharmacology and therapeutics |
10.1101/2021.01.31.21250863 | Stay Home and Stay Active? The impact of stay-at-home restrictions on physical activity routines in the UK during the COVID-19 pandemic | Government restrictions applied during the COVID-19 pandemic in the UK led to the disruption of many people's physical activity routines, with sports and leisure facilities closed and outdoor exercise only permitted once per day. In this study we investigated which population groups were impacted most in terms of reduced physical activity levels during these periods, and which groups benefitted in terms of increasing their usual level of physical activity. We surveyed UK residents, sampled through users of a rewards-for-exercise app (Sweatcoin; n=749) and an online panel (Prolific; n=907). Of the app users, n=487 further provided daily step-count data collected by the app, prior to and during the periods of restrictions between March and June 2020. Regression models were applied to investigate factors associated with subjective change (perceived change in physical activity) and objective change (log-percentage change in daily step-count) in physical activity during the periods of restrictions. ANOVAs were used to further investigate the significant factors identified. Key factors associated with a substantial subjective reduction in physical activity included those classed as obese, gym users and people living in urban areas. All participants had a reduced step count during restrictions, with Black, Asian and minority ethnic (BAME) groups, students and urban dwellers showing the largest reductions. Therefore, targeted interventions are required to ensure that the physical and mental health impacts of sedentary behaviour are not exacerbated over the long-term by the significant reductions in physical activity identified in these groups, particularly those who are also more vulnerable to the COVID-19 virus. | public and global health |
10.1101/2021.01.31.21250872 | Returning to a normal life via COVID-19 vaccines in the USA: a large-scale agent-based simulation study | BackgroundIn 2020, COVID-19 claimed more than 300,000 lives in the US alone. While non-pharmaceutical interventions were implemented by federal and state governments in the USA, these efforts failed to contain the virus. Following the FDA approval of two COVID-19 vaccines, however, the hope for a return to normalcy is renewed. This hope rests on an unprecedented nation-wide vaccine campaign, which faces many logistical challenges and is also contingent on several factors whose values are currently unknown.
ObjectiveWe study the effectiveness of a nation-wide vaccine campaign in response to different vaccine efficacies, the willingness of the population to be vaccinated, and the daily vaccine capacity under two different federal plans. To characterize the possible outcomes most accurately, we also account for the interactions between non-pharmaceutical interventions and vaccines, through six scenarios that capture a range of possible impact from non-pharmaceutical interventions.
MethodsWe use large-scale cloud-based agent-based simulations, implementing the vaccination campaign in Covasim, an open-source agent-based model (ABM) for COVID-19 that has been used in several peer-reviewed studies and accounts for individual heterogeneity as well as a multiplicity of contact networks. Several modifications to the parameters and simulation logic were made to better align the model with current evidence. We chose six non-pharmaceutical intervention scenarios and applied the vaccination intervention following both the plan proposed by Operation Warp Speed (former Trump administration) and the plan of one million doses per day proposed by the Biden administration. We accounted for unknowns in vaccine efficacies and levels of population compliance by varying both parameters. For each experiment, the cumulative infection growth is fitted to a logistic growth model, and the carrying capacities and growth rates are recorded.
ResultsFor both vaccination plans and all non-pharmaceutical intervention scenarios, the presence of the vaccine intervention considerably lowers the total number of infections when life returns to normal, even when population compliance to vaccines is as low as 20%. We noted an unintended consequence: given the vaccine availability estimates under both federal plans and the focus on vaccinating individuals by age categories, a significant reduction in non-pharmaceutical interventions results in a counterintuitive situation in which higher vaccine compliance then leads to more total infections.
ConclusionsAlthough potent, vaccines alone cannot effectively end the pandemic given the current availability estimates and the adopted vaccination strategy. Non-pharmaceutical interventions need to continue and be enforced to ensure high compliance, so that the rate of immunity established by vaccination outpaces that induced by infections. | public and global health |
10.1101/2021.01.31.21250726 | Effectiveness of Wolbachia-infected mosquito deployments in reducing the incidence of dengue and chikungunya in Niteroi, Brazil: a quasi-experimental study | BackgroundThe introduction of the bacterium Wolbachia (wMel strain) into Aedes aegypti mosquitoes reduces their capacity to transmit dengue and other arboviruses. Evidence of a reduction in dengue case incidence following field releases of wMel-infected Ae. aegypti has been reported previously from a cluster randomised controlled trial in Indonesia, and quasi-experimental studies in Indonesia and northern Australia.
MethodsFollowing pilot releases in 2015 - 2016 and a period of intensive community engagement, deployments of adult wMel-infected Ae. aegypti mosquitoes were conducted in Niteroi, Brazil during 2017 - 2019. Deployments were phased across four release zones, with a total area of 83 km2 and a residential population of approximately 373,000. A quasi-experimental design was used to evaluate the effectiveness of wMel deployments in reducing dengue, chikungunya and Zika incidence. An untreated control zone was pre-defined, which was comparable to the intervention area in historical dengue trends. The wMel intervention effect was estimated by controlled interrupted time series analysis of monthly dengue, chikungunya and Zika case notifications to the public health surveillance system before, during and after releases, from release zones and the control zone.
ResultsThree years after commencement of releases, wMel introgression into local Ae. aegypti populations was heterogeneous throughout Niteroi, reaching a high prevalence (>80%) in the earliest release zone, and more moderate levels (prevalence 40-70%) elsewhere. Despite this spatial heterogeneity in entomological outcomes, the wMel intervention was associated with a 69% reduction in dengue incidence (95% confidence interval 54%, 79%), a 56% reduction in chikungunya incidence (95%CI 16%, 77%) and a 37% reduction in Zika incidence (95%CI 1%, 60%), in the aggregate release area compared with the pre-defined control area. This significant intervention effect on dengue was replicated across all four release zones, and in three of four zones for chikungunya, though not in individual release zones for Zika.
ConclusionsWe demonstrate that wMel Wolbachia can be successfully introgressed into Ae. aegypti populations in a large and complex urban setting, and that a significant public health benefit from reduced incidence of Aedes-borne disease accrues even where the prevalence of wMel in local mosquito populations is moderate and spatially heterogeneous. These findings are consistent with the results of randomised and non-randomised field trials in Indonesia and northern Australia, and are supportive of the Wolbachia biocontrol method as a multivalent intervention against dengue, chikungunya and Zika. | public and global health |
10.1101/2021.01.31.21250067 | DPP9 deficiency: an inflammasomopathy which can be rescued by lowering NLRP1/IL-1 signaling | Dipeptidyl peptidase 9 (DPP9) is a direct inhibitor of NLRP1, but how it impacts inflammasome regulation in vivo is not yet established. Here, we report two families with immune-associated defects, skin pigmentation abnormalities and neurological deficits that segregate with biallelic DPP9 rare variants. Using patient-derived primary cells and biochemical assays, these variants are shown to behave as hypomorphic or loss-of-function alleles that fail to repress NLRP1. Remarkably, the removal in mice of a single copy of either Nlrp1a/b/c, Asc, Gsdmd or Il-1r, but not Il-18, was sufficient to rescue the lethality of Dpp9 mutant neonates. These experiments suggest that the deleterious consequences of DPP9 deficiency are mostly driven by the aberrant activation of the canonical NLRP1 inflammasome and IL-1β signaling. Collectively, our results delineate a Mendelian disorder of DPP9 deficiency driven by increased NLRP1 activity as demonstrated in patient cells and in a mouse model of the disease. | genetic and genomic medicine |
10.1101/2021.01.31.21250647 | ESTIMATING THE HEALTH AND ECONOMIC EFFECTS OF THE VOLUNTARY SODIUM REDUCTION TARGETS IN BRAZIL: MICROSIMULATION ANALYSIS | ObjectiveTo analyse the potential health and economic impact of the voluntary sodium reduction targets in Brazil from 2013 to 2032.
DesignModelling study. A microsimulation approach of a close-to-reality synthetic population (IMPACT NCD BR) was used to evaluate the potential health benefits of setting voluntary upper limits for sodium content as part of Brazilian government strategy. The model estimates cardiovascular disease (CVD) deaths and cases prevented or postponed, and disease treatment costs.
Model inputs were informed by the 2013 National Health Survey, the 2008-2009 Household Budget Survey, and high-quality meta-analyses. Costs included National Health System costs for CVD treatment and informal care costs.
SettingSynthetic population with similar characteristics to the community dwelling population of Brazil.
ParticipantsSynthetic people with traits informed by the national surveys of Brazil.
Main outcome measuresCardiovascular disease cases and deaths prevented or postponed by 2032, over a 20-year period (2013-2032), stratified by age and sex.
ResultsApplying the voluntary sodium targets between 2013 and 2032 could prevent or postpone approximately 112,000 CVD cases (95% uncertainty interval, UI: 28,000 to 258,000) among men and 70,000 cases among women (95% UI: 16,000 to 167,000), and also prevent or postpone approximately 2,600 CVD deaths (95% UI: -1,000 to 11,000), 55% of them in men. The policy could also produce a net saving of approximately US$ 222 million (95% UI: US$ 53.6-524.4 million) in medical costs to the Brazilian National Health System for the treatment of CHD and stroke, and save approximately US$ 71 million (95% UI: US$ 17.1-166.9 million) in informal costs.
ConclusionsBrazilian voluntary sodium targets could generate substantial health and economic impacts. Further progress toward lower, more comprehensive thresholds for sodium in foods, and strategies for reducing other sodium sources, could maximise the health and economic benefits to the population. This is the first IMPACT NCD microsimulation model adapted to a Latin American country and represents a major step forward in using models to inform policy in the region. The results indicate that sodium reduction targets must go further and faster in order to achieve national and international commitments.
WHAT IS ALREADY KNOWN ON THIS TOPIC- Public-private partnerships (PPPs), including voluntary targets for the reduction of critical nutrients such as sodium, sugars and fats through food reformulation, have been promoted as effective strategies for addressing dietary risk factors in non-communicable disease prevention.
- Salt (sodium chloride) intake is a leading dietary risk factor for cardiovascular disease (CVD) globally. Over 27,000 deaths from coronary heart disease and stroke are attributable to excessive sodium intake in Brazil every year. About 20% of the sodium in the Brazilian diet comes from industrialized foods, and over 70% comes from added table salt and salt-based condiments.
- Since 2011, Brazil has implemented a voluntary approach for reducing sodium in processed and ultra-processed foods, including salt-based condiments. Nevertheless, national targets have not matched the targets of other countries in the Region of the Americas and globally, and target compliance has not been achieved across the entire Brazilian food market.
WHAT THIS STUDY ADDS- We estimated the impact of the current sodium reduction targets in Brazil, by analysing individual-level food category consumption and sodium intake.
- Using the first IMPACT NCD microsimulation model adapted to the Latin American context, we estimated that if applied between 2013 and 2032, the voluntary targets could potentially have prevented approximately 180,000 CVD cases and 2,500 CVD deaths. The case reductions might save approximately US$ 220 million in CVD-related medical costs (hospitalizations, outpatient and primary health care and pharmaceutical treatment) and some US$ 70 million in informal costs.
- More impactful sodium reductions in Brazil may not be achieved without more stringent and comprehensive targets; for instance, mandatory rather than voluntary policy formulation, including policies aimed at reducing the consumption of discretionary table salt. | health economics |
10.1101/2021.02.01.21250904 | Limited specificity of SARS-CoV-2 antigen-detecting rapid diagnostic tests at low temperatures | SARS-CoV-2 antigen-detecting rapid diagnostic tests (Ag-RDTs) are available within and outside of health care settings to enable increased access to COVID-19 diagnosis. These environments include provisional testing facilities lacking temperature control; as outside temperatures fall, recommended testing temperatures cannot be guaranteed. We report impaired specificity in two out of six Ag-RDTs when used at 2-4°C, indicating that testing in cold settings might cause false-positive results, potentially entailing unwarranted quarantine assignments and incorrect incidence estimates. | infectious diseases |
10.1101/2021.01.29.21250791 | Bayesian Calibration of Using CO2 Sensors to Assess Ventilation Conditions and Associated COVID-19 Airborne Aerosol Transmission Risk in Schools | Ventilation rate plays a significant role in preventing the airborne transmission of diseases in indoor spaces. Classrooms are a considerable challenge during the COVID-19 pandemic because of their large occupancy density and mainly poor ventilation conditions. The indoor CO2 level may be used as an index for estimating the ventilation rate and airborne infection risk. In this work, we analyzed one-day measurements of CO2 levels in three schools to estimate the ventilation rate and airborne infection risk. Sensitivity analysis and Bayesian calibration methods were applied to identify uncertainties and calibrate key parameters. The outdoor ventilation rate with 95% confidence was 1.96 ± 0.31 ACH for Room 1 with mechanical ventilation and a fully open window, 0.40 ± 0.08 ACH for Room 2, and 0.79 ± 0.06 ACH for Room 3 with only windows open. A time-averaged CO2 level < 450 ppm is equivalent to a ventilation rate > 10 ACH in all three rooms. We also defined the probability of COVID-19 airborne infection risk associated with ventilation uncertainties. The outdoor ventilation threshold to prevent classroom COVID-19 aerosol spreading is between 3-8 ACH, and the CO2 threshold is around 500 ppm over a school day (< 8 hr) for the three schools.
Practical ImplicationsThe actual outdoor ventilation rate in a room cannot be easily measured, but it can be calculated by measuring the transient indoor CO2 level. Uncertainty in input parameters can result in uncertainty in the calculated ventilation rate. Our three-classroom study shows that the estimated ventilation rate, accounting for uncertainties in various input parameters, varies by ±8-20%. As a result, the uncertainty in the ventilation rate contributes up to ±10% uncertainty in the estimated COVID-19 airborne aerosol infection risk. Other studies can apply the proposed Bayesian and MCMC method to estimate building ventilation rates and airborne aerosol infection risks from actual measurement data, such as CO2 levels, with the uncertainties and sensitivity of input parameters identified. The outdoor ventilation rate and CO2 threshold values as functions of exposure time could be used as baseline models to develop correlations, implemented with cheap/portable sensors, to monitor ventilation conditions and airborne risk levels in similar situations. | infectious diseases |
10.1101/2021.01.31.21250871 | Clinical Prediction Models for Primary Prevention of Cardiovascular Disease: Validity in Independent Cohorts | BackgroundClinical prediction models (CPMs) are used to inform treatment decisions for the primary prevention of cardiovascular disease. We aimed to assess the performance of such CPMs in fully independent cohorts.
Methods and Results63 models predicting outcomes for patients at risk of cardiovascular disease from the Tufts PACE CPM Registry were selected for external validation on publicly available data from up to 4 broadly inclusive primary prevention clinical trials. For each CPM-trial pair, we assessed model discrimination, calibration, and net benefit. Results were stratified based on the relatedness of derivation and validation cohorts, and net benefit was reassessed after updating the model intercept, slope, or complete re-estimation. The median c-statistic of the CPMs decreased from 0.77 (IQR 0.72-0.78) in the derivation cohorts to 0.63 (IQR 0.58-0.66) when externally validated. The validation c-statistic was higher when derivation and validation cohorts were considered related than when they were distantly related (0.67 vs 0.60, p < 0.001). The calibration slope was also higher in related cohorts than distantly related cohorts (0.69 vs 0.58, p < 0.001). Net benefit analysis suggested a substantial likelihood of harm when models were externally applied, but this likelihood decreased after model updating.
ConclusionsDiscrimination and calibration decrease significantly when CPMs for primary prevention of cardiovascular disease are tested in external populations, particularly when the population is only distantly related to the derivation population. Poorly calibrated predictions lead to poor decision making. Model updating can reduce the likelihood of harmful decision making, and is needed to realize the full potential of risk-based decision making in new settings. | cardiovascular medicine |
10.1101/2021.02.01.21250537 | Towards a COVID-19 symptom triad: The importance of symptom constellations in the SARS-CoV-2 pandemic | Pandemic scenarios like SARS-CoV-2 require rapid information aggregation. In the age of eHealth and data-driven medicine, publicly available symptom tracking tools offer efficient and scalable means of collecting and analyzing large amounts of data. As a result, information gains can be communicated to front-line providers. We have developed such an application in less than a month and reached more than 500 thousand users within 48 hours. The dataset contains information on basic epidemiological parameters, symptoms, risk factors and details on previous exposure to a COVID-19 patient. Exploratory Data Analysis revealed different symptoms reported by users with confirmed contacts vs. no confirmed contacts. The symptom combination of anosmia, cough and fatigue was the most important feature to differentiate the groups, while single symptoms such as anosmia, cough or fatigue alone were not sufficient. A linear regression model from the literature using the same symptom combination as features was applied on all data. Predictions matched the regional distribution of confirmed cases closely across Germany, while also indicating that the number of cases in northern federal states might be higher than officially reported. In conclusion, we report that the symptom combination of anosmia, fatigue and cough is the most likely to indicate an acute SARS-CoV-2 infection. | epidemiology |
10.1101/2021.02.01.21250877 | Optimal time to return to normality: parallel use of COVID-19 vaccines and circuit breakers | By January 2021, the COVID-19 illness had caused over two million deaths. Countries have restricted disease spread through non-pharmaceutical interventions (e.g., social distancing). More severe "lockdowns" have also been required. Although lockdowns keep people safer from the virus, they substantially disrupt economies and individual well-being. Fortunately, vaccines are becoming available. Yet, vaccination programs may take several months to implement, requiring further time for individuals to develop immunity following inoculation. To prevent health services being overwhelmed it may be necessary to implement further lockdowns in conjunction with vaccination. Here, we investigate optimal approaches for vaccination under varying lockdown lengths and/or severities to prevent COVID-19-related deaths exceeding critical thresholds. We find increases in vaccination rate cause a disproportionately larger decrease in lockdowns: with vaccination, severe lockdowns can reduce infections by up to 89%. Notably, we include demographics, modelling three groups: vulnerable, front-line workers, and non-vulnerable. We investigate the sequence of vaccination. One counter-intuitive finding is that even though the vulnerable group is high risk, demographically, this is a small group (per person, vaccination occurs more slowly) so vaccinating this group first achieves limited gains in overall disease control. Better disease control occurs by vaccinating the non-vulnerable group with longer and/or more severe lockdowns. | epidemiology |
10.1101/2021.02.01.21250918 | Change in cognition and body mass index in relation to preclinical dementia | INTRODUCTIONTo study if declining cognition drives weight loss in preclinical dementia, we examined the longitudinal association between body mass index (BMI) and cognitive abilities in those who did or did not later develop dementia.
METHODSUsing data from individuals spanning age 50-89, we applied dual change score models separately in individuals who remained cognitively intact (n=1,498) and those who were diagnosed with dementia within five years of last assessment (n=459).
RESULTSAmong the cognitively intact, there was a bidirectional association: stable BMI predicted stable cognition and vice versa. Among those subsequently diagnosed with dementia, the association was unidirectional: higher BMI predicted declining cognition, but cognition did not predict change in BMI.
DISCUSSIONWhile BMI and cognition stabilized each other when cognitive functioning was intact, this buffering effect was missing in the preclinical dementia phase. This finding indicates that weight loss in preclinical dementia is not driven by declining cognition. | epidemiology |
10.1101/2021.02.01.21250739 | Comparing effect estimates in randomized trials and observational studies from the same population: an application to percutaneous coronary intervention | BackgroundThe ability of real-world data to deliver results similar to those of a trial asking the same question about the risks or benefits of a clinical intervention can be restricted not only by lack of randomization, but also by limited information on eligibility criteria and outcomes. To understand when results from observational studies and randomized trials are comparable, we carried out an observational emulation of a target trial designed to ask questions similar to those of the VALIDATE randomized trial. VALIDATE compared the effect of bivalirudin and heparin during percutaneous coronary intervention on the risk of death, myocardial infarction, and bleeding across Sweden.
MethodsWe specified the protocol of a target trial similar to the VALIDATE trial protocol, then emulated the target trial in the period before the trial took place using data from the SWEDEHEART registry; the same registry in which the trial was undertaken.
ResultsThe target trial emulation and the VALIDATE trial both estimated no difference in the effect of bivalirudin and heparin on the risk of death or myocardial infarction by 180 days: emulation risk ratio for death 1.21 (0.88, 1.54); VALIDATE hazard ratio for death 1.05 (0.78, 1.41). The observational data, however, could not capture less severe cases of bleeding, resulting in an inability to define a bleeding outcome like the trial, and could not account for intractable confounding early in follow-up (risk ratio for death by 14 days 1.85 (0.95, 3.63)).
ConclusionUsing real world data to emulate a target trial can deliver accurate long-term effect estimates. Yet, even with rich observational data, it is not always possible to estimate the short-term effect of interventions, or the effect on outcomes for which data are not routinely collected. If registries included information on reasons for treatment decisions, researchers may be better positioned to identify important confounders. | epidemiology |
10.1101/2021.02.01.21250609 | Forskolin-induced swelling of intestinal organoids predicts long-term cystic fibrosis disease progression | Patient-derived organoids hold great potential as predictive biomarker for disease expression or therapeutic response. Here, we used intestinal organoids to estimate individual cystic fibrosis transmembrane conductance regulator (CFTR) function of people with cystic fibrosis, a monogenic life-shortening disease associated with more than 2000 CFTR mutations and highly variable disease progression. In vitro CFTR function in CF intestinal organoids of 176 individuals with diverse CFTR mutations was quantified by forskolin induced swelling and was strongly associated with longitudinal changes of lung function and development of pancreatic insufficiency, CF-related liver disease and diabetes. This association was not observed when the commonly used biomarker of CFTR function sweat chloride concentration was used. The data strongly exemplifies the value of an organoid-based biomarker in a clinical disease setting and supports the prognostic value of forskolin induced swelling of intestinal organoids, especially for people with CF who have rare CFTR genotypes with unclear clinical consequences. | respiratory medicine |
10.1101/2021.02.01.21250623 | Differential expression of Angiotensin-Converting Enzyme 2 in Nasal Tissue of Patients with Chronic Rhinosinusitis with Nasal Polyps | The coronavirus disease 2019 (COVID-19) caused by the severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) became a pandemic and a global health emergency. The SARS-CoV-2 receptor angiotensin-converting enzyme 2 (ACE2) is highly expressed in nasal epithelial cells and plays a major role in cellular entry leading to infection. High expression of ACE2 has been suggested to be a potential risk factor for virus infection and disease severity. However, the profile of ACE2 gene expression in diseases of the upper airways remains poorly understood. We herein investigated ACE2 gene expression in the nasal tissues of a cohort of Swedish patients with chronic rhinosinusitis with nasal polyps (CRSwNPs) using RT-qPCR. ACE2 mRNA expression was significantly reduced in the nasal mucosa of CRSwNP patients compared to that of controls. Moreover, we observed a sex-dependent difference in nasal ACE2 expression, where significantly lower levels of the ACE2 transcript were detected in the nasal mucosa of only female CRSwNP patients. These findings indicate that CRSwNP patients with decreased ACE2 gene expression may thereby be less prone to infection by SARS-CoV-2. These results enhance our understanding of the profile of ACE2 expression in the nasal mucosa of patients with upper airway diseases, and their susceptibility to infection with SARS-CoV-2. | respiratory medicine |
10.1101/2021.01.31.21250524 | Non-invasively measured brain activity and radiological progression in diffuse glioma | Non-invasively measured brain activity is related to progression-free survival in glioma patients, suggesting its potential as a marker of glioma progression. We therefore assessed the relationship between brain activity and increasing tumor volumes on routine clinical magnetic resonance imaging (MRI) in glioma patients. Postoperative magnetoencephalography (MEG) was recorded in 45 diffuse glioma patients. Brain activity was estimated using three measures (absolute broadband power, offset and slope) calculated at three spatial levels: global average, averaged across the peritumoral areas, and averaged across the homologues of these peritumoral areas in the contralateral hemisphere. Tumors were segmented on MRI. Changes in tumor volume between the two scans surrounding the MEG were calculated and correlated with brain activity. Brain activity was compared between patient groups classified into having increasing or stable tumor volume. Results show that brain activity was significantly increased in the tumor hemisphere in general, and in peritumoral regions specifically. However, none of the measures and spatial levels of brain activity correlated with changes in tumor volume, nor did they differ between patients with increasing versus stable tumor volumes. Longitudinal studies in more homogeneous subgroups of glioma patients are necessary to further explore the clinical potential of non-invasively measured brain activity. | oncology |
10.1101/2021.02.01.21250900 | RAY: CRISPR diagnostic for rapid and accurate detection of SARS-CoV2 variants on a paper strip | The COVID-19 pandemic originating in Wuhan, China in late 2019 has impacted global health, causing increased mortality among elderly patients and individuals with comorbid conditions. During the passage of the virus through affected populations, it has undergone mutations, some of which have recently been linked with increased viral load and prognostic complexities. Interestingly, several of these variants are point mutations that are difficult to diagnose using the gold standard quantitative real-time PCR (qPCR) method. This necessitates widespread sequencing, which is expensive, has long turn-around times, and requires high viral load for calling mutations accurately. In this study, we show that the high specificity of Francisella novicida Cas9 (FnCas9) to point mismatches can be successfully adapted for the simultaneous detection of SARS-CoV2 infection as well as for detecting point mutations in the sequence of the virus obtained from patient samples. We report the detection of the mutation N501Y (earlier shown to be present in the British N501Y.V1, South African N501Y.V2, and Brazilian N501Y.V3 variants of SARS-CoV2) within an hour using paper strip chemistry. The results were corroborated using deep sequencing. Our design principle can be rapidly adapted for other mutations, highlighting the advantages of quick optimization and roll-out of CRISPR diagnostics (CRISPRDx) for disease surveillance even beyond COVID-19. | infectious diseases |
10.1101/2021.02.01.21250898 | Antibody seroprevalence and rate of asymptomatic infections with SARS-CoV-2 in Austrian hospital personnel. | ContextOn March 11, the World Health Organization (WHO) announced the current corona virus disease 2019 (COVID-19) outbreak as a pandemic. The first laboratory-confirmed case of COVID-19 in Austria was announced on February 27, 2020. Since then, the incidence of infection followed an exponential increase until a complete lockdown in March 2020. Thereafter easing of restrictions was gradually introduced and until mid-August daily infections remained mostly below 5 per 100.000 population.
ObjectivesThe aims of this study are to determine i) how many employees in Austrian trauma hospitals and rehabilitation facilities have virus-specific IgG and IgM, and/or neutralizing antibodies against SARS-CoV-2, ii) how many are active virus carriers (symptomatic and asymptomatic) during the study, iii) the antibody decline in seropositive subjects over a period of around six months, and iv) the utility of rapid antibody tests for outpatient screening.
Study DesignOpen uncontrolled observational cross-sectional study.
Setting/ParticipantsA total of 3301 employees in 11 Austrian trauma hospitals and rehabilitation facilities of the Austrian Social Insurance for Occupational Risks (AUVA) participated in the study.
Study Interventions and MeasuresRapid antibody tests for SARS-CoV-2 specific IgG and IgM antibodies, RT-PCR tests based on oropharyngeal swab samples, and laboratory-based antibody tests using ELISA/PRNT were performed. The tests were conducted twice, with an interval of 42.4±7.7 (Min=30, Max=64) days. Additionally, participants filled out a questionnaire covering personal health, traveling activities, and living situation, as well as symptoms and comorbidities. Participants who tested antibody positive were re-tested with ELISA/PRNT tests at a third time point, on average 188.0±12.8 days after their initial test.
ResultsIn our study cohort, only 27 out of 3301 participants (0.81%) had a positive antibody test at any time point during the study confirmed via neutralization test. Among participants who had positive test results in either of the antibody tests, 50.4% did not report any symptoms consistent with common manifestations of COVID-19 during the study period or within the preceding six weeks. In the group who tested positive during or prior to study inclusion the most common symptoms of an acute viral illness were rhinitis (21.9%), and loss of taste and olfactory sense (21.9%).
The rapid antibody test was generally more sensitive on serum (sensitivity=86.6%) than on whole blood (sensitivity=65.4%). Of the two ELISA tests, the Roche test detected 24 positive participants (sensitivity=88.9%) and the Diasorin test 22 (sensitivity=81.5%).
In participants with a positive PRNT, a significant decrease in PRNT concentration was observed, from 31.8±22.9 (Md=32.0) at T1 to 26.1±17.6 (Md=21.3) at T2 to 21.4±13.4 (Md=16.0) at T3 (χ²=23.848, df=2, p<0.001), with an average interval of 42.4±7.7 days between T1 and T2 and 146.9±13.8 days between T2 and T3.
ConclusionsDuring the study period (May 11th - December 21st), only 0.81% of our study cohort tested positive for antibodies. Antibody concentrations decreased significantly over time, with 14.8% (4 out of 27) losing detectable antibodies. | infectious diseases |
10.1101/2021.02.01.21250951 | Diffusion kurtosis imaging of white matter in bipolar disorder | ObjectivesWhite matter pathology is thought to contribute to the pathogenesis of bipolar disorder (BD). However, most studies of white matter in BD have used the simple diffusion tensor imaging (DTI) model, which has several limitations. DTI studies have reported heterogenous results, leading to a lack of consensus about the extent and location of white matter alterations. Here, we applied two advanced diffusion magnetic resonance imaging (MRI) techniques to investigate white matter microstructure in BD.
MethodsTwenty-five patients with BD and 24 controls comparable for age and sex were included in the study. Whole-brain voxel-based analysis (VBA) and a network-based connectivity approach using constrained spherical deconvolution (CSD)-tractography were used to assess group differences in diffusion kurtosis imaging (DKI) and DTI metrics.
ResultsVBA showed lower mean kurtosis in the corona radiata and posterior association fibers in BD following threshold-free cluster enhancement. Regional differences in connectivity were indicated by lower mean kurtosis and kurtosis anisotropy in streamlines traversing the temporal and occipital lobes, and lower mean axial kurtosis in the right cerebellar, thalamo-subcortical pathways in BD. Significant differences were not seen in the DTI metrics following FDR correction.
ConclusionsDifferences between BD and controls were observed in DKI metrics in multiple brain regions, indicating altered connectivity across cortical, subcortical and cerebellar areas. DKI was more sensitive than DTI at detecting these differences, suggesting that DKI is useful for investigating white matter in BD. | psychiatry and clinical psychology |
10.1101/2021.02.01.21250929 | One Year of Evidence on Mental Health in the COVID-19 Crisis - A Systematic Review and Meta-Analysis | This paper provides a systematic review and meta-analysis of the prevalence rates of mental health issues among the general population and general and frontline healthcare workers (HCWs) in China over one year of the COVID-19 crisis. We systematically searched PubMed, Embase, Web of Science, and Medrxiv on November 16th, 2020, pooled data using random-effects meta-analyses to estimate the prevalence rates, and ran meta-regression to tease out the heterogeneity. The meta-regression results uncovered several predictors of the prevalence rates, including severity, type of mental issue, population, sampling location, and study quality. Pooled prevalence rates are significantly different from, yet largely between, the findings of previous meta-analyses, suggesting the results of our larger study are consistent with yet more accurate than the findings of the smaller, previous meta-analyses. The higher prevalence rates of distress and insomnia, and the higher rates among frontline HCWs, suggest future research and interventions should pay more attention to these outcomes and populations. Our findings suggest a need to examine prevalence rates at varying levels of severity. The one-year cumulative evidence on sampling locations (Wuhan vs. non-Wuhan) corroborates the typhoon eye effect theory.
Trial registrationCRD4202022059 | psychiatry and clinical psychology |
10.1101/2021.02.01.21250930 | A survey of International Health Regulations National Focal Points experiences in carrying out their functions | BackgroundThe 2005 International Health Regulations (IHR (2005)) require States Parties to establish National Focal Points (NFPs) responsible for notifying the World Health Organization (WHO) of potential events that might constitute public health emergencies of international concern (PHEICs), such as outbreaks of novel infectious diseases. Given the critical role of NFPs in the global surveillance and response system supported by the IHR, we sought to assess their experiences in carrying out their functions.
MethodsIn collaboration with WHO officials, we administered a voluntary online survey to all 196 States Parties to the IHR (2005) in Africa, Asia, Europe, and South and North America, from October to November 2019. The survey was available in six languages via a secure internet-based system.
ResultsIn total, 121 NFP representatives answered the 56-question survey; 105 in full, and an additional 16 in part, resulting in a response rate of 62% (121 responses to 196 invitations to participate). The majority of NFPs knew how to notify the WHO of a potential PHEIC, and believed they have the content expertise to carry out their functions. Respondents found training workshops organized by WHO Regional Offices helpful on how to report PHEICs. NFPs experienced challenges in four critical areas: 1) insufficient intersectoral collaboration within their countries, including limited access to, or a lack of cooperation from, key relevant ministries; 2) inadequate communications, such as deficient information technology systems in place to carry out their functions in a timely fashion; 3) lack of authority to report potential PHEICs; and 4) inadequacies in some resources made available by the WHO, including a key tool - the NFP Guide. Finally, many NFP representatives expressed concern about how WHO uses the information they receive from NFPs.
ConclusionOur study, conducted just prior to the COVID-19 pandemic, illustrates key challenges experienced by NFPs that can affect States Parties and WHO performance when outbreaks occur. In order for NFPs to be able to rapidly and successfully communicate potential PHEICs such as COVID-19 in the future, continued measures need to be taken by both WHO and States Parties to ensure NFPs have the necessary authority, capacity, training, and resources to effectively carry out their functions as described in the IHR. | public and global health |
10.1101/2021.02.01.21250919 | How lifestyle changes within the COVID-19 global pandemic have affected the pattern and symptoms of the menstrual cycle. | BackgroundThe coronavirus disease 2019 (COVID-19) pandemic has caused significant changes to home life, working life and stress. The purpose of this research was to investigate the implications the COVID-19 pandemic has had for the menstrual cycle and any factors contributing to these changes.
MethodsA questionnaire was completed by 749 participants, who ranged from physically active to elite in their training status. The questionnaire captured detail on menstrual cycle symptoms and characteristics prior to and during the COVID-19 pandemic lockdown period, as well as lifestyle, stress, exercise and nutrition. Descriptive statistics and frequency distributions were reported and decision tree analysis performed. Statistical significance was assumed at p<0.05.
ResultsFifty-two point six percent of females experienced a change in their menstrual cycle during the lockdown period. Psychosocial symptoms had changed in over half of all participants. Increased stress/worry about family and personal health was significantly associated with changes in menstrual symptoms. Similarly, job security stress was associated with increases in bleeding time (p<0.05).
ConclusionsIt is important that females and practitioners become aware of the implications of stressful environments and the possible long-term implications on fertility, particularly given the uncertainty around a second wave of the global pandemic. | sports medicine |
10.1101/2021.02.01.21250961 | Exempting low-risk health research from ethics reviews to better serve the interests of the patients and public: a qualitative analysis of survey responses | BackgroundWe conducted a survey to identify what types of health research could be exempted from research ethics reviews in Australia.
MethodsWe surveyed active Australian health researchers and members of Human Research Ethics Committees (HREC). We presented the respondents with eight hypothetical research scenarios, involving: N of 1 trials, no treatment studies, linked data sets, surplus samples, audits, surveys, interviews with patients, and professional opinion. We asked whether these scenarios should or should not be exempt from ethics review, and to provide (optional) explanations. We analysed the reasons thematically, to identify Top 3 reasons underlying the decisions.
ResultsThe most frequent reasons for requiring ethics reviews included: the need for independent oversight, privacy/confidentiality issues, review of scientific rigour, and publishing considerations. The most frequent reasons for exempting scenarios from reviews included: level of risk, study design, privacy/confidentiality issues, and standard clinical practice. Four research scenarios listed the same Top 3 reasons for requiring ethics reviews: need for independent oversight, review of scientific rigour, and privacy/confidentiality. Reasons for exempting were less uniform, but low risk was a Top 3 reason for 7 scenarios, and study design for 4 scenarios. Privacy/confidentiality was given as a Top 3 reason both for requiring ethics review and for exempting the same two scenarios.
ConclusionsThe most frequently offered reasons in support of requiring ethics reviews for research scenarios are more uniform than those for exempting them. However, considerable disagreement exists about when the risks of research are so minimal that the exemption is appropriate. | medical ethics |
10.1101/2021.02.01.21250864 | Improving the Prediction of Clinical Success Using Machine Learning | In pharmaceutical research, assessing a drug candidate's odds of success as it moves through clinical research often relies on crude methods based on historical data. However, the rapid progress of machine learning offers a new tool to identify the more promising projects. To evaluate its usefulness, we trained and validated several machine learning algorithms on a large database of projects. Using various project descriptors as input data, we were able to predict the clinical success and failure rates of projects with an average balanced accuracy of 83% to 89%, which compares favorably with the 56% to 70% balanced accuracy of the method based on historical data. We also identified the variables that contributed most to trial success and used the algorithm to predict the success (or failure) of assets currently in the industry pipeline. We conclude by discussing how pharmaceutical companies can use such a model to improve the quantity and quality of their new drugs, and how the broad adoption of this technology could reduce the industry's risk profile, with important consequences for industry structure, R&D investment, and the cost of innovation. | health informatics
10.1101/2021.02.01.21250924 | A claims-based score for the prediction of bleeding in a contemporary cohort of patients receiving oral anticoagulation for venous thromboembolism | BackgroundCurrent scores for bleeding risk assessment in patients with venous thromboembolism (VTE) undergoing oral anticoagulation (OAC) have limited predictive capacity. We developed and internally validated a bleeding prediction model using healthcare claims data.
Methods and ResultsWe selected patients with incident VTE in the 2011-2017 MarketScan databases initiating OAC. Hospitalized bleeding events were identified using validated algorithms in the 180 days after VTE diagnosis. We evaluated demographic factors, comorbidities, and medication use prior to OAC initiation as potential predictors of bleeding, using stepwise selection of variables in Cox models run on 1000 bootstrap samples of the patient population. Variables included in >60% of all models were selected for the final analysis. We internally validated the model using bootstrapping and correcting for optimism. We included 165,434 VTE patients initiating OAC, of whom 2,294 had a bleeding event. After the variable selection process, the final model included 20 terms (15 main effects and 5 interactions). The c-statistic for the final model was 0.68 (95% confidence interval [CI] 0.67-0.69). The internally validated c-statistic corrected for optimism was 0.68 (95% CI 0.67-0.69). For comparison, the c-statistic of the HAS-BLED score in this population was 0.62 (95% CI 0.61-0.63).
ConclusionWe have developed a novel model for bleeding prediction in VTE using large healthcare claims databases. Performance of the model was moderately good, highlighting the urgent need to identify better predictors of bleeding to inform treatment decisions. | hematology |
10.1101/2021.02.01.21250935 | The effect of mobility restrictions on the SARS-CoV-2 diffusion during the first wave: what are the impacts in Sweden, USA, France and Colombia. | AbstractCombined with sanitation and social distancing measures, control of human mobility was quickly targeted as a major lever to contain the spread of SARS-CoV-2 in a great majority of countries worldwide. The extent to which such measures were successful, however, is uncertain (Gibbs et al. 2020; Kraemer et al. 2020). Very few studies quantify the relation between mobility, lockdown strategies and the diffusion of the virus across different countries. Using anonymised data collected by one of the major social media platforms (Facebook), combined with spatial and temporal Covid-19 data, the objective of this research is to understand how mobility patterns and SARS-CoV-2 diffusion during the first wave were connected in four different countries: the west coast of the USA, Colombia, Sweden and France. Our analyses suggest a relatively modest impact of lockdown on the spread of the virus at the national scale. Despite the varying impact of lockdown on mobility reduction in these countries (83% in France and Colombia, 55% in the USA, 10% in Sweden), no country successfully implemented control measures to stem the spread of the virus. As observed in Hubei (Chinazzi et al. 2020), it is likely that the virus had already spread very widely prior to lockdown; the number of affected administrative units in all countries was already very high at the time of lockdown, despite the low testing levels. The second conclusion is that the integration of mobility data considerably improved the epidemiological model (as revealed by the QAIC). If inter-individual contact is a fundamental element in the study of the spread of infectious diseases, it is also the case at the level of administrative units.
However, this relational dimension is little understood beyond the individual scale, mostly owing to the lack of mobility data at that scale. Fortunately, such data are increasingly provided by social media platforms and mobile operators, and they can help administrations observe changes in movement patterns and/or better locate where to implement disease control measures such as vaccination (Pollina & Busvine 2020; Pullano et al. 2020; Romm et al. 2020). | infectious diseases
10.1101/2021.02.01.21250306 | ABC2-SPH risk score for in-hospital mortality in COVID-19 patients: development, external validation and comparison with other available scores | ObjectiveTo develop and validate a rapid scoring system at hospital admission for predicting in-hospital mortality in patients hospitalized with coronavirus disease 2019 (COVID-19), and to compare this score with other existing ones.
DesignCohort study
SettingThe Brazilian COVID-19 Registry has been conducted in 36 Brazilian hospitals in 17 cities. Logistic regression analysis was performed to develop a prediction model for in-hospital mortality, based on the 3978 patients who were admitted between March and July 2020. The model was then validated in the 1054 patients admitted during August-September, as well as in an external cohort of 474 Spanish patients.
ParticipantsConsecutive symptomatic patients ([≥]18 years old) with laboratory-confirmed COVID-19 admitted to participating hospitals. Patients who were transferred between hospitals and for whom admission data from the first or the last hospital were not available were excluded, as well as those who were admitted for other reasons and developed COVID-19 symptoms during their stay.
Main outcome measuresIn-hospital mortality
ResultsMedian (25th-75th percentile) age of the model-derivation cohort was 60 (48-72) years, 53.8% were men, in-hospital mortality was 20.3%. The validation cohorts had similar age distribution and in-hospital mortality. From 20 potential predictors, seven significant variables were included in the in-hospital mortality risk score: age, blood urea nitrogen, number of comorbidities, C-reactive protein, SpO2/FiO2 ratio, platelet count and heart rate. The model had high discriminatory value (AUROC 0.844, 95% CI 0.829 to 0.859), which was confirmed in the Brazilian (0.859) and Spanish (0.899) validation cohorts. Our ABC2-SPH score showed good calibration in both Brazilian cohorts, but, in the Spanish cohort, mortality was somewhat underestimated in patients with very high (>25%) risk. The ABC2-SPH score is implemented in a freely available online risk calculator (https://abc2sph.com/).
ConclusionsWe designed and validated an easy-to-use rapid scoring system based on characteristics of COVID-19 patients commonly available at hospital presentation, for early stratification for in-hospital mortality risk of patients with COVID-19.
Summary boxes
What is already known on this topic?
- Rapid scoring systems may be very useful for fast and effective assessment of COVID-19 patients in the emergency department.
- The majority of available scores have a high risk of bias and lack benefit to clinical decision making.
- Derivation and validation studies in low- and middle-income countries, including Latin America, are scarce.

What this study adds
- ABC2-SPH employs seven well-defined variables, routinely assessed upon hospital presentation: age, number of comorbidities, blood urea nitrogen, C-reactive protein, SpO2/FiO2 ratio, platelets and heart rate.
- This easy-to-use risk score identified four categories at increasing risk of death with a high level of accuracy, and displayed better discrimination ability than other existing scores.
- A free web-based calculator is available and may help healthcare practitioners to estimate the expected risk of mortality for patients at hospital presentation. | infectious diseases
10.1101/2021.02.01.21250765 | A Rapid Realist Review of the Role of Community Pharmacy in the Public Health Response to COVID-19 | IntroductionCommunity pharmacists and their teams have remained accessible to the public providing essential services despite immense pressures during the COVID-19 pandemic. They have successfully expanded the influenza vaccination programme and are now supporting the delivery of the COVID-19 vaccination roll-out.
AimThis rapid realist review aims to understand how community pharmacy can most effectively deliver essential and advanced services, with a focus on vaccination, during the pandemic and in the future.
MethodAn embryonic programme theory was generated using four diverse and complementary documents along with the expertise of the project team. Academic databases, preprint services and grey literature were searched and screened for documents meeting our inclusion criteria. Data were extracted from 103 documents to develop and refine a programme theory using a realist logic of analysis. Our analysis generated 13 context-mechanism-outcome configurations explaining when, why and how community pharmacy can support public health vaccination campaigns, maintain essential services during pandemics, and capitalise on opportunities for expanded, sustainable public health service roles. The views of stakeholders, including pharmacy users, pharmacists, pharmacy teams and other healthcare professionals, were sought throughout to refine the 13 explanatory configurations.
ResultsThe 13 context-mechanism-outcome configurations are organised according to decision makers, community pharmacy teams and community pharmacy users as key actors. Review findings include: supporting a clear role for community pharmacies in public health; clarifying pharmacists' legal and professional liabilities; involving pharmacy teams in service specification design; providing suitable guidance, adequate compensation and resources; and leveraging the accessible, convenient locations of community pharmacies.
DiscussionCommunity pharmacy has been able to offer key services during the pandemic. Decision makers must endorse, articulate and support a clear public health role for community pharmacy. We provide key recommendations for decision makers to optimise such a role during these unprecedented times and in the future. | infectious diseases |
10.1101/2021.02.01.20232785 | Increased circulating levels of angiotensin-(1-7) in severely ill COVID-19 patients | The mono-carboxypeptidase Angiotensin-Converting Enzyme 2 (ACE2) is an important player in the renin-angiotensin system (RAS). ACE2 is also the receptor for SARS-CoV-2, the new coronavirus that causes COVID-19. It has been hypothesized that, following SARS-CoV-2/ACE2 internalization, Ang II levels would increase in parallel with a decrease in Ang-(1-7) in COVID-19 patients. In this preliminary report, we analyzed the plasma levels of angiotensin peptides in 19 severe COVID-19 patients and 19 non-COVID-19 volunteers to assess potential outcome associations. Unexpectedly, a significant increase in circulating Ang-(1-7) and a lower Ang II plasma level were found in critically ill COVID-19 patients. Accordingly, an increased Ang-(1-7)/Ang II ratio was observed in COVID-19, suggesting a RAS dysregulation toward an increased formation of Ang-(1-7) in these patients. | infectious diseases