Pre‐eclampsia (PE) is a multisystem disorder that affects 2%–5% of pregnant women and is one of the leading causes of maternal and perinatal morbidity and mortality, especially when the condition is of early onset. Globally, 76 000 women and 500 000 babies die each year from this disorder. Furthermore, women in low‐resource countries are at higher risk of developing PE than those in high‐resource countries.
Although a complete understanding of the pathogenesis of PE remains unclear, the current theory suggests a two‐stage process. The first stage is caused by shallow invasion of the trophoblast, resulting in inadequate remodeling of the spiral arteries. This is presumed to lead to the second stage, which involves the maternal response to endothelial dysfunction and imbalance between angiogenic and antiangiogenic factors, resulting in the clinical features of the disorder.
Accurate prediction and uniform prevention continue to elude us. The quest to effectively predict PE in the first trimester of pregnancy is fueled by the desire to identify women who are at high risk of developing PE, so that necessary measures can be initiated early enough to improve placentation and thus prevent or at least reduce the frequency of its occurrence. Furthermore, identification of an “at risk” group will allow tailored prenatal surveillance to anticipate and recognize the onset of the clinical syndrome and manage it promptly.
PE has been previously defined as the onset of hypertension accompanied by significant proteinuria after 20 weeks of gestation. Recently, the definition of PE has been broadened. Now the internationally agreed definition of PE is the one proposed by the International Society for the Study of Hypertension in Pregnancy (ISSHP).
According to the ISSHP, PE is defined as systolic blood pressure ≥140 mm Hg and/or diastolic blood pressure ≥90 mm Hg on at least two occasions measured 4 hours apart in a previously normotensive woman, accompanied by one or more of the following new‐onset conditions at or after 20 weeks of gestation:
1. Proteinuria (i.e. ≥30 mg/mol protein:creatinine ratio; ≥300 mg/24 h; or ≥2+ on dipstick);
2. Evidence of other maternal organ dysfunction, including: acute kidney injury (creatinine ≥90 μmol/L; 1 mg/dL); liver involvement (elevated transaminases, e.g. alanine aminotransferase or aspartate aminotransferase >40 IU/L), with or without right upper quadrant or epigastric abdominal pain; neurological complications (e.g. eclampsia, altered mental status, blindness, stroke, clonus, severe headaches, and persistent visual scotomata); or hematological complications (thrombocytopenia [platelet count <150 000/μL], disseminated intravascular coagulation, or hemolysis); or
3. Uteroplacental dysfunction (such as fetal growth restriction, abnormal umbilical artery Doppler waveform analysis, or stillbirth).
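For illustration only, the diagnostic logic above can be sketched as a simple rule check. This is a hypothetical helper, not a clinical tool; the function and parameter names are ours, and the 4-hour spacing between blood pressure readings is assumed to be handled by the caller.

```python
def meets_isshp_criteria(bp_readings, gestational_weeks,
                         proteinuria=False,
                         organ_dysfunction=False,
                         uteroplacental_dysfunction=False):
    """Return True when the ISSHP definition sketched above is met.

    bp_readings: (systolic, diastolic) pairs for a previously
    normotensive woman, measured at least 4 hours apart (spacing
    assumed enforced by the caller in this sketch).
    """
    # Hypertension: at least two readings at or above 140/90.
    hypertensive_readings = sum(
        1 for systolic, diastolic in bp_readings
        if systolic >= 140 or diastolic >= 90
    )
    # At least one of the three new-onset condition groups (1)-(3).
    new_onset_condition = (proteinuria or organ_dysfunction
                           or uteroplacental_dysfunction)
    return (hypertensive_readings >= 2
            and gestational_weeks >= 20
            and new_onset_condition)
```

Note that hypertension alone, or onset before 20 weeks, does not satisfy the definition.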
It is well established that a number of maternal risk factors are associated with the development of PE: advanced maternal age; nulliparity; previous history of PE; short and long interpregnancy interval; use of assisted reproductive technologies; family history of PE; obesity; Afro‐Caribbean and South Asian racial origin; co‐morbid medical conditions including hyperglycemia in pregnancy; pre‐existing chronic hypertension; renal disease; and autoimmune diseases, such as systemic lupus erythematosus and antiphospholipid syndrome. These risk factors have been described by various professional organizations for the identification of women at risk of PE; however, this approach to screening is inadequate for effective prediction of PE.
PE can be subclassified into:
1. Early‐onset PE (with delivery at <34+0 weeks of gestation);
2. Preterm PE (with delivery at <37+0 weeks of gestation);
3. Late‐onset PE (with delivery at ≥34+0 weeks of gestation);
4. Term PE (with delivery at ≥37+0 weeks of gestation).
These subclassifications are not mutually exclusive. Early‐onset PE is associated with a much higher risk of short‐ and long‐term maternal and perinatal morbidity and mortality.
Obstetricians managing women with preterm PE face the challenge of balancing the need to achieve fetal maturation in utero against the risks to the mother and fetus of prolonging the pregnancy. These risks include progression to eclampsia, placental abruption, and HELLP (hemolysis, elevated liver enzymes, low platelets) syndrome. On the other hand, preterm delivery is associated with higher infant mortality rates and with increased morbidity from small for gestational age (SGA), thrombocytopenia, bronchopulmonary dysplasia, cerebral palsy, and an increased risk of various chronic diseases in adult life, particularly type 2 diabetes, cardiovascular disease, and obesity. Women who have experienced PE may also face additional health problems later in life, as the condition is associated with an increased risk of death from future cardiovascular disease, hypertension, stroke, renal impairment, metabolic syndrome, and diabetes. The life expectancy of women who develop preterm PE is reduced on average by 10 years. There is also a significant long‐term impact on infants born to pre‐eclamptic women, including increased risks of insulin resistance, diabetes mellitus, coronary artery disease, and hypertension.
The International Federation of Gynecology and Obstetrics (FIGO) brought together international experts to discuss and evaluate current knowledge on PE and develop a document to frame the issues and suggest key actions to address the health burden posed by PE.
FIGO's objectives, as outlined in this document, are: (1) to raise awareness of the links between PE and poor maternal and perinatal outcomes, as well as the future health risks to mother and offspring, and to demand a clearly defined global health agenda to tackle this issue; and (2) to create a consensus document that provides guidance for first‐trimester screening and prevention of preterm PE, and to disseminate and encourage its use.
Based on high‐quality evidence, the document outlines current global standards for first‐trimester screening and prevention of preterm PE, in line with FIGO good clinical practice advice on first‐trimester screening and prevention of pre‐eclampsia in singleton pregnancy.1
It provides both the best and the most pragmatic recommendations according to the level of acceptability, feasibility, and ease of implementation that have the potential to produce the most significant impact in different resource settings. Suggestions are provided for a variety of different regional and resource settings based on their financial, human, and infrastructure resources, as well as for research priorities to bridge the current knowledge and evidence gap.
To deal with the issue of PE, FIGO recommends the following:
Public health focus: There should be greater international attention given to PE and to the links between maternal health and noncommunicable diseases (NCDs) on the Sustainable Development Goals agenda. Public health measures to increase awareness, access, affordability, and acceptance of preconception counselling, and of prenatal and postnatal services for women of reproductive age, should be prioritized. Greater efforts are required to raise awareness of the benefits of early prenatal visits targeted at reproductive‐aged women, particularly in low‐resource countries.
Universal screening: All pregnant women should be screened for preterm PE during early pregnancy by the first‐trimester combined test with maternal risk factors and biomarkers as a one‐step procedure. The risk calculator is available free of charge at https://fetalmedicine.org/research/assess/preeclampsia. FIGO encourages all countries and its member associations to adopt and promote strategies to ensure this. The best combined test is one that includes maternal risk factors, measurements of mean arterial pressure (MAP), serum placental growth factor (PLGF), and uterine artery pulsatility index (UTPI). Where it is not possible to measure PLGF and/or UTPI, the baseline screening test should be a combination of maternal risk factors with MAP, and not maternal risk factors alone. If maternal serum pregnancy‐associated plasma protein A (PAPP‐A) is measured for routine first‐trimester screening for fetal aneuploidies, the result can be included for PE risk assessment. Variations to the full combined test would lead to a reduction in screening performance. A woman is considered at high risk when the risk from the first‐trimester combined test with maternal risk factors, MAP, PLGF, and UTPI is 1 in 100 or more.
Contingent screening: Where resources are limited, routine screening for preterm PE by maternal factors and MAP in all pregnancies and reserving measurements of PLGF and UTPI for a subgroup of the population (selected on the basis of the risk derived from screening by maternal factors and MAP) can be considered.
Prophylactic measures: Following first‐trimester screening for preterm PE, women identified as high risk should receive aspirin prophylaxis at a dose of ~150 mg, commenced at 11–14+6 weeks of gestation and taken every night until 36 weeks of gestation, until delivery, or until PE is diagnosed, whichever occurs first. Low‐dose aspirin should not be prescribed to all pregnant women. In women with low calcium intake (<800 mg/d), either calcium replacement (≤1 g elemental calcium/d) or calcium supplementation (1.5–2 g elemental calcium/d) may reduce the burden of both early‐ and late‐onset PE.
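The screening cutoff (risk of 1 in 100 or more) and the prophylaxis rules above can be combined into a toy triage sketch. The function and its return shape are hypothetical; the actual risk value would come from the combined-test calculator linked above, not from this code.

```python
def triage(combined_test_risk, calcium_intake_mg_per_day):
    """Toy triage following the FIGO recommendations above.

    combined_test_risk: probability of preterm PE from the
    first-trimester combined test (e.g. 0.02 means 1 in 50).
    """
    advice = []
    # FIGO high-risk cutoff: risk of 1 in 100 or more.
    high_risk = combined_test_risk >= 1 / 100
    if high_risk:
        # ~150 mg nightly from 11-14+6 weeks until 36 weeks,
        # delivery, or PE diagnosis, whichever comes first.
        advice.append("aspirin ~150 mg nightly")
    if calcium_intake_mg_per_day < 800:
        advice.append("calcium replacement or supplementation")
    return high_risk, advice
```

Note that the calcium recommendation is independent of the screening result, while aspirin prophylaxis is reserved for the high-risk group.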
SUMMARY
Soybean (Glycine max L. Merr.) is a major crop in animal feed and human nutrition, mainly for its rich protein and oil contents. The remarkable rise in soybean transcriptome studies over the past 5 years has generated an enormous amount of RNA‐seq data, encompassing various tissues, developmental conditions and genotypes. In this study, we have collected data from 1298 publicly available soybean transcriptome samples, processed the raw sequencing reads and mapped them to the soybean reference genome in a systematic fashion. We found that 94% of the annotated genes (52 737/56 044) had detectable expression in at least one sample. Unsupervised clustering revealed three major groups, comprising samples from aerial, underground and seed/seed‐related parts. We found 452 genes with uniform and constant expression levels, supporting their roles as housekeeping genes. On the other hand, 1349 genes showed heavily biased expression patterns towards particular tissues. A transcript‐level analysis revealed that 95% (70 963 of 74 490) of the assembled transcripts have intron chains exactly matching those from known transcripts, whereas 3256 assembled transcripts represent potentially novel splicing isoforms. The dataset compiled here constitutes a new resource for the community, which can be downloaded or accessed through a user‐friendly web interface at http://venanciogroup.uenf.br/resources/. This comprehensive transcriptome atlas will likely accelerate research on soybean genetics and genomics.
Significance Statement
Here we report an integrative and systematic analysis of 1298 RNA‐Seq samples to build a soybean gene expression atlas. This resource is accessible via a user‐friendly web interface as well as available for download.
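The housekeeping-versus-tissue-biased contrast described above is often quantified with the tissue-specificity index tau. The atlas abstract does not state the exact criteria used for its 452 and 1349 gene sets, so the sketch below shows only the standard tau metric as one plausible approach.

```python
def tau(expression):
    """Tissue-specificity index tau: 0 for perfectly uniform,
    housekeeping-like profiles, 1 for single-tissue expression.

    expression: non-negative mean expression values, one per
    tissue/sample group (at least two groups).
    """
    peak = max(expression)
    if peak == 0:
        return 0.0  # gene not expressed anywhere
    n = len(expression)
    # Average shortfall of each tissue relative to the peak tissue.
    return sum(1 - x / peak for x in expression) / (n - 1)
```

Genes with tau near 0 behave like constant-expression housekeeping candidates, while tau near 1 corresponds to heavily tissue-biased expression.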
Objective
To assess the utility of placental growth factor (PlGF) levels and the soluble fms‐like tyrosine kinase‐1/placental growth factor (sFlt‐1/PlGF) ratio to predict preterm birth (PTB) for infants with fetal growth restriction (FGR) and those appropriate for gestational age (AGA).
Design
Prospective, observational cohort study.
Setting
Tertiary maternity hospital in Australia.
Population
There were 320 singleton pregnancies: 141 (44.1%) AGA, 83 (25.9%) early FGR (<32+0 weeks) and 109 (30.0%) late FGR (≥32+0 weeks).
Methods
Maternal serum PlGF and the sFlt‐1/PlGF ratio were measured at 4‐weekly intervals from recruitment to delivery. Low maternal PlGF was defined as <100 ng/L, and an elevated sFlt‐1/PlGF ratio as >5.78 before 28 weeks and >38 at or after 28 weeks. Cox proportional hazards models were used. The analysis period was defined as the time from the first measurement of PlGF and the sFlt‐1/PlGF ratio to the time of birth or censoring.
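The gestational-age-specific cutoffs in the Methods can be expressed directly as predicates. These helper names are ours, introduced only to make the thresholds explicit.

```python
def elevated_sflt1_plgf_ratio(ratio, gestational_weeks):
    """Elevated sFlt-1/PlGF ratio per the study definitions:
    >5.78 before 28 weeks, >38 at or after 28 weeks."""
    cutoff = 5.78 if gestational_weeks < 28 else 38.0
    return ratio > cutoff

def low_plgf(plgf_ng_per_l):
    """Low maternal PlGF, defined as <100 ng/L."""
    return plgf_ng_per_l < 100
```

The same ratio can therefore be classified differently on either side of 28 weeks, which is why the gestational age must accompany every measurement.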
Main outcome measures
The primary study outcome was overall PTB. The relative risks (RR) of birth within 1, 2 and 3 weeks and for medically indicated and spontaneous PTB were also ascertained.
Results
The early FGR cohort had lower median PlGF levels (54 versus 229 ng/L, p < 0.001), higher median sFlt‐1 levels (2774 versus 2096 ng/L, p < 0.001) and a higher median sFlt‐1/PlGF ratio (35 versus 10, p < 0.001). Both PlGF <100 ng/L and an elevated sFlt‐1/PlGF ratio were strongly predictive of PTB, as well as of PTB within 1, 2 and 3 weeks of diagnosis. For both the FGR and AGA groups, PlGF <100 ng/L or a raised sFlt‐1/PlGF ratio was strongly associated with increased risk of medically indicated PTB. The highest RR was seen in the FGR cohort when PlGF was <100 ng/L (RR 35.20, 95% CI 11.48–175.46).
Conclusions
Low maternal PlGF levels and elevated sFlt‐1/PlGF ratio are potentially useful to predict PTB in both FGR and AGA pregnancies.
Soybean is a crucial crop worldwide, used as a source of food, feed, and industrial products due to its high protein and oil content. Previously, the rapid accumulation of soybean RNA-seq data in public databases and the computational challenges of processing raw RNA-seq data motivated us to develop the Soybean Expression Atlas, a gene expression database of over a thousand RNA-seq samples. Over the past few years, our database has allowed researchers to explore the expression profiles of important gene families, discover genes associated with agronomic traits, and understand the transcriptional dynamics of cellular processes. Here, we present the Soybean Expression Atlas v2, an updated version of our database with a fourfold increase in the number of samples, featuring transcript- and gene-level transcript abundance matrices for 5481 publicly available RNA-seq samples. New features in our database include the availability of transcript-level abundance estimates and equivalence classes to explore differential transcript usage, abundance estimates in bias-corrected counts to increase the accuracy of differential gene expression analyses, a new web interface with improved data visualization and user experience, and a reproducible and scalable pipeline available as an R package. The Soybean Expression Atlas v2 is available at https://soyatlas.venanciogroup.uenf.br/, and it will accelerate soybean research, empowering researchers with high-quality and easily accessible gene expression data.
Drylands are predicted to become more arid and saline due to increasing global temperature and drought. Although species from the Caatinga, a Brazilian tropical dry forest, are tolerant to these conditions, the capacity of germination to withstand the extreme soil temperatures and water deficits associated with climate change remains to be quantified. We aimed to evaluate how germination will be affected under future climate change scenarios of limited water and increased temperature. Seeds of three species were germinated at different temperatures and osmotic potentials. Thermal time and hydrotime model parameters were established and thresholds for germination calculated. Germination performance in 2055 was predicted by combining temperature and osmotic/salt stress thresholds, considering soil temperature and moisture following rainfall events. The most pessimistic climate scenario predicts an increase of 3.9 °C in soil temperature and a 30% decrease in rainfall. Under this scenario, soil temperature is never lower than the minimum and seldom higher than the maximum temperature threshold for germination. As long as the soil moisture requirement (0.139 cm³ cm⁻³) is met, germination can be achieved in 1 day. According to the base water potential and soil characteristics, the minimum weekly rainfall for germination is estimated to be 17.5 mm. Currently, the required minimum rainfall occurs in 14 weeks of the year, but this will be reduced to 4 weeks by 2055. This may not be sufficient for seedling recruitment of some species in the natural environment. Thus, in future climate scenarios, rainfall rather than temperature will be extremely limiting for seed germination.
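The thermal time and hydrotime models referred to above are conventionally written as follows. This is a standard textbook formulation with generic symbols, not parameter values from this study: $t_g$ is the time to germination of seed fraction $g$, $T_b$ the base temperature, $\Psi$ the ambient water potential, and $\Psi_b(g)$ the base water potential of fraction $g$.

```latex
% Thermal time: a constant degree-time sum is accumulated above the
% base temperature until fraction g germinates.
\theta_T = (T - T_b)\, t_g
% Hydrotime: an analogous MPa-time sum is accumulated above the base
% water potential; no germination occurs when \Psi \le \Psi_b(g).
\theta_H = \bigl(\Psi - \Psi_b(g)\bigr)\, t_g, \qquad \Psi > \Psi_b(g)
```

Under these models, warmer soil (larger $T - T_b$) or wetter soil (larger $\Psi - \Psi_b$) shortens $t_g$, which is why germination can complete within a day once the moisture requirement is met.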
Virtual reality is increasingly being utilized by clinicians to facilitate analgesia and anxiolysis within an inpatient setting. There is, however, a lack of a clinically relevant review to guide its use for this purpose.
To systematically review the current evidence for the efficacy of virtual reality as an analgesic in the management of acute pain and anxiolysis in an inpatient setting.
A comprehensive search was conducted up to and including January 2019 on PubMed, Ovid MEDLINE, EMBASE, and the Cochrane Database of Systematic Reviews according to PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines. Search terms included virtual reality, vr, and pain. Primary articles with a focus on acute pain in the clinical setting were considered for the review. Primary outcome measures included degree of analgesia afforded by virtual reality therapy, degree of anxiolysis afforded by virtual reality therapy, effect of virtual reality on physiological parameters, side effects precipitated by virtual reality, virtual reality content type, and type of equipment utilized.
Eighteen studies were deemed eligible for inclusion in this systematic review; 67% (12/18) of studies demonstrated significant reductions in pain with the utilization of virtual reality; 44% (8/18) of studies assessed the effects of virtual reality on procedural anxiety, with 50% (4/8) of these demonstrating significant reductions; 28% (5/18) of studies screened for side effects with incidence rates of 0.5% to 8%; 39% (7/18) of studies evaluated the effects of virtual reality on autonomic arousal as a biomarker of pain, with 29% (2/7) demonstrating significant changes; 100% (18/18) of studies utilized a head mounted display to deliver virtual reality therapy, with 50% being in active form (participants interacting with the environment) and 50% being in passive form (participants observing the content only).
Available evidence suggests that virtual reality therapy can be applied to facilitate analgesia for acute pain in a variety of inpatient settings. Its effects, however, are likely to vary by patient population and indication. This highlights the need for individualized pilot testing of virtual reality therapy's effects for each specific clinical use case rather than generalizing its use for the broad indication of facilitating analgesia. In addition, virtual reality therapy has the added potential of concurrently providing procedural anxiolysis, thereby improving patient experience and cooperation, while being associated with a low incidence of side effects (nausea, vomiting, eye strain, and dizziness). Furthermore, findings indicated a head mounted display should be utilized to deliver virtual reality therapy in a clinical setting with a slight preference for active over passive virtual reality for analgesia. There, however, appears to be insufficient evidence to substantiate the effect of virtual reality on autonomic arousal, and this should be considered at best to be for investigational uses, at present.
The demand for processing ever-increasing amounts of genomic data has raised new challenges for the implementation of highly scalable and efficient computational systems. In this paper we propose SparkBLAST, a parallelization of a sequence alignment application (BLAST) that employs cloud computing for the provisioning of computational resources and Apache Spark as the coordination framework. As a proof of concept, some radionuclide-resistant bacterial genomes were selected for similarity analysis.
Experiments in Google and Microsoft Azure clouds demonstrated that SparkBLAST outperforms an equivalent system implemented on Hadoop in terms of speedup and execution times.
The superior performance of SparkBLAST is mainly due to the in-memory operations available through the Spark framework, consequently reducing the number of local I/O operations required for distributed BLAST processing.
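The scatter/gather pattern behind this kind of distributed BLAST can be illustrated by its first step: partitioning a multi-FASTA query into record-aligned chunks that independent workers process in parallel (in Spark, each chunk would typically be piped to a local BLAST binary via `RDD.pipe`). The function below is our own illustrative sketch, not SparkBLAST's actual code.

```python
def split_fasta(fasta_text, n_chunks):
    """Partition a multi-FASTA string into up to n_chunks chunks,
    never splitting a sequence record across chunks."""
    # Re-attach the '>' stripped by split(); drop the empty lead piece.
    records = [">" + rec for rec in fasta_text.split(">") if rec.strip()]
    chunks = [[] for _ in range(n_chunks)]
    # Round-robin assignment for rough load balance across workers.
    for i, rec in enumerate(records):
        chunks[i % n_chunks].append(rec)
    return ["".join(chunk) for chunk in chunks if chunk]
```

Because BLAST hits for one query record are independent of other records, the per-chunk outputs can simply be concatenated in the gather phase.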
Summary
Hybridization, the process of crossing individuals from diverse genetic backgrounds, plays a pivotal role in evolution, biological invasiveness, and crop breeding.
At the transcriptional level, hybridization often leads to complex nonadditive effects, presenting challenges for understanding its consequences. Although standard transcriptomic analyses exist to compare hybrids to their progenitors, such analyses have not been implemented in a software package, hindering reproducibility.
We introduce hybridexpress, an R/Bioconductor package designed to facilitate the analysis, visualization, and comparison of gene expression patterns in hybrid triplets (hybrids and their progenitors). hybridexpress provides a user‐friendly and comprehensive workflow that includes all standard comparative analysis steps: data normalization, calculation of midparent expression values, sample clustering, expression‐based classification of genes into categories and classes, and overrepresentation analysis of functional terms.
We illustrate the utility of hybridexpress through comparative transcriptomic analyses of cotton allopolyploidization and rice root trait heterosis. hybridexpress is designed to streamline comparative transcriptomic studies of hybrid triplets, advancing our understanding of evolutionary dynamics in allopolyploids, and enhancing plant breeding strategies. hybridexpress is freely accessible from Bioconductor (https://bioconductor.org/packages/HybridExpress) and its source code is available on GitHub (https://github.com/almeidasilvaf/HybridExpress).
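The expression-based classification step can be illustrated with a toy rule comparing a hybrid's expression to the midparent value. This tolerance-based sketch is hypothetical and written in Python for consistency with the other examples here; hybridexpress itself is an R package and derives its categories from statistical contrasts, not a fixed tolerance.

```python
def classify_hybrid_gene(parent1, parent2, hybrid, tol=0.1):
    """Toy classification of one gene's hybrid expression relative
    to its progenitors (tol is an arbitrary relative tolerance)."""
    midparent = (parent1 + parent2) / 2
    # Close to the midparent value: additive inheritance.
    if abs(hybrid - midparent) <= tol * max(midparent, 1e-9):
        return "additive"
    if hybrid > max(parent1, parent2):
        return "above high parent"
    if hybrid < min(parent1, parent2):
        return "below low parent"
    return "nonadditive (within parental range)"
```

Deviations from the midparent value are the signature of the nonadditive effects that hybridexpress is designed to detect and categorize.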
Linked article: This is a mini commentary on Terteel Elawad et al, pp. 46–62 in this issue. To view this article visit https://doi.org/10.1111/1471‐0528.17320
A significant number of promising applications for vehicular ad hoc networks (VANETs) are becoming a reality. Most of these applications require a variety of heterogeneous content to be delivered to vehicles and to their on-board users. However, the task of content delivery in such dynamic and large-scale networks is easier said than done. In this article, we propose a classification of content delivery solutions applied to VANETs while highlighting their new characteristics and describing their underlying architectural design. First, the two fundamental building blocks that are part of an entire content delivery system are identified: replica allocation and content delivery. The related solutions are then classified according to their architectural definition. Within each category, solutions are described based on the techniques and strategies that have been adopted. As a result, we present an in-depth discussion on the architecture, techniques, and strategies adopted by studies in the literature that tackle problems related to vehicular content delivery networks.