Cosmological Tests of Gravity
Ferreira, Pedro G.
Annual Review of Astronomy and Astrophysics, 08/2019, Volume 57, Issue 1
Journal Article
Peer reviewed
Cosmological observations are beginning to reach a level of precision that allows us to test some of the most fundamental assumptions in our working model of the Universe. One such assumption is that gravity is governed by the theory of general relativity. In this review, we discuss how one might go about extending general relativity and how such extensions can be described in a unified way on large scales. This allows us to describe the phenomenology of modified gravity in the growth and morphology of the large-scale structure of the Universe. On smaller scales, we explore the physics of gravitational screening and how it might manifest itself in galaxies, clusters, and, more generally, in the cosmic web. We then analyze the current constraints from large-scale structure and conclude by discussing the future prospects of the field in light of the plethora of surveys currently being planned. Key results include the following:
There is a plethora of alternative theories of gravity, each restricted by fundamental physics considerations.
There is now a well-established formalism for describing cosmological perturbations in the linear regime for general theories of gravity (see the sketch after this list).
Gravitational screening can mask modifications to general relativity on small scales but may, itself, lead to distinctive signatures in the large-scale structure of the Universe.
Current constraints on both linear and nonlinear scales may be affected by systematic uncertainties that limit our ability to rule out alternatives to general relativity.
The next generation of cosmological surveys will dramatically improve constraints on general relativity, by up to two orders of magnitude.
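To illustrate the linear-regime formalism mentioned above, here is a minimal sketch of one widely used quasi-static parametrization. The functions mu(a,k) and Sigma(a,k), the potentials Phi and Psi, and the density contrast Delta follow one common convention (conventions vary between papers and are not quoted verbatim from the review); general relativity corresponds to mu = Sigma = 1.

```latex
% Modified Poisson equation governing the clustering of matter:
-k^2 \Psi = 4\pi G\, a^2\, \mu(a,k)\, \bar{\rho}\, \Delta
% Equation for the lensing potential, probed by light deflection:
-k^2 (\Phi + \Psi) = 8\pi G\, a^2\, \Sigma(a,k)\, \bar{\rho}\, \Delta
```

Growth-of-structure data primarily constrain mu, while weak lensing constrains Sigma, which is why combining the two probes is effective.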
Abstract
Predicting the sensitivity of tumors to specific anti-cancer treatments is a challenge of paramount importance for precision medicine. Machine learning (ML) algorithms can be trained on high-throughput screening data to develop models that are able to predict the response of cancer cell lines and patients to novel drugs or drug combinations. Deep learning (DL) refers to a distinct class of ML algorithms that have achieved top-level performance in a variety of fields, including drug discovery. These types of models have unique characteristics that may make them more suitable for the complex task of modeling drug response based on both biological and chemical data, but the application of DL to drug response prediction has been unexplored until very recently. The few studies that have been published have shown promising results, and the use of DL for drug response prediction is beginning to attract greater interest from researchers in the field. In this article, we critically review recently published studies that have employed DL methods to predict drug response in cancer cell lines. We also provide a brief description of DL and the main types of architectures that have been used in these studies. Additionally, we present a selection of publicly available drug screening data resources that can be used to develop drug response prediction models. Finally, we also address the limitations of these approaches and provide a discussion on possible paths for further improvement. Contact: mrocha@di.uminho.pt
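As a concrete illustration of the kind of multimodal DL architecture this review discusses, here is a minimal sketch in PyTorch: a cell-line (expression) encoder and a drug (fingerprint) encoder feeding a joint regressor. The input dimensions, layer sizes, and the choice of expression vectors plus binary fingerprints are illustrative assumptions, not taken from any specific reviewed study.

```python
import torch
import torch.nn as nn

class DrugResponseNet(nn.Module):
    """Toy multimodal network: encode cell line and drug separately, then
    concatenate and regress a response value such as ln(IC50)."""
    def __init__(self, n_genes=1000, n_fp_bits=2048):
        super().__init__()
        self.cell_encoder = nn.Sequential(
            nn.Linear(n_genes, 256), nn.ReLU(), nn.Dropout(0.2))
        self.drug_encoder = nn.Sequential(
            nn.Linear(n_fp_bits, 256), nn.ReLU(), nn.Dropout(0.2))
        self.head = nn.Sequential(
            nn.Linear(512, 128), nn.ReLU(), nn.Linear(128, 1))

    def forward(self, expr, fp):
        z = torch.cat([self.cell_encoder(expr), self.drug_encoder(fp)], dim=1)
        return self.head(z).squeeze(1)

# Smoke test on random data standing in for screening measurements.
model = DrugResponseNet()
expr = torch.randn(32, 1000)                    # 32 samples x 1000 genes
fp = torch.randint(0, 2, (32, 2048)).float()    # 32 samples x 2048-bit fingerprints
pred = model(expr, fp)                          # 32 predicted responses
```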
One of the main obstacles to the successful treatment of cancer is the phenomenon of drug resistance. A common strategy to overcome resistance is the use of combination therapies. However, the space of possibilities is huge and efficient search strategies are required. Machine Learning (ML) can be a useful tool for the discovery of novel, clinically relevant anti-cancer drug combinations. In particular, deep learning (DL) has become a popular choice for modeling drug combination effects. Here, we set out to examine the impact of different methodological choices on the performance of multimodal DL-based drug synergy prediction methods, including the use of different input data types, preprocessing steps and model architectures. Focusing on the NCI ALMANAC dataset, we found that feature selection based on prior biological knowledge has a positive impact: limiting gene expression data to cancer- or drug response-specific genes improved performance. Drug features appeared to be more predictive of drug response, with a 41% increase in coefficient of determination (R2) and 26% increase in Spearman correlation relative to a baseline model that used only cell line and drug identifiers. Molecular fingerprint-based drug representations performed slightly better than learned representations: ECFP4 fingerprints increased R2 by 5.3% and Spearman correlation by 2.8% relative to the best learned representations. In general, fully connected feature-encoding subnetworks outperformed other architectures. DL outperformed other ML methods by more than 35% (R2) and 14% (Spearman). Additionally, an ensemble combining the top DL and ML models improved performance by about 6.5% (R2) and 4% (Spearman). Using a state-of-the-art interpretability method, we showed that DL models can learn to associate drug and cell line features with drug response in a biologically meaningful way. The strategies explored in this study will help to improve the development of computational methods for the rational design of effective drug combinations for cancer therapy.
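The closing ensemble result is straightforward to reproduce in outline. Below is a hedged sketch using scikit-learn stand-ins (an MLP for the DL model, gradient boosting for the classical ML model) on synthetic data; the study's actual models are multimodal deep networks, so this only illustrates the prediction-averaging step.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

# Synthetic stand-in features: [cell-line profile | drug A features | drug B features].
X, y = make_regression(n_samples=500, n_features=300, noise=10.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

dl = MLPRegressor(hidden_layer_sizes=(128, 64), max_iter=500, random_state=0)
gb = GradientBoostingRegressor(random_state=0)
preds = [m.fit(X_tr, y_tr).predict(X_te) for m in (dl, gb)]

# Averaging the two models' predictions is the simplest ensemble.
for name, p in zip(("MLP", "GBM", "mean ensemble"), preds + [np.mean(preds, axis=0)]):
    print(f"{name}: R2 = {r2_score(y_te, p):.3f}")
```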
Modified gravity and cosmology
Clifton, Timothy; Ferreira, Pedro G.; Padilla, Antonio ...
Physics Reports, 03/2012, Volume 513, Issue 1-3
Journal Article
Peer reviewed
Open access
In this review we present a comprehensive survey of recent work on modified theories of gravity and their cosmological consequences. Amongst other things, we cover General Relativity, scalar–tensor, Einstein–æther, and Bimetric theories, as well as TeVeS, f(R), general higher-order theories, Hořava–Lifschitz gravity, Galileons, Ghost Condensates, and models of extra dimensions including Kaluza–Klein, Randall–Sundrum, DGP, and higher co-dimension braneworlds. We also review attempts to construct a Parameterised Post-Friedmannian formalism that can be used to constrain deviations from General Relativity in cosmology, and that is suitable for comparison with data on the largest scales. These subjects have been intensively studied over the past decade, largely motivated by rapid progress in the field of observational cosmology that now allows, for the first time, precision tests of fundamental physics on the scale of the observable Universe. The purpose of this review is to provide a reference tool for researchers and students in cosmology and gravitational physics, as well as a self-contained, comprehensive and up-to-date introduction to the subject as a whole.
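As one concrete example from the families listed above, f(R) theories generalize the Einstein–Hilbert action by replacing the Ricci scalar with a free function of it. This is a standard textbook form rather than a quotation from the review:

```latex
% f(R) gravity; general relativity is recovered for f(R) = R - 2\Lambda.
S = \frac{1}{16\pi G} \int \mathrm{d}^4 x \, \sqrt{-g}\, f(R) \;+\; S_m\left[g_{\mu\nu}, \psi\right]
```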
Phenotypic plasticity is important in adaptation and shapes the evolution of organisms. However, we understand little about what aspects of the genome are important in facilitating plasticity. Eusocial insect societies produce plastic phenotypes from the same genome, as reproductives (queens) and nonreproductives (workers). The greatest plasticity is found in the simple eusocial insect societies in which individuals retain the ability to switch between reproductive and nonreproductive phenotypes as adults. We lack comprehensive data on the molecular basis of plastic phenotypes. Here, we sequenced genomes, microRNAs (miRNAs), and multiple transcriptomes and methylomes from individual brains in a wasp (Polistes canadensis) and an ant (Dinoponera quadriceps) that live in simple eusocial societies. In both species, we found few differences between phenotypes at the transcriptional level, with little functional specialization, and no evidence that phenotype-specific gene expression is driven by DNA methylation or miRNAs. Instead, phenotypic differentiation was defined more subtly by nonrandom transcriptional network organization, with roles in these networks for both conserved and taxon-restricted genes. The general lack of highly methylated regions or methylome patterning in both species may be an important mechanism for achieving plasticity among phenotypes during adulthood. These findings define previously unidentified hypotheses on the genomic processes that facilitate plasticity and suggest that the molecular hallmarks of social behavior are likely to differ with the level of social complexity.
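The "nonrandom transcriptional network organization" finding is a co-expression-network style of analysis. As a generic illustration only (synthetic data and an arbitrary threshold; not the authors' pipeline), a correlation-based gene network can be built like this:

```python
import numpy as np

rng = np.random.default_rng(0)
expr = rng.normal(size=(40, 200))   # 40 brain samples x 200 genes (synthetic)

# Gene-gene Pearson correlation; threshold to get an adjacency matrix.
corr = np.corrcoef(expr.T)                                   # (200, 200)
adj = (np.abs(corr) > 0.5) & ~np.eye(corr.shape[0], dtype=bool)

# Network statistics such as degree can then be compared between phenotypes.
degree = adj.sum(axis=1)
print("mean degree:", degree.mean(), "max degree:", degree.max())
```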
We present a framework for forecasting cosmological constraints from future neutral hydrogen intensity mapping experiments at low to intermediate redshifts. In the process, we establish a simple way of comparing such surveys with optical galaxy redshift surveys. We explore a wide range of experimental configurations and assess how well a number of cosmological observables (the expansion rate, growth rate, and angular diameter distance) and parameters (the densities of dark energy and dark matter, spatial curvature, the dark energy equation of state, etc.) will be measured by an extensive roster of upcoming experiments. A number of potential contaminants and systematic effects are also studied in detail. The overall picture is encouraging: if autocorrelation calibration can be controlled to a sufficient level, Phase I of the Square Kilometre Array should be able to constrain the dark energy equation of state about as well as a DETF Stage IV galaxy redshift survey like Euclid, in roughly the same time frame.
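Forecasts like these are typically built on the Fisher-matrix formalism. Here is a toy, self-contained version with a made-up observable and two hypothetical parameters (Omega_m, w) measured with assumed 1% Gaussian errors; it illustrates the machinery, not the paper's actual pipeline:

```python
import numpy as np

def observable(theta, z):
    """Toy distance-like observable D(z; Om, w); purely illustrative."""
    om, w = theta
    return np.sqrt(om) * (1 + z) ** (1.5 * (1 + w))

theta0 = np.array([0.3, -1.0])          # fiducial (Omega_m, w)
z = np.linspace(0.1, 2.0, 20)           # survey redshift bins
sigma = 0.01 * observable(theta0, z)    # assumed 1% measurement errors

# Central-difference derivatives of the observable w.r.t. each parameter.
eps = 1e-4
derivs = []
for i in range(len(theta0)):
    dtheta = np.zeros_like(theta0)
    dtheta[i] = eps
    derivs.append((observable(theta0 + dtheta, z)
                   - observable(theta0 - dtheta, z)) / (2 * eps))

# Fisher matrix F_ij = sum_z dD/dtheta_i * dD/dtheta_j / sigma^2.
F = np.array([[np.sum(di * dj / sigma**2) for dj in derivs] for di in derivs])
errors = np.sqrt(np.diag(np.linalg.inv(F)))   # marginalized 1-sigma forecasts
print("sigma(Omega_m), sigma(w) =", errors)
```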
Single-pixel imaging is an imaging technique that has recently attracted a lot of attention from several areas. This paper presents a study on the influence of the Hadamard basis ordering on the image reconstruction quality, using simulation and experimental methods. During this work, five different orderings (Natural, Walsh, Cake-cutting, High Frequency and Random), along with two different reconstruction algorithms, TVAL3 and NESTA, were tested. Three different noise levels and compression ratios from 0.1 to 1 were also evaluated. A single-pixel camera was developed using a digital micromirror device for the experimental phase. For a compression ratio of 0.1, the Cake-cutting order achieved the best reconstruction quality, while the best contrast was achieved by the Walsh order. For a compression ratio of 0.5, the Walsh and Cake-cutting orders achieved similar results. Both Walsh and Cake-cutting orders reconstructed the images with good quality at compression ratios of 0.3 and above. Finally, the TVAL3 algorithm showed better image reconstruction quality than NESTA for compression ratios from 0.1 to 0.5.
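The orderings compared here are all row permutations of the same Sylvester (natural-order) Hadamard matrix. Below is a small sketch of the standard natural-to-Walsh (sequency) reindexing, Gray code followed by bit reversal; it is a textbook construction, not the authors' code:

```python
import numpy as np
from scipy.linalg import hadamard

def walsh_indices(n_bits):
    """Map sequency index k -> row index of the natural (Sylvester)
    Hadamard matrix: Gray-code k, then bit-reverse over n_bits."""
    def bitrev(x):
        return int(format(x, f"0{n_bits}b")[::-1], 2)
    return [bitrev(k ^ (k >> 1)) for k in range(1 << n_bits)]

n_bits = 6
H = hadamard(1 << n_bits)              # natural (Hadamard) order
W = H[walsh_indices(n_bits)]           # sequency (Walsh) order

# In Walsh order, row k has exactly k sign changes (0, 1, 2, ...).
sign_changes = (np.diff(W, axis=1) != 0).sum(axis=1)
assert np.all(np.diff(sign_changes) == 1)
```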
We make use of a large set of fast simulations of an intensity mapping experiment with characteristics similar to those expected of the Square Kilometre Array in order to study the viability and limits of blind foreground subtraction techniques. In particular, we consider three different approaches: polynomial fitting, principal component analysis (PCA) and independent component analysis (ICA). We review the motivations and algorithms for the three methods, and show that they can all be described, using the same mathematical framework, as different approaches to the blind source separation problem. We study the efficiency of foreground subtraction both in the angular and radial (frequency) directions, as well as the dependence of this efficiency on different instrumental and modelling parameters. For well-behaved foregrounds and instrumental effects, we find that foreground subtraction can be successful to a reasonable level on most scales of interest. We also quantify the effect that the cleaning has on the recovered signal and power spectra. Interestingly, we find that the three methods yield quantitatively similar results, with PCA and ICA being almost equivalent.
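To make the PCA variant concrete, here is a hedged, self-contained sketch on synthetic data: smooth spectra make the foregrounds low-rank in frequency, so projecting out the leading frequency-frequency eigenmodes recovers a weak uncorrelated signal. The mode count, shapes and amplitudes are invented for illustration; the polynomial and ICA variants differ only in how the modes are estimated.

```python
import numpy as np

rng = np.random.default_rng(42)
n_freq, n_pix = 64, 4000

# Smooth (rank-3) foregrounds dominating a weak, uncorrelated signal.
nu = np.linspace(0.0, 1.0, n_freq)[:, None]         # normalized frequency
templates = nu ** np.array([0.0, 1.0, 2.0])         # (n_freq, 3) smooth modes
foregrounds = 100.0 * templates @ rng.normal(size=(3, n_pix))
signal = rng.normal(scale=0.1, size=(n_freq, n_pix))
data = foregrounds + signal

# PCA cleaning: project out leading eigenmodes of the frequency covariance.
resid = data - data.mean(axis=1, keepdims=True)
eigvals, eigvecs = np.linalg.eigh(np.cov(resid))    # (n_freq, n_freq)
fg_modes = eigvecs[:, -3:]                          # 3 largest eigenvalues
cleaned = resid - fg_modes @ (fg_modes.T @ resid)

print(f"rms before: {data.std():.2f}  after: {cleaned.std():.3f}  "
      f"true signal: {signal.std():.3f}")
```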