Honey bee toxicology
Johnson, Reed M
Annual Review of Entomology, 01/2015, Volume 60
Journal Article
Peer reviewed
Open access
Insecticides are chemicals used to kill insects, so it is unsurprising that many insecticides have the potential to harm honey bees (Apis mellifera). However, bees are exposed to a great variety of other potentially toxic chemicals, including flavonoids and alkaloids that are produced by plants; mycotoxins produced by fungi; antimicrobials and acaricides that are introduced by beekeepers; and fungicides, herbicides, and other environmental contaminants. Although often regarded as uniquely sensitive to toxic compounds, honey bees are adapted to tolerate and even thrive in the presence of toxic compounds that occur naturally in their environment. The harm caused by exposure to a particular concentration of a toxic compound may depend on the level of simultaneous exposure to other compounds, pathogen levels, nutritional status, and a host of other factors. This review takes a holistic view of bee toxicology by taking into account the spectrum of xenobiotics to which bees are exposed.
Four timely and broadly available remotely sensed datasets were assessed for inclusion in county-level corn and soybean yield forecasting efforts focused on the Corn Belt region of the central United States (US). Those datasets were the (1) Normalized Difference Vegetation Index (NDVI) as derived from the Terra satellite's Moderate Resolution Imaging Spectroradiometer (MODIS), (2) daytime and (3) nighttime land surface temperature (LST) as derived from the Aqua satellite's MODIS, and (4) precipitation from the National Weather Service (NWS) NEXRAD-based gridded data product. The originating MODIS data used were the globally produced 8-day, clear-sky composited science products (MOD09Q1 and MYD11A2), while the US-wide NWS data were manipulated to mesh with the MODIS imagery both spatially and temporally by regridding and summing the otherwise daily measurements. The crop growing seasons of 2006–2011 were analyzed, with each year bounded by 32 eight-day periods from mid-February through late October. Land cover classifications known as the Cropland Data Layer, produced annually by the National Agricultural Statistics Service (NASS), were used to isolate the corn and soybean pixels in the input datasets for each of the corresponding years. The relevant pixels were then averaged by crop and time period to produce county-level estimates of NDVI, the LSTs, and precipitation, which in turn were related to official annual NASS county-level yield statistics. For the Corn Belt region as a whole, both corn and soybean yields were found to be positively correlated with NDVI in the middle of the summer and negatively correlated with daytime LST at that same time. Nighttime LST and precipitation showed no correlation with yield, regardless of the time before or during the growing season. There was also a slight suggestion that low NDVI and high daytime LST in the spring were positively related to final yields, again for both crops. Taking only NDVI and daytime LST as inputs from the 2006–2011 dataset, regression tree-based models were built, and county-level, within-sample coefficients of determination (R2) of 0.93 were found for both crops. Limiting the models by systematically removing late-season data showed the model performance to remain strong even at mid-season and still viable even earlier. Finally, the derived models were used to predict out-of-sample for the 2012 season, which turned out to have an anomalous drought. Yet the county-level results compared reasonably well against official statistics, with R2 = 0.77 for corn and 0.71 for soybeans. The root-mean-square errors were 1.26 and 0.42 metric tons per hectare, respectively.
• MODIS NDVI was found to be positively correlated to crop yields mid-summer.
• MODIS daytime land surface temperature was negatively correlated mid-summer.
• MODIS nighttime land surface temperature showed no relationship to yields anytime.
• NWS NEXRAD rainfall data also showed no relationship to yields anytime.
• A rule-based decision tree model predicted well for the anomalous 2012 crop season.
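As a rough illustration of the regression-tree yield modeling described above, here is a minimal sketch in Python. It is not the authors' rule-based model: the use of scikit-learn's DecisionTreeRegressor, the synthetic county-season feature layout (32 eight-day NDVI composites followed by 32 daytime LST composites), and all parameter choices are assumptions for demonstration only.

# Sketch: county-level yield regression tree from 8-day NDVI and daytime LST
# composites. Real inputs would be MOD09Q1-derived NDVI and MYD11A2 daytime
# LST averaged over Cropland Data Layer corn or soybean pixels; random toy
# data stands in here. Library choice and feature layout are assumptions.
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(0)
n_train, n_test, n_periods = 400, 100, 32  # 32 eight-day periods per season

# Features: columns 0-31 are NDVI composites, 32-63 are daytime LST composites.
X_train = rng.random((n_train, 2 * n_periods))
X_test = rng.random((n_test, 2 * n_periods))

# Toy yields (t/ha) built to mimic the reported signs: positive mid-summer
# NDVI effect, negative mid-summer daytime LST effect.
def toy_yield(X):
    return 6 + 4 * X[:, 20] - 3 * X[:, n_periods + 20] + rng.normal(0, 0.3, len(X))

y_train, y_test = toy_yield(X_train), toy_yield(X_test)

model = DecisionTreeRegressor(min_samples_leaf=10, random_state=0)
model.fit(X_train, y_train)
pred = model.predict(X_test)
print("out-of-sample R2:", round(r2_score(y_test, pred), 2))
print("RMSE (t/ha):", round(float(np.sqrt(mean_squared_error(y_test, pred))), 2))

Systematically truncating the feature columns past a given period would mimic the paper's mid-season forecasting experiment.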
The purpose of this study is to examine existing deep learning techniques for addressing class-imbalanced data. Effective classification with imbalanced data is an important area of research, as high class imbalance is naturally inherent in many real-world applications, e.g., fraud detection and cancer detection. Moreover, highly imbalanced data poses added difficulty, as most learners will exhibit bias toward the majority class and, in extreme cases, may ignore the minority class altogether. Class imbalance has been studied thoroughly over the last two decades using traditional machine learning models, i.e., non-deep learning. Despite recent advances in deep learning, along with its increasing popularity, very little empirical work exists in the area of deep learning with class imbalance. Given that deep neural networks have achieved record-breaking performance in several complex domains, investigating their use for problems containing high levels of class imbalance is of great interest. Available studies regarding class imbalance and deep learning are surveyed in order to better understand the efficacy of deep learning when applied to class-imbalanced data. This survey discusses the implementation details and experimental results of each study and offers additional insight into their strengths and weaknesses. Several areas of focus include: data complexity, architectures tested, performance interpretation, ease of use, big data application, and generalization to other domains. We have found that research in this area is very limited, that most existing work focuses on computer vision tasks with convolutional neural networks, and that the effects of big data are rarely considered. Several traditional methods for class imbalance, e.g., data sampling and cost-sensitive learning, prove to be applicable in deep learning, while more advanced methods that exploit neural network feature learning abilities show promising results. The survey concludes with a discussion that highlights various gaps in deep learning from class-imbalanced data for the purpose of guiding future research.
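One of the traditional cost-sensitive methods mentioned above carries over to deep learning directly by reweighting the loss function. The following is a minimal PyTorch sketch under assumed settings (the tiny architecture, the 95/5 imbalance ratio, and the inverse-frequency weighting heuristic are illustrative, not taken from any surveyed study):

# Sketch: cost-sensitive deep learning for class imbalance. Misclassifying
# the minority class is penalized more heavily via per-class loss weights.
import torch
import torch.nn as nn

# Assumed 95/5 binary imbalance; weights inversely proportional to frequency.
class_counts = torch.tensor([950.0, 50.0])
weights = class_counts.sum() / (len(class_counts) * class_counts)

model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 2))
criterion = nn.CrossEntropyLoss(weight=weights)  # weighted cross-entropy
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# One toy training step on random data standing in for a real batch.
x = torch.randn(32, 20)
y = torch.randint(0, 2, (32,))
optimizer.zero_grad()
loss = criterion(model(x), y)
loss.backward()
optimizer.step()
print("weighted loss:", loss.item())

Data sampling, the other traditional method named in the abstract, would instead rebalance the batches themselves (e.g., by oversampling minority examples) while leaving the loss unweighted.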
► DSC is a routine technique in many biophysics labs.
► Uses of DSC in studying protein folding and thermal stability are reviewed.
► Novel applications of DSC have emerged over the review period.
► DSC has been used to probe folding free energy surfaces and barrier heights.
► DSC can probe complex samples such as plasma for application in disease diagnosis.
Differential scanning calorimetry (DSC) measures the heat capacity of states and the excess heat associated with transitions that can be induced by temperature change. The integral of the excess heat capacity is the enthalpy of this process. Despite this potentially intimidating physical chemistry background, DSC has found almost universal application in studying biological macromolecules. In the case of proteins, DSC can be used to determine equilibrium thermodynamic stability and folding mechanism, but it can also be used in a more qualitative manner, screening for thermal stability as an indicator of ligand binding, pharmaceutical formulation, or conditions conducive to crystal growth. DSC usually forms part of a wider biophysical characterisation of the biological system of interest, so the literature is diverse and difficult to categorise for the technique in isolation. This review therefore describes the potential uses of DSC in studying protein folding and stability, giving brief examples of applications from the recent literature. There have also been some interesting developments in the use of DSC to determine barrier heights for fast-folding proteins and in studying complex protein mixtures, such as human plasma; these are considered in more detail.
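To state the enthalpy relationship above explicitly, in generic notation (a standard thermodynamic identity, not an equation reproduced from the review):

\Delta H_{\mathrm{cal}} = \int_{T_1}^{T_2} \Delta C_p^{\mathrm{exc}}(T)\,\mathrm{d}T

where \Delta C_p^{\mathrm{exc}}(T) is the baseline-subtracted excess heat capacity and the limits T_1 and T_2 bracket the transition region.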
The Dark Energy Survey Image Processing Pipeline
Morganson, E.; Gruendl, R. A.; Menanteau, F.; et al.
Publications of the Astronomical Society of the Pacific, 07/2018, Volume 130, Issue 989
Journal Article
Peer reviewed
Open access
The Dark Energy Survey (DES) is a five-year optical imaging campaign with the goal of understanding the origin of cosmic acceleration. DES performs a ∼5000 deg² survey of the southern sky in five optical bands (g, r, i, z, Y) to a depth of ∼24th magnitude. Contemporaneously, DES performs a deep, time-domain survey in four optical bands (g, r, i, z) over ∼27 deg². DES exposures are processed nightly with an evolving data reduction pipeline and evaluated for image quality to determine if they need to be retaken. Difference imaging and transient source detection are also performed nightly in the time-domain component. On a biannual basis, DES exposures are reprocessed with a refined pipeline and coadded to maximize imaging depth. Here we describe the DES image processing pipeline in support of DES science, as a reference for users of archival DES data, and as a guide for future astronomical surveys.
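As a schematic of the coaddition step mentioned above, here is a minimal Python sketch using inverse-variance weighting, one standard way to combine exposures for depth. It is illustrative only, not the DES Data Management implementation, which also handles astrometric alignment, PSF matching, and masking:

# Sketch: inverse-variance weighted coaddition of aligned exposures.
# Weighting each pixel by 1/variance maximizes the signal-to-noise of the mean.
import numpy as np

def coadd(images, variances):
    """Combine a stack of aligned 2D exposures into one deeper image."""
    images = np.asarray(images, dtype=float)
    weights = 1.0 / np.asarray(variances, dtype=float)
    coadd_image = (weights * images).sum(axis=0) / weights.sum(axis=0)
    coadd_variance = 1.0 / weights.sum(axis=0)  # variance of the weighted mean
    return coadd_image, coadd_variance

# Toy example: three noisy exposures of the same already-aligned field.
rng = np.random.default_rng(1)
truth = rng.random((64, 64))
sigmas = [0.20, 0.30, 0.25]
exposures = [truth + rng.normal(0, s, truth.shape) for s in sigmas]
variances = [np.full(truth.shape, s ** 2) for s in sigmas]

stacked, var = coadd(exposures, variances)
print("single-exposure RMS errors:", [round(float(np.std(e - truth)), 3) for e in exposures])
print("coadd RMS error:", round(float(np.std(stacked - truth)), 3))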
In the beginning, cheese making in the United States was all art, but embracing science and technology was necessary to make progress in producing a higher quality cheese. Traditional cheese making could not keep up with the demand for cheese, and the development of the factory system was necessary. Cheese quality suffered because of poor-quality milk, but 3 major innovations changed that: refrigeration, commercial starters, and the use of pasteurized milk for cheese making. Although by all accounts cold storage improved cheese quality, it was the improvement of milk quality, pasteurization of milk, and the use of reliable cultures for fermentation that had the biggest effect. Together with use of purified commercial cultures, pasteurization enabled cheese production to be conducted on a fixed time schedule. Fundamental research on the genetics of starter bacteria greatly increased the reliability of fermentation, which in turn made automation feasible. Demand for functionality, machinability, application in baking, and more emphasis on nutritional aspects (low fat and low sodium) of cheese took us back to the fundamental principles of cheese making and resulted in renewed vigor for scientific investigations into the chemical, microbiological, and enzymatic changes that occur during cheese making and ripening. As milk production increased, cheese factories needed to become more efficient. Membrane concentration and separation of milk offered a solution and greatly enhanced plant capacity. Full implementation of membrane processing and use of its full potential have yet to be achieved. Implementation of new technologies, the science of cheese making, and the development of further advances will require highly trained personnel at both the academic and industrial levels. This will be a great challenge to address and overcome.
The applications of fluorine in drug design continue to expand, facilitated by an improved understanding of its effects on physicochemical properties and the development of synthetic methodologies that are providing access to new fluorinated motifs. In turn, studies of fluorinated molecules are providing deeper insights into the effects of fluorine on metabolic pathways, distribution, and disposition. Despite the high strength of the C–F bond, the departure of fluoride from metabolic intermediates can be facile. This reactivity has been leveraged in the design of mechanism-based enzyme inhibitors and has influenced the metabolic fate of fluorinated compounds. In this Perspective, we summarize the literature associated with the metabolism of fluorinated molecules, focusing on examples where the presence of fluorine influences the metabolic profile. These studies have revealed potentially problematic outcomes with some fluorinated motifs and are enhancing our understanding of how fluorine should be deployed.