Environmental Impact Assessment (EIA) tools link the environmental context of a building to the design decision-making framework. Life Cycle Assessment (LCA) and Green Building Rating Systems (GBRS) are the two approaches commonly used to holistically analyse the environmental performance of the whole building. While GBRS is mostly based on a checklist with many qualitative criteria, LCA compels decision-makers to base their analyses on numerical evidence, facilitating the comparison between design choices. Some GBRS, such as LEED, BREEAM and Green Star, have been incorporating LCA as part of their assessment systems. This practice brings transparency to the design process and increases designers' awareness of a building's environmental impact. Consequently, this paper aims to develop a schematic EIA framework within the design life cycle, proposing future research directions to support its development. The framework was based on three parallel analyses: (1) a comparison between LCA and GBRS assessment methodologies, (2) an analysis of the LCA parameters within GBRS, and (3) an investigation of building-focused LCA software tools that comply with GBRS requirements. The research found that, rather than performing either a GBRS or an LCA exclusively, the most appropriate EIA methodology depends on the design life cycle stage. Graphical outputs and connection with 3D modelling will improve the evaluation of environmental impacts as a process integrated with the design.
• The article synthesises the Life Cycle Assessment (LCA) requirements for Green Building Rating Systems (GBRS).
• It analyses building-specific LCA software tools recommended by GBRS, based on LCA standards requirements.
• It proposes a schematic framework to perform an Environmental Impact Assessment (EIA) throughout the design life cycle.
• It suggests future research directions to support the development of the proposed schematic framework.
The primary goal of dynamic building envelopes is to meet and balance antagonistic performance criteria utilizing automatic operation. As opposed to static systems, automated shading and daylighting systems are increasingly being used in façade design with the intent to improve building performance. Taking this into consideration, the question that arises is whether such systems can significantly improve buildings' energy performance and occupants' visual and thermal comfort. The present paper is a review of dynamic operation methods of shading/daylighting systems and their associated implications for the building energy balance. Based on the subject distribution of the reviewed studies, the majority of the systems examined are versions of motorized blinds, while the analysis of new emerging ideas on deployable and foldable façade systems is limited. User acceptance is quite crucial and is strongly dependent on the system's intuitive operation. According to the paper's findings, energy savings with automatically controlled blinds depend on the type of control strategy and its connection to dimmable electric lighting systems. Even though control strategies enhance energy performance and occupants' comfort, their level of complexity strongly affects their efficiency and therefore influences their performance.
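As a concrete, purely illustrative sketch of the kind of control strategy the review discusses, the snippet below combines a threshold-based blind position with dimmable electric lighting that tops daylight up to a setpoint. The function name, thresholds, and linear response are assumptions for illustration, not taken from any reviewed system.

```python
# Hypothetical closed-loop shading + dimming strategy (illustrative values):
# lower the blind when daylight exceeds a glare threshold, then dim the
# electric lighting so that total light approaches the workplane setpoint.

def control_step(daylight_lux, setpoint_lux=500.0, glare_lux=2000.0):
    """Return (blind_position, dimming_level); 0 = open/off, 1 = closed/full."""
    # Blind closes proportionally to how far daylight exceeds the glare limit.
    blind = min(1.0, max(0.0, (daylight_lux - glare_lux) / glare_lux))
    usable_daylight = daylight_lux * (1.0 - blind)
    # Electric lighting only tops up the remaining gap to the setpoint.
    dimming = min(1.0, max(0.0, 1.0 - usable_daylight / setpoint_lux))
    return blind, dimming

print(control_step(0.0))     # night: blind open, lights at full output
print(control_step(3000.0))  # bright day: blind partly closed, lights off
```

The coupling to dimmable lighting is the point the review stresses: without the dimming term, closing the blind saves no lighting energy at all.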
We studied how the interactions among animals in a collective allow for the transfer of information. We performed laboratory experiments to study how zebrafish in a collective follow a subset of trained animals that move towards a light when it turns on because they expect food at that location. We built deep learning tools to distinguish from video which are the trained and which the naïve animals, and to detect when each animal reacts to the light turning on. These tools gave us the data to build a model of interactions designed to balance transparency and accuracy. The model finds a low-dimensional function that describes how a naïve animal weights neighbours depending on focal and neighbour variables. According to this low-dimensional function, neighbour speed plays an important role in the interactions. Specifically, a naïve animal gives more weight to a neighbour in front than to the sides or behind, and more so the faster the neighbour is moving; if the neighbour moves fast enough, the differences coming from the neighbour's relative position largely disappear. Through the lens of decision-making, neighbour speed acts as a confidence measure about where to go. This article is part of a discussion meeting issue 'Collective behaviour through time'.
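To make the qualitative finding concrete, the sketch below implements a hypothetical weighting function with the described properties: frontal neighbours weigh more than lateral or rear ones, weight grows with neighbour speed, and at high speed the positional differences wash out. The functional form and the constant `k` are illustrative assumptions, not the fitted low-dimensional function from the study.

```python
import math

# Hypothetical neighbour-weighting function (illustrative, not the fitted one).
def neighbour_weight(rel_angle_rad, neighbour_speed, k=1.0):
    """rel_angle_rad: neighbour bearing relative to the focal fish's heading
    (0 = directly in front). neighbour_speed: e.g. body lengths per second."""
    frontness = 0.5 * (1.0 + math.cos(rel_angle_rad))   # 1 in front, 0 behind
    saturation = 1.0 - math.exp(-k * neighbour_speed)   # grows with speed
    # Slow neighbours are weighted mostly by position; fast ones nearly
    # uniformly, so position differences vanish at high speed.
    return saturation * (saturation + (1.0 - saturation) * frontness)

# Front vs. behind gap is large for a slow neighbour, tiny for a fast one:
print(neighbour_weight(0.0, 0.2), neighbour_weight(math.pi, 0.2))
print(neighbour_weight(0.0, 5.0), neighbour_weight(math.pi, 5.0))
```

Here neighbour speed acts exactly as the abstract's "confidence measure": it scales how strongly a neighbour's heading is trusted regardless of where it sits.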
How We Refactor, and How We Know It
Murphy-Hill, E.; Parnin, C.; Black, A. P.
IEEE Transactions on Software Engineering, Volume 38, Issue 1, January–February 2012
Journal Article, Peer reviewed
Refactoring is widely practiced by developers, and considerable research and development effort has been invested in refactoring tools. However, little has been reported about the adoption of refactoring tools, and many assumptions about refactoring practice have little empirical support. In this paper, we examine refactoring tool usage and evaluate some of the assumptions made by other researchers. To measure tool usage, we randomly sampled code changes from four Eclipse and eight Mylyn developers and ascertained, for each refactoring, if it was performed manually or with tool support. We found that refactoring tools are seldom used: 11 percent by Eclipse developers and 9 percent by Mylyn developers. To understand refactoring practice at large, we drew from a variety of data sets spanning more than 39,000 developers, 240,000 tool-assisted refactorings, 2,500 developer hours, and 12,000 version control commits. Using these data, we cast doubt on several previously stated assumptions about how programmers refactor, while validating others. Finally, we interviewed the Eclipse and Mylyn developers to help us understand why they did not use refactoring tools and to gather ideas for future research.
Plug-in hybrid electric vehicles are a midterm solution to reduce the transportation sector's dependency on oil. However, if implemented on a large scale without control, peak load increases significantly and the grid may be overloaded. Two algorithms to address this problem are proposed and analyzed. Both are based on a forecast of future electricity prices and use dynamic programming to find the economically optimal solution for the vehicle owner. The first optimizes the charging time and energy flows. It reduces daily electricity cost substantially without increasing battery degradation. The second also takes into account vehicle-to-grid support as a means of generating additional profits by participating in ancillary service markets. Constraints caused by vehicle utilization as well as technical limitations are taken into account. An analysis based on data from the California Independent System Operator indicates that smart charge timing reduces daily electricity costs for driving from 0.43 to 0.2. Provision of regulating power substantially improves plug-in hybrid electric vehicle economics, and the daily profits amount to 1.71, including the cost of driving.
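The core idea of the first algorithm, shifting charging into cheap forecast price slots via dynamic programming, can be sketched as follows. This is a minimal illustration with made-up prices and integer energy units; the paper's formulation additionally handles battery degradation, vehicle utilization constraints, and vehicle-to-grid support.

```python
# Minimal price-based charge-scheduling DP (illustrative, not the paper's model).
def optimal_charge_schedule(prices, target_kwh, max_kwh_per_slot):
    """Buy `target_kwh` integer energy units across time slots at minimal cost,
    charging at most `max_kwh_per_slot` per slot.
    dp[t][e] = minimal cost to have bought e units after the first t slots."""
    INF = float("inf")
    n = len(prices)
    dp = [[INF] * (target_kwh + 1) for _ in range(n + 1)]
    dp[0][0] = 0.0
    choice = [[0] * (target_kwh + 1) for _ in range(n + 1)]
    for t in range(n):
        for e in range(target_kwh + 1):
            if dp[t][e] == INF:
                continue
            for buy in range(min(max_kwh_per_slot, target_kwh - e) + 1):
                cost = dp[t][e] + buy * prices[t]
                if cost < dp[t + 1][e + buy]:
                    dp[t + 1][e + buy] = cost
                    choice[t + 1][e + buy] = buy
    # Recover the per-slot schedule by walking back through the choices.
    schedule, e = [], target_kwh
    for t in range(n, 0, -1):
        schedule.append(choice[t][e])
        e -= choice[t][e]
    schedule.reverse()
    return dp[n][target_kwh], schedule

# Cheap overnight slots attract the charging load:
cost, plan = optimal_charge_schedule([0.30, 0.10, 0.05, 0.25], 4, 2)
print(cost, plan)  # charging lands in the two cheapest slots
```

The same table-filling structure extends to the second algorithm by enlarging the per-slot action set with discharge (vehicle-to-grid) and regulation-bid decisions.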
Computer-aided molecular design (CAMD) studies quantitative structure–property relationships and discovers desired molecules using optimization algorithms. With the emergence of machine learning models, CAMD score functions may be replaced by various surrogates to automatically learn the structure–property relationships. Due to their outstanding performance on graph domains, graph neural networks (GNNs) have recently appeared frequently in CAMD, but using GNNs introduces new optimization challenges. This paper formulates GNNs using mixed-integer programming and then integrates this GNN formulation into the optimization and machine learning toolkit OMLT. To characterize and formulate molecules, we inherit the well-established mixed-integer optimization formulation for CAMD and propose symmetry-breaking constraints to remove symmetric solutions caused by graph isomorphism. In two case studies, we investigate fragment-based odorant molecular design with more practical requirements to test the compatibility and performance of our approaches.
• We formulate graph neural networks with mixed-integer programming and break symmetry.
• We integrate our formulation into the optimization and machine learning toolkit OMLT.
• We develop an application in computer-aided molecular design of odorants.
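The symmetry that the proposed constraints remove can be illustrated at toy scale: when molecules are encoded as labelled graphs, many variable assignments describe the same structure up to relabelling of the vertices. The brute-force canonicalisation below (an illustration of the redundancy, not the paper's mixed-integer constraints) counts labelled versus non-isomorphic graphs on four vertices.

```python
from itertools import combinations, permutations

def canonical(n, edges):
    """Return the lexicographically smallest edge set over all relabellings,
    i.e. one representative per isomorphism class (brute force, small n only)."""
    best = None
    for perm in permutations(range(n)):
        relab = tuple(sorted(tuple(sorted((perm[u], perm[v]))) for u, v in edges))
        if best is None or relab < best:
            best = relab
    return best

n = 4
all_edges = list(combinations(range(n), 2))
labelled, canonical_forms = 0, set()
for k in range(len(all_edges) + 1):
    for subset in combinations(all_edges, k):
        labelled += 1
        canonical_forms.add(canonical(n, subset))
print(labelled, len(canonical_forms))  # 64 labelled graphs, 11 up to isomorphism
```

Even at n = 4, a naive search space is almost six times larger than the set of distinct structures; symmetry-breaking constraints aim to keep only one labelling per class inside the optimization itself, rather than filtering afterwards.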
Differential sensitivity analysis is indispensable in fitting parameters, understanding uncertainty, and forecasting the results of both thought and lab experiments. Although there are many methods currently available for performing differential sensitivity analysis of biological models, it can be difficult to determine which method is best suited for a particular model. In this paper, we explain a variety of differential sensitivity methods and assess their value in some typical biological models. First, we explain the mathematical basis for three numerical methods: adjoint sensitivity analysis, complex perturbation sensitivity analysis, and forward mode sensitivity analysis. We then carry out four instructive case studies. (a) The CARRGO model for tumor-immune interaction highlights the additional information that differential sensitivity analysis provides beyond traditional naive sensitivity methods, (b) the deterministic SIR model demonstrates the value of using second-order sensitivity in refining model predictions, (c) the stochastic SIR model shows how differential sensitivity can be attacked in stochastic modeling, and (d) a discrete birth-death-migration model illustrates how the complex perturbation method of differential sensitivity can be generalized to a broader range of biological models. Finally, we compare the speed, accuracy, and ease of use of these methods. We find that forward mode automatic differentiation has the quickest computational time, while the complex perturbation method is the simplest to implement and the most generalizable.
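The complex perturbation (complex-step) method, which the paper finds simplest to implement, fits in a few lines: evaluating f at x + ih and taking the imaginary part yields the derivative without the subtractive cancellation that plagues finite differences, so h can be taken extremely small. A minimal sketch (the test function is illustrative):

```python
import cmath

def complex_step_derivative(f, x, h=1e-20):
    """Approximate f'(x) via the complex perturbation method:
    f'(x) ≈ Im(f(x + i*h)) / h.
    Unlike a finite difference, no subtraction occurs, so h can be tiny
    and the approximation is accurate to machine precision."""
    return f(complex(x, h)).imag / h

# Example: d/dx [x * exp(x)] at x = 1 equals 2e.
f = lambda x: x * cmath.exp(x)
print(complex_step_derivative(f, 1.0))  # ≈ 2e
```

The only requirement is that f be implemented with operations that accept complex arguments, which is also what makes the method easy to generalize to the discrete models in case study (d).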
Precision medicine approaches often rely on complex and integrative analyses of multiple biomarkers from “omics” data to generate insights that can help with diagnostic, prognostic, or therapeutic decisions. Such insights are often made using machine learning (ML) models that perform sample classification for a particular phenotype (yes/no). Building such models is challenging and time-consuming, requiring advanced coding skills and mathematical modelling expertise. Artificial intelligence (AI) is a methodological solution that has the potential to facilitate, optimize, and scale model development. In this work, we developed an AI-based, user-friendly, and code-free platform that fully automates the development of predictive models from quantitative “omics” data. Here, we show the application of this tool through the development of cancer survival prognostic models using real-life data from breast, lung, and renal cancer transcriptomes. In comparison to other models, our generated models achieved competitive sensitivities (72–85%), specificities (76–85%), and accuracies (75–85%), and Receiver Operating Characteristic curves with superior Areas Under the Curve (ROC-AUC of 77–86%). Further, we report the associated sets of genes (biomarkers) and their expression patterns that were predictive of cancer survival. Moreover, we made our models available as online tools that generate prognostic predictions based on the gene expression of the biomarkers. In conclusion, we demonstrate that our tool is a robust, user-friendly solution for developing bespoke predictive tools from “omics” data, facilitating precision medicine applications at the point-of-care.
Agricultural productivity is highly influenced by its associated microbial community. With advancements in omics technology, metagenomics plays a vital role in studies of the microbial world by unlocking the uncultured microbial populations present in the environment. Metagenomics is a diagnostic tool to target unique signature loci of plant and animal pathogens, as well as beneficial microorganisms, from samples. Here, we review various aspects of metagenomics, from experimental methods to techniques used for sequencing, as well as diversified computational resources, including databases and software tools. Particular attention is given to the applications of metagenomics in agriculture, covering pathogen and plant disease identification, disease resistance breeding, plant pest control, weed management, abiotic stress management, post-harvest management, discoveries in agriculture, sources of novel molecules/compounds, biosurfactants and natural products, identification of biosynthetic molecules, use in genetically modified crops, and antibiotic-resistance genes. The use of metagenome-wide association studies in agriculture, covering crop productivity rates, intercropping analysis, and the agronomic field, is also analyzed. This article is the first comprehensive study of its kind from an agricultural perspective, focusing on a wider range of applications of metagenomics and its association studies.
This short communication reports on the implementation and validation of the three-phase method for daylight calculations in the freely available software DIALux evo. Compared to the previous approach to daylight calculation in DIALux, the three-phase method allows for a faster calculation of yearly daylight illuminance without sacrificing accuracy. DIALux is one of the most widespread software tools among electric lighting designers. By implementing simple, accurate, and fast analysis, this new version of DIALux makes daylight analysis easily available to electric lighting designers. This can unleash the potential of properly integrated daylighting and electric lighting design, for better and more efficient lighting projects.