There is growing attention to energy efficiency in the software engineering field, driven by modern technologies such as the Internet of Things (IoT), Social Networking Services (SNS), and quantum computing. In addition, recent trends and concerns such as Environmental, Social, and Governance (ESG) and human/societal/environmental well-being for responsible Artificial Intelligence (AI) have accelerated the adoption of energy-efficient software. Despite this, energy concerns in this field remain under-explored. As a result, greenability issues at the software level are not adequately addressed, leaving critical challenges unsolved in this space. This study aims to address this limitation and fill the gap left by previous studies. We survey green software engineering framed by the ten knowledge areas of software engineering, not only to cover the entire development life-cycle but also to widen the scope of discussion to software process, method, and model management. Based on our comprehensive investigation, we discuss open challenges, trade-offs, and implications of this study for both researchers and practitioners.
• Comprehensive investigation of energy concerns in software engineering.
• Identification of open challenges and trade-offs in green software engineering.
• Proposal of practical approaches for green software development.
• Exploration of energy consumption in AI systems within the supply chain.
Some of the proxies used to identify palaeotsunamis are reviewed in light of new findings following the 2004 Indian Ocean Tsunami and the 2009 South Pacific Tsunami, and a revised toolkit is provided. The new application of anisotropy of magnetic susceptibility (AMS) to the study of tsunami deposits and its usefulness for determining the hydrodynamic conditions during the emplacement of tsunami sequences, together with data from grain size analysis, are presented. The value of chemical proxies as indicators of saltwater inundation, associated marine shell and/or coral, high-energy depositional environment, and possible contamination is demonstrated, and issues of preservation are addressed. We also provide new findings from detailed studies of heavy minerals.
New information gathered during the UNESCO Intergovernmental Oceanographic Commission (IOC) International Tsunami Survey of fine onshore sediments following the 2009 South Pacific Tsunami is presented, including grain size, chemical, diatom and foraminifera data. The tsunami deposit varied from fining-upward sand layers to thin sand layers overlain by a thick layer of organic debris and/or a mud cap. Grain size characteristics, chemical data and microfossil assemblages provide evidence for marine inundation from near shore and for changes in flow dynamics during the tsunami.
Extensive bathymetric and two-dimensional seismic surveys have been carried out and cores collected in Pago Pago Bay (Tutuila, American Samoa) in order to describe and gain a better understanding of the sediment fill of the bay, which was affected by the 2009 South Pacific Tsunami. Eight sedimentary units were identified over the volcanic bedrock. The basal transgressive unit displays retrograding onlaps towards the shore, whereas the overlying seven aggradational layers alternate between four draping units and three units pinching out seaward. 'Core to seismic' correlation reveals that the draping units are composed of homogeneous silts, while the pinching-out units are dominated by very coarse coral fragments showing fresh cuts, mixed with Halimeda plates. The basal unit is attributed to transgressive sedimentation in response to flooding of the bay after the last glacial maximum, followed by the upper aggradational units corresponding to highstand sedimentation. The changeovers in these upper units indicate an alternation between low-energy silt units and high-energy coral debris units interpreted as tsunami-induced deposits. 14C dating reveals that high-energy sedimentation units can last up to approximately 2000 years, while low-energy sedimentation units can last up to approximately 1000 years. This alternation, deposited during the last highstand, may be explained by cycles of tectonic activity and quiescence of the Tonga Trench subduction, which is the main source of tsunamigenic earthquakes impacting the Samoan archipelago. In the uppermost silt unit, only the geochemical signature of the terrestrial input of the 2009 South Pacific Tsunami backwash deposits was detected, between 7 cm and 9 cm depth. Hence, Pago Pago Bay offers a unique sediment record of Holocene bay-fill intermittently impacted by past tsunamis during the last 7000 years.
There is growing interest in marine flooding, related to recent catastrophic events and their unintended consequences in terms of casualties and damage, and to the increasing population and stakes along the coasts in a context of changing climate. Consequently, knowledge of marine flooding has progressed significantly over recent years, and this review, focused on storm-induced marine submersions, responds to the need for a synthesis. Three main components are presented in the review: (1) a state-of-the-art on marine submersions from the viewpoint of several scientific disciplines; (2) a selection of examples demonstrating the added value of interdisciplinary approaches to improve our knowledge of marine submersions; (3) a selection of examples showing how the management of future crises or the planning efforts to adapt to marine submersions can be supported by new results or techniques from the research community.
From a disciplinary perspective, recent progress has been achieved with respect to physical processes, numerical modeling, the knowledge of past marine floods, and vulnerability assessment. At a global scale, the densely populated coastal areas most vulnerable to marine flooding are deltas and estuaries. Recent, well-documented floods allow the vulnerability parameters of different coastal zones to be analyzed. While storm surges can nowadays be reproduced accurately, the modeling of coastal flooding is more challenging, particularly when barrier breaches and wave overtopping have to be accounted for. The chronology of past marine floods can be reconstructed by combining historical archives and sediment records. Sediment records of past marine floods localized in back-barrier depressions are the most suitable for reconstructing past flooding chronologies. For the last two centuries, quantitative and descriptive historical data can be used to characterize past marine floods. Beyond providing a chronology of events, sediment records combined with geochronology, statistical analysis and climatology can be used to reconstruct millennial-scale climate variability and enable a better understanding of possible regional and local long-term trends in storm activity. Sediment records can also reveal forgotten floods of exceptional intensity, much more intense than those of the last few decades. Sedimentological and historical archives, combined with high-resolution topographic data or numerical hindcasts of storms, can provide quantitative information and explanations for marine flooding processes. From these approaches, the heights of extreme past sea levels can be determined; they are very useful for completing the time series provided by instrumental measurements on shorter time scales. In particular, historical data can improve the determination of the return periods associated with extreme water levels, which are often inaccurate when computed from instrumental data alone because of gaps and too-short time series. Long-term numerical hindcasts of tides and surges can also be used to provide the required time series for statistical analysis. Worst-case scenarios, used to define coastal management plans and strategies, can be obtained from realistic atmospheric settings with different tidal ranges and by shifting the trajectories of storms.
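The role of return periods mentioned above can be illustrated with a short worked sketch: fitting a generalized extreme value (GEV) distribution to annual maximum water levels and reading off return levels. This is a generic illustration under assumed data, not the method of any particular study discussed here; the `annual_maxima` values are invented placeholders.

```python
# Minimal sketch: return levels of extreme water levels from annual maxima.
# The data are invented; in practice instrumental series would be extended
# with historical records to reduce the uncertainty of the estimates.
import numpy as np
from scipy.stats import genextreme

annual_maxima = np.array([1.8, 2.1, 1.9, 2.6, 2.0, 2.3, 3.1, 2.2, 2.4, 2.8])  # metres, hypothetical

# Fit a GEV distribution (shape, location, scale) to the annual maxima.
shape, loc, scale = genextreme.fit(annual_maxima)

# The T-year return level is the (1 - 1/T) quantile of the fitted distribution.
for T in (10, 50, 100):
    level = genextreme.ppf(1 - 1 / T, shape, loc=loc, scale=scale)
    print(f"{T:>3}-year return level: {level:.2f} m")
```

With only a decade of invented data the estimates are of course fragile, which is precisely why the review stresses lengthening the effective record with historical and sedimentary evidence.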
Management of future crises and planning efforts to adapt to marine submersions are optimized by predictions of water levels from hydrodynamic models. Such predictions, combined with in situ measurements and analysis of human stakes, can be used to define a vulnerability index. The efficiency of adaptation measures can then be evaluated with respect to the number of lives that could potentially be saved. Numerical experiments have also shown that the realignment of coastal defenses could reduce water levels by up to 1 m where large marshes are flooded. Such managed realignment of coastal defenses may constitute a promising adaptation to storm-induced flooding and future sea level rise. From a legal perspective, only a few texts pay specific attention to the risk of marine flooding, whether nationally or globally. Recent catastrophic events and their unintended consequences in terms of deaths and damage have triggered political decisions, as in the USA after Hurricane Katrina and in France after the catastrophic floods that occurred in 2010.
Internet of Things (IoT) architectures generally focus on providing consistent performance and reliable communications. The convergence of IoT, edge, fog, and cloud aims to improve the quality of service of applications but does not typically emphasize energy efficiency. Considering energy in IoT architectures would reduce the energy impact of billions of IoT devices. The research presented in this paper proposes an optimization framework that considers the energy consumption of nodes when selecting a node to process an IoT request in an edge-fog-cloud layered architecture. The IoT use cases considered in this paper include smart grid, autonomous vehicles, and eHealth. The proposed framework is evaluated using CPLEX simulations. The results provide insights into mechanisms that can be used to select nodes energy-efficiently whilst meeting the application requirements and other network constraints in multi-layered IoT architectures.
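As a rough illustration of the kind of decision such a framework makes, the sketch below greedily picks the lowest-energy node that satisfies an application's latency and capacity requirements across edge, fog and cloud layers. This is not the paper's CPLEX formulation; the node names, attributes and numbers are invented for the example.

```python
# Hedged sketch of energy-aware node selection in an edge-fog-cloud hierarchy.
# A simple greedy rule stands in for the optimization framework described above.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Node:
    name: str
    layer: str               # "edge", "fog", or "cloud"
    latency_ms: float        # expected round-trip latency to the IoT device
    free_cpu: float          # available processing capacity (arbitrary units)
    energy_per_req_j: float  # estimated energy cost per request (joules)

def select_node(nodes: List[Node], max_latency_ms: float, cpu_needed: float) -> Optional[Node]:
    """Return the feasible node with the lowest energy cost per request."""
    feasible = [n for n in nodes
                if n.latency_ms <= max_latency_ms and n.free_cpu >= cpu_needed]
    return min(feasible, key=lambda n: n.energy_per_req_j) if feasible else None

# Hypothetical topology: one node per layer.
nodes = [
    Node("edge-1",  "edge",   5.0,  1.0, 0.8),
    Node("fog-1",   "fog",   20.0,  4.0, 0.5),
    Node("cloud-1", "cloud", 80.0, 64.0, 1.2),
]

# An eHealth-style request: latency-sensitive, modest compute.
print(select_node(nodes, max_latency_ms=30.0, cpu_needed=2.0))  # -> fog-1
```

A real formulation would trade such requests off jointly under network-wide constraints, which is what the optimization model solved with CPLEX is for.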
• We found that hyper-parameter tuning is not well justified in many cases but still very useful in a few.
• We propose a framework to address the problem of deciding to-tune or not-to-tune.
• We implemented a prototype of the framework with 486 datasets and 4 algorithms.
• The results indicate that our framework is effective at avoiding the effects of ineffective tuning.
• Our framework enables a life-long learning approach to the problem.
Hyper-parameter optimization is a process to find suitable hyper-parameters for predictive models. It typically incurs highly demanding computational costs due to the need for time-consuming model training to determine the effectiveness of each set of candidate hyper-parameter values. A priori, there is no guarantee that hyper-parameter optimization leads to improved performance. In this work, we propose a framework to address the problem of whether one should apply hyper-parameter optimization or use the default hyper-parameter settings for traditional classification algorithms. We implemented a prototype of the framework, which we use as a basis for a three-fold evaluation with 486 datasets and 4 algorithms. The results indicate that our framework is effective at supporting modeling tasks in avoiding the adverse effects of using ineffective optimizations. The results also demonstrate that incrementally adding training datasets improves the predictive performance of framework instantiations and hence enables “life-long learning.”
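The basic comparison underlying this question (tune or keep the defaults) can be sketched for a single dataset and algorithm with scikit-learn; the framework above goes further by predicting the outcome across datasets, but the building block looks roughly like this. The dataset, algorithm and grid are arbitrary choices made for illustration.

```python
# Hedged sketch: is hyper-parameter optimization worth it for this dataset?
# Compare cross-validated accuracy of default settings against a tuned model.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, cross_val_score

X, y = load_breast_cancer(return_X_y=True)

# Default hyper-parameters.
default_score = cross_val_score(RandomForestClassifier(random_state=0), X, y, cv=5).mean()

# Tuned hyper-parameters over a small, arbitrary grid.
grid = {"n_estimators": [50, 100, 200], "max_depth": [None, 5, 10]}
search = GridSearchCV(RandomForestClassifier(random_state=0), grid, cv=5)
search.fit(X, y)

print(f"default CV accuracy: {default_score:.3f}")
print(f"tuned   CV accuracy: {search.best_score_:.3f}")
# If the gain is negligible, the (costly) tuning was not justified in this case.
```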
• Deriving correlations between operation logs and metrics.
• Statistically relevant metrics are identified from all available metrics.
• Log activities with the highest impact on changes in target metrics are identified.
• Assertion specifications are derived and utilized for anomaly detection.
• Anomaly detection is evaluated with fault injection on a rolling upgrade operation.
Cloud computing systems provide the facilities to make application services resilient against failures of individual computing resources. However, resiliency is typically limited by a cloud consumer’s use and operation of cloud resources. In particular, system operations have been reported as one of the leading causes of system-wide outages. This applies specifically to DevOps operations, such as backup, redeployment, upgrade, customized scaling, and migration – which are executed at much higher frequencies now than a decade ago. We address this problem by proposing a novel approach to detect errors in the execution of these kinds of operations, in particular for rolling upgrade operations. Our regression-based approach leverages the correlation between operations’ activity logs and the effect of operation activities on cloud resources. First, we present a metric selection approach based on regression analysis. Second, the output of a regression model of selected metrics is used to derive assertion specifications, which can be used for runtime verification of running operations. We have conducted a set of experiments with different configurations of an upgrade operation on Amazon Web Services, with and without randomly injected faults to demonstrate the utility of our new approach.
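To make the idea of regression-derived assertions concrete, the sketch below fits a linear model relating a count of log events (e.g. "instance launched" lines) to a monitored metric (e.g. the number of running instances) and turns the residual spread into a runtime assertion band. The data and the three-sigma threshold are illustrative assumptions, not the exact derivation used in the paper.

```python
# Hedged sketch: derive an assertion from a regression between operation
# log activity and a cloud metric, then use it for runtime checking.
import numpy as np

# Hypothetical training data from past, correct runs of a rolling upgrade.
log_events  = np.array([1, 2, 3, 4, 5, 6, 7, 8])                  # cumulative "launched" log lines
running_vms = np.array([1.1, 1.9, 3.2, 3.8, 5.1, 6.0, 6.8, 8.2])  # observed metric

# Fit metric = a * log_events + b and measure the residual spread.
a, b = np.polyfit(log_events, running_vms, deg=1)
residuals = running_vms - (a * log_events + b)
threshold = 3 * residuals.std()   # assertion band width (assumed 3-sigma rule)

def assert_metric(observed_metric: float, observed_log_events: int) -> bool:
    """Runtime assertion: the metric should track what the logs predict."""
    expected = a * observed_log_events + b
    return abs(observed_metric - expected) <= threshold

print(assert_metric(5.1, 5))   # consistent with the logs -> True
print(assert_metric(2.0, 5))   # instances missing despite logged launches -> False
```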
• Successful implementation of continuous chiral resolution of racemic Ibuprofen in a Couette-Taylor crystallizer.
• Enhanced productivity and chiral purity thanks to the transfer to continuous mode.
• Results obtained according to a design of experiments (DoE) methodology.
• Thanks to the DoE methodology, 14 distinct experiments were sufficient out of the 81 experiments of the full factorial design.
• The model established with the DoE leads to a set of parameters that can be adapted to different specifications.
S-Ibuprofen has been shown to be more effective than racemic Ibuprofen, one of the best-selling drugs in the world. Thus, there is a strong interest in implementing chiral resolution of racemic Ibuprofen in continuous mode.
Chiral resolution by diastereomeric salt formation of racemic Ibuprofen has been successfully implemented in a Couette-Taylor (CT) crystallizer. Among the seven parameters identified as influential, a screening was performed on the four factors expected to have the strongest influence: temperature gradient in the CT crystallizer, rotation speed, residence time and target temperature within the crystallizer.
Thanks to the rationalization of the study through a design of experiments approach, only 14 distinct experiments were performed to identify the influence of these factors on global productivity/yield, diastereomeric excess and diastereomeric productivity/yield, and to find a first operating point.
Although the obtained yields were unsurprisingly low, global productivity and diastereomeric productivity increased by factors of 16 and 8 respectively, in comparison with batch mode. Continuous mode also gives easy access to a higher diastereomeric excess, in a repeatable manner.
Surprisingly, under the selected process conditions, the steady state does not seem to be reached before 14–15 residence times, which is longer than expected from previous results.
The sources of the Amu Darya, one of the major Central Asian rivers draining to the Aral Sea, are located in the glacierized high-mountain areas of Tajikistan, Kyrgyzstan and Afghanistan. There, climate change and the resulting retreat of glaciers have led to the formation of numerous new glacial lakes. Other lakes in the area are embedded in older glacial landscapes (erosion lakes) or retained by block or debris dams (e.g., Lake Sarez). A multi-temporal lake inventory is prepared and analysed, based on remotely sensed data. Corona images from 1968 are used as well as more up-to-date ASTER and Landsat 7 scenes. In total, 1642 lakes are mapped, 652 of which are glacial lakes. 73% of all lakes are located above 4000 m a.s.l. Glacial lakes, abundant in those areas where glacier tongues retreat over flat or moderately steep terrain, have experienced significant growth, even though changes are often superimposed by short-term fluctuations. The analysis results also indicate a shift of glacial lake growth from the south-western Pamir to the central and northern Pamir during the observation period. This trend is most likely associated with more elevated contribution areas in the central and northern Pamir. Lakes of the other types have in general remained constant in size. The lake development reflects changes in the state of the water resources in the study area on the one hand and determines the level of lake outburst hazard on the other.
• 1642 lakes exist in the research area; 652 are glacial lakes in a strict sense.
• Glacial lakes show a clear growing trend not observed for other lakes.
• Glacial lake growth is related to glacier retreat or decay in rather flat areas.
• A shift of glacial lake growth towards more elevated catchments is observed.
Context
In the last decade of data-driven decision-making, Machine Learning (ML) systems reign supreme. Because of the differing characteristics of ML and traditional Software Engineering systems, we do not know to what extent the issue-reporting needs are different, and to what extent these differences impact the issue resolution process.
Objective
We aim to compare the differences between ML and non-ML issues in open-source applied AI projects in terms of resolution time and size of fix. This research aims to enhance the predictability of maintenance tasks by providing valuable insights for issue reporting and task scheduling activities.
Method
We collect issue reports from GitHub repositories of open-source ML projects using an automatic approach, filter them using ML keywords and libraries, manually categorize them using an adapted deep learning bug taxonomy, and compare resolution time and fix size for ML and non-ML issues in a controlled sample.
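A minimal sketch of the collection-and-filter step is given below, using the public GitHub REST API. The repository name, keyword list and pagination depth are placeholder assumptions; the study itself also matches ML library usage and applies manual categorization afterwards.

```python
# Hedged sketch: fetch closed issues from a GitHub repository and keep those
# whose title or body mentions ML-related keywords. Repo and keywords are placeholders.
import requests

REPO = "owner/example-ml-project"                                     # hypothetical repository
ML_KEYWORDS = ("model", "training", "dataset", "tensor", "accuracy")  # illustrative list

def fetch_closed_issues(repo: str, pages: int = 2) -> list:
    issues = []
    for page in range(1, pages + 1):
        resp = requests.get(
            f"https://api.github.com/repos/{repo}/issues",
            params={"state": "closed", "per_page": 100, "page": page},
            headers={"Accept": "application/vnd.github+json"},
            timeout=30,
        )
        resp.raise_for_status()
        # The issues endpoint also returns pull requests; drop them.
        issues += [i for i in resp.json() if "pull_request" not in i]
    return issues

def looks_like_ml_issue(issue: dict) -> bool:
    text = f"{issue.get('title', '')} {issue.get('body') or ''}".lower()
    return any(kw in text for kw in ML_KEYWORDS)

issues = fetch_closed_issues(REPO)
ml_issues = [i for i in issues if looks_like_ml_issue(i)]
print(len(ml_issues), "candidate ML issues out of", len(issues))
```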
Result
147 ML issues and 147 non-ML issues are collected for analysis. We found that ML issues take more time to resolve than non-ML issues; the median difference is 14 days. There is no significant difference in terms of size of fix between ML and non-ML issues. No significant differences are found between different ML issue categories in terms of resolution time and size of fix.
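The abstract does not name the statistical test used for these comparisons; purely as an illustration, the sketch below compares two resolution-time samples with a Mann-Whitney U test, a common choice for skewed duration data, and reports the median difference. All numbers are invented.

```python
# Hedged sketch: comparing resolution times of two issue groups. The test
# choice and the numbers are assumptions made only for illustration.
import numpy as np
from scipy.stats import mannwhitneyu

ml_days     = np.array([3, 7, 14, 21, 30, 45, 60, 90])   # hypothetical resolution times (days)
non_ml_days = np.array([1, 2,  5,  8, 12, 20, 25, 40])

stat, p = mannwhitneyu(ml_days, non_ml_days, alternative="two-sided")
print(f"median difference: {np.median(ml_days) - np.median(non_ml_days):.1f} days, p = {p:.3f}")
```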
Conclusion
Our study provided evidence that the life cycle for ML issues is stretched, and thus further work is required to identify the reason. The results also highlighted the need for future work to design custom tooling to support faster resolution of ML issues.