Resource sharing and mass storage in the server farms provided by cloud platforms save huge amounts of energy. However, optimizing energy consumption in the server room is not enough; it is also desirable to perform energy optimization of cloud services at the application level. In cloud computing, a tailored configuration of services is deployed for each client (tenant), requiring different energy consumption optimizations. Indeed, the energy consumption of cloud services depends on several factors determined by the context and usage of the applications. Consequently, evolving a cloud application to meet new energy-efficiency requirements entails custom-made adaptations for each tenant. Thus, managing the evolution of a multi-tenant application with hundreds of tenants and thousands of different valid architectural configurations can become intractable if performed manually. This paper proposes a product line architecture approach that: (1) uses cardinality-based variability models to model each tenant as a clonable feature, and (2) automates the process of evolving the multi-tenant application architecture when the energy requirements change. The implemented process handles a high number of tenants in a reasonable time.
Niobia- and alumina-supported palladium catalysts promoted by copper were investigated in the catalytic reduction of nitrate in water and characterized by temperature-programmed reduction, physisorption, H2 chemisorption, and X-ray diffraction. Niobia-supported Pd–Cu catalysts were as active and selective as an alumina-supported catalyst. All catalysts had similar turnover frequencies independent of the support. The control of pH and the interaction between Pd and Cu were critical to improving the selectivity and activity of Pd–Cu/Nb2O5 catalysts.
International guidelines as well as cancer associations recommend a multidisciplinary approach to lung cancer care. A multidisciplinary team (MDT) can significantly improve treatment decision-making and patient coordination by putting different physicians and other health professionals "in the same room", who collectively decide upon the best possible treatment. However, this is not a panacea for cancer treatment. The impact of multidisciplinary care (MDC) on patient outcomes is not clear-cut, and the effective functioning of the MDT depends on many factors. This review presents the available MDT literature with an emphasis on the key factors that characterize high-quality patient care in lung cancer. The study was conducted through a bibliographic search of different electronic databases (PubMed Central, Scopus, Google Scholar, and Google) referring to multidisciplinary cancer care settings. Many key elements appear consolidated, while others are still emerging, especially those related to barriers that cut across geographic, organizational, and disciplinary boundaries. MDTs must be sustained by strategic management, structured within the entity, and cannot be managed as a separate care process. Furthermore, they need to coordinate with other teams (within and outside the organization) and join the broad range of services delivered by multiple providers at various points of the cancer journey or within the system, with a vision of integrated care.
Infrared (IR) spectroscopy is commonly utilized for the investigation of protein structures and protein-mediated processes. While the amide I band provides information on protein secondary structures, amino acid side chains are used as IR probes for the investigation of protein reactions, such as proton pumping in rhodopsins. In this work, we calculate the IR spectra of the solvated aspartic acid, with both zwitterionic and protonated backbones, and of a capped form, i.e. mimicking the aspartic acid residue in proteins, by means of molecular dynamics (MD) simulations and the perturbed matrix method (PMM). This methodology has already proved its good modeling capabilities for the amide I mode and is here extended to the treatment of protein side chains. The computed side chain vibrational signal is in very good agreement with the experimental one, well reproducing both the peak frequency position and the bandwidth. In addition, the MD-PMM approach proposed here is able to reproduce the small frequency shift (5-10 cm⁻¹) experimentally observed between the protonated and zwitterionic forms, showing that such a shift depends on the excitonic coupling between the modes localized on the side chain and on the backbone in the protonated form. The spectrum of the capped form, in which the amide I band is also calculated, agrees well with the corresponding experimental spectrum. The reliable calculation of the vibrational bands of carboxyl-containing side chains provides a useful tool for the interpretation of experimental spectra.
The long-term problems of head and neck cancer survivors (HNCS) are not well known. In a cross-sectional international study aimed at exploring the long-term quality of life in this population, 1114 HNCS were asked to state their two most serious long-term effects. A clinician recorded the responses during face-to-face appointments. A list of 15 example problems was provided, but a free text field was also available. A total of 1033 survivors responded to the question. The most frequent problems were 'dry mouth' (DM) (n = 476; 46%), 'difficulty swallowing/eating' (DSE) (n = 408; 40%), 'hoarseness/difficulty speaking' (HDS) (n = 169; 16%), and 'pain in the head and neck' (PHN) (n = 142; 14%). A total of 5% reported no problems. Logistic regression adjusted for age, gender, treatment, and tumor stage and site showed increased odds of reporting DM and DSE for chemo-radiotherapy (CRT) alone compared to surgery alone (odds ratio (OR): 4.7, 95% confidence interval (CI): 2.5-9.0; OR: 2.1, CI: 1.1-3.9), but decreased odds for HDS and PHN (OR: 0.3, CI: 0.1-0.6; OR: 0.2, CI: 0.1-0.5). Survivors with UICC stage IV at diagnosis compared to stage I had increased odds of reporting HDS (OR: 1.9, CI: 1.2-3.0). Laryngeal cancer survivors had reduced odds compared to oropharynx cancer survivors of reporting DM (OR: 0.4, CI: 0.3-0.6) but increased odds of HDS (OR: 7.2, CI: 4.3-12.3). This study provides evidence of the serious long-term problems among HNCS.
Real-world Software Product Lines (SPLs) need Numerical Feature Models (NFMs) whose features not only have boolean values that satisfy boolean constraints but also have numeric attributes that satisfy arithmetic constraints. An essential operation on NFMs is finding near-optimal performing products, which requires counting the number of SPL products. Typical constraint satisfaction solvers perform poorly on counting and sampling.
Nemo (Numbers, features, models) is a tool that supports NFMs by bit-blasting, a technique that encodes arithmetic expressions as boolean clauses. The newest version, Nemo2, translates NFMs to propositional formulas and to the Universal Variability Language (UVL). By doing so, products can be counted efficiently by #SAT and Binary Decision Diagram (BDD) solvers, enabling the discovery of near-optimal products. This article evaluates Nemo2 with a large set of synthetic and colossal real-world NFMs, including complex arithmetic constraints, in counting and sampling experiments. We empirically demonstrate the viability of Nemo2 for counting and sampling large and complex SPLs.
•Model counting allows fast near-optimal optimization of colossal configuration spaces.
•Reasoning tools that perform model counting do not support numerical feature models.
•Nemo2 transforms numerical feature models into classical ones by using bit-blasting.
•SharpSAT model counts 10^250 configurations of a Nemo2-transformed model in <5 h.
•Binary decision diagrams uniformly randomly sample 10^45 configurations in <10 s.
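The bit-blasting idea behind Nemo2 can be illustrated with a minimal sketch of our own (not Nemo's actual implementation): encode each numeric feature as a fixed-width vector of boolean variables, then emit one blocking clause per assignment that violates the arithmetic constraint. This naive enumeration is exponential in the bit width and is only meant to show the encoding; real bit-blasters emit structured adder/comparator circuits instead.

```python
from itertools import product

def bits(n, width):
    """Little-endian bit vector of the natural number n."""
    return tuple((n >> i) & 1 for i in range(width))

def bit_blast(width, constraint):
    """Encode an arithmetic constraint over two `width`-bit naturals x, y
    as CNF over booleans x0..x{w-1}, y0..y{w-1}.
    A literal is (var, polarity): (v, 1) means v, (v, 0) means NOT v."""
    xvars = [f"x{i}" for i in range(width)]
    yvars = [f"y{i}" for i in range(width)]
    clauses = []
    for xv, yv in product(range(2 ** width), repeat=2):
        if not constraint(xv, yv):
            # Block this violating assignment: at least one bit must differ.
            clause = [(v, 1 - b) for v, b in
                      zip(xvars + yvars, bits(xv, width) + bits(yv, width))]
            clauses.append(clause)
    return clauses

# e.g. blast the arithmetic constraint "x + y <= 3" over two 2-bit features
cnf = bit_blast(2, lambda x, y: x + y <= 3)
```

Once in clause form, the model count of the boolean formula equals the number of numeric configurations satisfying the constraint, which is what lets #SAT solvers count products.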
We consider the optimal prediction problem of stopping a spectrally negative Lévy process as close as possible to a given distance $b \geq 0$ from its ultimate supremum, under a squared-error penalty function. Under some mild conditions, the solution is fully and explicitly characterised in terms of scale functions. We find that the solution has an interesting non-trivial structure: if $b$ is larger than a certain threshold, then it is optimal to stop as soon as the difference between the running supremum and the position of the process exceeds a certain level (less than $b$), while if $b$ is smaller than this threshold, then it is optimal to stop immediately (independently of the running supremum and position of the process). We also present some examples.
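In notation of our own choosing (which may differ from the paper's), with $X$ the Lévy process, $S_t = \sup_{0 \le s \le t} X_s$ its running supremum, and $S_\infty$ its ultimate supremum, the problem described above can be written as
\[
\inf_{\tau} \, \mathbb{E}\Big[ \big( (S_\infty - X_\tau) - b \big)^2 \Big],
\]
the infimum being over stopping times $\tau$. The two-regime solution then corresponds to rules of the form $\tau_a = \inf\{ t \ge 0 : S_t - X_t \ge a \}$ for some level $a < b$ when $b$ exceeds the critical threshold, degenerating to $\tau = 0$ when $b$ lies below it.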
Emergent application domains, such as cyber–physical systems, edge computing, or Industry 4.0, present high variability in software and hardware infrastructures. However, no single variability modeling language supports all the language extensions required by these application domains (i.e., attributes, group cardinalities, clonables, and complex constraints). This limitation is an open challenge that should be tackled by the software engineering field, and specifically by the software product line (SPL) community. A possible solution could be to define a completely new language, but this has a high cost in terms of adoption time and development of new tools. A more viable alternative is the definition of refactoring and specialization rules that allow interoperability between existing variability languages. However, with this approach, these rules cannot be reused across languages because each language uses a different set of modeling concepts and a different concrete syntax. Our approach relies on a modular and extensible metamodel that defines a common abstract syntax for existing variability modeling extensions. We map existing feature modeling languages in the SPL community to our common abstract syntax. Using our abstract syntax, we define refactoring rules at the language construct level that help to achieve interoperability between variability modeling languages.
•Model-Driven approach to provide interoperability between feature modeling tools.
•An extensible and modular common abstract syntax for feature modeling.
•Expressiveness analysis of existing language constructs for feature modeling.
•Refactorings at the abstract syntax level between feature modeling concepts.
For the last ten years, software product line (SPL) tool developers have been facing the implementation of different variability requirements and the support of SPL engineering activities demanded by emergent domains. Despite systematic literature reviews identifying the main characteristics of existing tools and the SPL activities they support, these reviews do not always help to understand whether such tools provide what complex variability projects demand. This paper presents an empirical study in which we evaluate the degree of maturity of existing SPL tools, focusing on their support for variability modeling characteristics and SPL engineering activities required by current application domains. We first identify the characteristics and activities that are essential for the development of SPLs by analyzing a selected sample of case studies chosen from application domains with high variability. Second, we conduct an exploratory study to analyze whether the existing tools support those characteristics and activities. We conclude that, with the current tool support, it is possible to develop a basic SPL approach. However, we have also found that these tools present several limitations when dealing with complex variability requirements demanded by emergent application domains, such as non-Boolean features or large configuration spaces. Additionally, we identify the need for an integrated approach with appropriate tool support to completely cover all the activities and phases of SPL engineering. To mitigate this problem, we propose different road maps using the existing tools to partially or entirely support SPL engineering activities, from variability modeling to product derivation.
Emergent application domains (e.g., Edge Computing/Cloud/B5G systems) are complex to build manually. They are characterised by high variability and are modelled by large Variability Models (VMs), leading to large configuration spaces. Due to the high number of variants present in such systems, it is challenging to find the best-ranked product regarding particular Quality Attributes (QAs) in a short time. Moreover, measuring QAs is sometimes non-trivial, requiring a lot of time and resources, as is the case for the energy footprint of software systems, which is the focus of this paper. Hence, we need a mechanism to analyse how features and their interactions influence the energy footprint, but without measuring all configurations. While practical, sampling and predictive techniques base their accuracy on uniform spaces or some initial domain knowledge, which are not always available. Indeed, analysing the energy footprint of products in large configuration spaces raises specific requirements that we explore in this work. This paper presents SAVRUS (Smart Analyser of Variability Requirements in Unknown Spaces), an approach for sampling and dynamic statistical learning that does not rely on initial domain knowledge of large and partially QA-measured spaces. SAVRUS reports the degree to which features and pairwise interactions influence a particular QA, like energy efficiency. We validate and evaluate SAVRUS with a selection of similar systems, which define large search spaces containing scattered measurements.
•Statistical sampling with smart learning can derive insights from partially unknown spaces.
•3% of readings can statistically describe the quality of highly configurable systems.
•Any 0.25% of the samples can potentially detect strong feature-quality interactions.
•Partially unknown spaces of 5.3 × 10^8 solutions can be analysed in under 7 min.
•SAVRUS suggests Edge/Cloud/B5G changes that reduce energy requirements by up to 90%.
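The kind of feature-influence and pairwise-interaction report described above can be sketched with plain mean-difference estimates over a partial sample. This is a simplification of our own under assumed inputs (boolean feature configurations paired with a measured QA value); SAVRUS's actual statistical learning is more elaborate.

```python
from itertools import combinations
from statistics import mean

def influence(sample):
    """sample: list of (config, qa) pairs, where config maps feature -> 0/1
    and qa is the measured quality attribute (e.g. energy).
    Returns (main, inter): main effect per feature and pairwise
    interaction strength per feature pair, as mean differences."""
    feats = sorted(sample[0][0])
    main = {}
    for f in feats:
        on = [q for c, q in sample if c[f] == 1]
        off = [q for c, q in sample if c[f] == 0]
        # effect of enabling f = mean QA with f on, minus mean QA with f off
        main[f] = mean(on) - mean(off) if on and off else 0.0
    inter = {}
    for f, g in combinations(feats, 2):
        def eff(gval):
            # effect of f restricted to configurations where g == gval
            on = [q for c, q in sample if c[f] == 1 and c[g] == gval]
            off = [q for c, q in sample if c[f] == 0 and c[g] == gval]
            return mean(on) - mean(off) if on and off else 0.0
        # interaction = how much the effect of f changes when g is enabled
        inter[(f, g)] = eff(1) - eff(0)
    return main, inter
```

Ranking features and pairs by the absolute value of these estimates gives a first-cut view of which features drive a QA in a space where only a fraction of configurations has been measured.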