This study presents the applicability of different types (exothermic and endothermic) of chemical blowing agents (CBAs) in the case of poly(lactic acid) (PLA). The amount of foaming agent was fixed at 2 wt%. We used a twin-screw extruder and added the individual components as a dry mixture through the hopper of the extruder. We characterized the PLA matrix and the chemical blowing agents with different testing methods. On the produced foams, we carried out morphological and mechanical tests and used scanning electron microscopy to examine the cell structure. We showed that PLA can be successfully foamed with chemical blowing agents. The best results were achieved with an exothermic CBA and with PLA type 8052D. The cell population density of the PLA foams produced this way was 4.82 × 10⁵ cells/cm³, their expansion ratio was 2.36, their density was 0.53 g/cm³ and their void fraction was 57.61%.
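The reported figures can be cross-checked with the standard foam characterization formulas. A minimal sketch, assuming a solid (unfoamed) PLA density of about 1.25 g/cm³, which is typical for PLA but not stated in the abstract:

```python
# Standard foam characterization metrics from the densities in the abstract.
# rho_solid is an assumption (~1.25 g/cm^3 for PLA); rho_foam is reported.

rho_solid = 1.25   # g/cm^3, assumed density of unfoamed PLA
rho_foam = 0.53    # g/cm^3, measured foam density (from the abstract)

expansion_ratio = rho_solid / rho_foam             # volumetric expansion
void_fraction = (1 - rho_foam / rho_solid) * 100   # percent

print(f"expansion ratio: {expansion_ratio:.2f}")   # ~2.36
print(f"void fraction:   {void_fraction:.2f} %")   # ~57.6 %
```

With these inputs the computed values (2.36 and 57.6%) agree closely with the reported 2.36 and 57.61%.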
Complex genetic diseases may be modulated by a large number of epistatic interactions affecting a polygenic phenotype. Identifying these interactions is difficult due to computational complexity, especially for higher-order interactions where more than two genomic variants are involved. In this paper, we present BitEpi, a fast and accurate method to test all possible combinations of up to four bi-allelic variants (Single Nucleotide Variants, or SNVs for short). BitEpi introduces a novel bitwise algorithm that is 1.7 and 56 times faster for 3-SNV and 4-SNV searches, respectively, than established software. The novel entropy statistic used in BitEpi is 44% more accurate at identifying interactive SNVs, and incorporates p-value-based significance testing. We demonstrate BitEpi on real-world data of 4900 samples and 87,000 SNPs. We also present EpiExplorer to visualize the potentially large number of individual and interacting SNVs in an interactive Cytoscape graph. EpiExplorer uses various visual elements to facilitate the discovery of true biological events in a complex polygenic environment.
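The bitwise counting idea can be illustrated in a few lines. This is a generic sketch of the technique, not BitEpi's actual implementation: each bi-allelic genotype (0, 1 or 2) is encoded as three per-genotype bitmasks over the samples, so a joint genotype count for two SNVs reduces to a bitwise AND plus a popcount.

```python
# Illustrative sketch of bitwise contingency counting for epistasis search
# (assumption: a simplified stand-in for the algorithm class, not BitEpi itself).

def encode(genotypes):
    """Return three bitmasks; bit i of masks[g] is set if sample i has genotype g."""
    masks = [0, 0, 0]
    for i, g in enumerate(genotypes):
        masks[g] |= 1 << i
    return masks

def pair_counts(masks_a, masks_b):
    """3x3 contingency table for two SNVs via AND + popcount."""
    return [[bin(ma & mb).count("1") for mb in masks_b] for ma in masks_a]

snv1 = encode([0, 1, 2, 1, 0, 2, 1, 0])  # genotypes of 8 samples at SNV 1
snv2 = encode([1, 1, 0, 2, 0, 2, 1, 1])  # genotypes of the same samples at SNV 2
table = pair_counts(snv1, snv2)           # counts for all 9 genotype pairs
```

The same pattern extends to 3-SNV and 4-SNV combinations by ANDing more masks, which is why a bitwise formulation pays off for higher-order searches.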
Precise genomic modification using prime editing (PE) holds enormous potential for research and clinical applications. In this study, we generated all-in-one prime editing (PEA1) constructs that carry all the components required for PE, along with a selection marker. We tested these constructs (with selection) in HEK293T, K562, HeLa and mouse embryonic stem (ES) cells. We discovered that PE efficiency in HEK293T cells was much higher than previously observed, reaching up to 95% (mean 67%). The efficiency in K562 and HeLa cells, however, remained low. To improve PE efficiency in K562 and HeLa cells, we generated a nuclease prime editor and tested this system in these cell lines as well as mouse ES cells. PE-nuclease greatly increased prime editing initiation; however, installation of the intended edits was often accompanied by extra insertions derived from the repair template. Finally, we show that zygotic injection of the nuclease prime editor can generate correct modifications in mouse fetuses with up to 100% efficiency.
Industry 4.0 describes an adaptive and changeable production, where factory cells have to be reconfigured at very short intervals, e.g. after each workpiece. This scenario cannot be realized with traditional devices such as programmable logic controllers. Instead, well-proven technologies from information technology are entering the production hall (IT-OT convergence). Therefore, both virtualization and novel communication technologies are being introduced in the field of industrial automation. In addition, these technologies are seen as key to facilitating various emerging use cases. However, it is not yet clear whether each of the dedicated hardware and software components, which have been developed for specific control tasks and have performed well over decades, can be upgraded without major adjustments. In this paper, we examine the opportunities and challenges of hardware- and operating-system-level virtualization based on the stringent requirements imposed by industrial applications. For that purpose, benchmarks for different virtualization technologies are established by determining their computational and networking overhead, configuration effort, accessibility, scalability, and security.
Operational Technology (OT) networks and devices, i.e., all components used in industrial environments, were not designed with security in mind; efficiency and ease of use were the most important design characteristics. However, due to the digitization of industry, an increasing number of devices and industrial networks are being opened up to public networks. This is beneficial for the administration and organization of industrial environments, but it also increases the attack surface, providing possible points of entry for an attacker. Originally, breaking into production networks meant breaking through an information technology (IT) perimeter first, such as a public website, and then moving laterally to industrial control systems (ICSs) to influence the production environment. However, many OT devices are connected directly to the Internet, which drastically increases the threat of compromise, especially since OT devices contain several vulnerabilities. In this work, the presence of OT devices on the Internet is analyzed from an attacker's perspective. Publicly available tools, such as the search engine Shodan and vulnerability databases, are employed to find commonly used OT devices and map vulnerabilities to them. These findings are grouped by country of origin, manufacturer, and the number and severity of vulnerabilities. More than 13,000 devices were found, and almost all contained at least one vulnerability. European and North American countries are by far the most affected.
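The grouping step described above can be sketched on mocked records; the real study used live Shodan results and public vulnerability databases, so the field names and sample data here are illustrative assumptions only.

```python
# Hedged sketch: grouping device findings by country and counting how many
# devices carry at least one known CVE. Data and field names are made up.
from collections import Counter

devices = [
    {"country": "DE", "vendor": "Siemens",   "cves": ["CVE-2019-0001"]},
    {"country": "US", "vendor": "Rockwell",  "cves": ["CVE-2018-0002", "CVE-2020-0003"]},
    {"country": "DE", "vendor": "Siemens",   "cves": []},
    {"country": "FR", "vendor": "Schneider", "cves": ["CVE-2017-0004"]},
]

by_country = Counter(d["country"] for d in devices)
by_vendor = Counter(d["vendor"] for d in devices)
vulnerable = sum(1 for d in devices if d["cves"])

print(by_country)
print(f"{vulnerable}/{len(devices)} devices have at least one known CVE")
```

Severity grouping would follow the same pattern, keyed on a CVSS score field per CVE.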
The ground state solution of the random dimer model is at a critical point, as has been shown with random link excitations. In this paper we test the robustness of this result by imposing the maximum weight excitation instead of the random link excitation. We numerically compute the scaling exponents of the curves arising in the model as well as the fractal dimension. Although strong finite-size corrections are present, the results are compatible with those of the random link excitation. Furthermore, another form of excitation, the ε-coupling excitation, is studied. We find that near-optimal configurations belong to the same universality class as the travelling salesman problem. Thus, we confirm a conjecture on the scaling properties of combinatorial optimisation problems for the specific case of minimum weight perfect matchings on 2-dimensional lattices. This document was submitted as my thesis project for the MSc Complex Systems Modelling course at King's College London in 2021. In particular, I would like to thank my supervisor, Dr Gabriele Sicuro, for his insights and guidance.
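To make the underlying optimisation problem concrete, a minimum weight perfect matching can be computed by brute force on a tiny complete graph. This is an illustrative sketch only; the study works on large 2D lattices, where specialised polynomial-time matching algorithms are used instead.

```python
# Brute-force minimum-weight perfect matching on a small even node set
# (illustration of the problem, not the solver used in the paper).
from itertools import permutations

def min_perfect_matching(weights):
    """weights[i][j]: edge weight. Returns (best_weight, matching)."""
    n = len(weights)
    best = (float("inf"), None)
    for perm in permutations(range(n)):
        # pair consecutive nodes of the permutation
        pairs = [tuple(sorted((perm[k], perm[k + 1]))) for k in range(0, n, 2)]
        w = sum(weights[i][j] for i, j in pairs)
        if w < best[0]:
            best = (w, sorted(pairs))
    return best

w = [[0, 1, 4, 3],
     [1, 0, 2, 5],
     [4, 2, 0, 1],
     [3, 5, 1, 0]]
weight, matching = min_perfect_matching(w)  # pairs every node exactly once
```

Enumerating permutations is O(n!) and only viable for toy instances, which is precisely why scaling studies rely on efficient matching solvers.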
Pre‐clinical responses to fast‐moving infectious disease outbreaks heavily depend on choosing the best isolates for animal models that inform diagnostics, vaccines and treatments. Current approaches are driven by practical considerations (e.g. first available virus isolate) rather than a detailed analysis of the characteristics of the virus strain chosen, which can lead to animal models that are not representative of the circulating or emerging clusters. Here, we suggest a combination of epidemiological, experimental and bioinformatic considerations when choosing virus strains for animal model generation. We discuss the currently chosen SARS‐CoV‐2 strains for international coronavirus disease (COVID‐19) models in the context of their phylogeny as well as in a novel alignment‐free bioinformatic approach. Unlike phylogenetic trees, which focus on individual shared mutations, this new approach assesses genome‐wide co‐developing functionalities and hence offers a more fluid view of the ‘cloud of variances’ that RNA viruses are prone to accumulate. This joint approach concludes that while the current animal models cover the existing viral strains adequately, there is substantial evolutionary activity that is likely not considered by the current models. Based on insights from the non‐discrete alignment‐free approach and experimental observations, we suggest isolates for future animal models.
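As background, one widely used class of alignment-free genome comparison builds k-mer frequency profiles and measures a distance between them. The sketch below illustrates that general class under made-up sequences; it is an assumption for illustration, not the specific method used in the study above.

```python
# Generic alignment-free comparison: k-mer profiles + cosine distance
# (illustrative; not the study's actual method, and sequences are toy data).
from collections import Counter
from math import sqrt

def kmer_profile(seq, k=3):
    """Count all overlapping k-mers in a sequence."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def cosine_distance(p, q):
    """1 - cosine similarity of two k-mer count vectors."""
    keys = set(p) | set(q)
    dot = sum(p[k] * q[k] for k in keys)
    return 1 - dot / (sqrt(sum(v * v for v in p.values())) *
                      sqrt(sum(v * v for v in q.values())))

a = kmer_profile("ACGTACGTAC")
b = kmer_profile("ACGTACGTAG")  # differs from a by one base
d = cosine_distance(a, b)       # small but non-zero distance
```

Because no alignment is computed, such measures compare whole-genome composition at once rather than individual shared mutations, which matches the 'cloud of variances' framing in the abstract.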
In silico predictions combined with in vitro, in vivo, and in situ observations collectively suggest that mouse adaptation of the severe acute respiratory syndrome 2 virus requires an aromatic substitution in position 501 or position 498 (but not both) of the spike protein’s receptor binding domain. This effect could be enhanced by mutations in positions 417, 484, and 493 (especially K417N, E484K, Q493K, and Q493R), and to a lesser extent by mutations in positions 486 and 499 (such as F486L and P499T). Such enhancements, due to more favorable binding interactions with residues on the complementary angiotensin-converting enzyme 2 interface, are, however, unlikely to sustain mouse infectivity on their own based on theoretical and experimental evidence to date. Our current understanding thus points to the Alpha, Beta, Gamma, and Omicron variants of concern infecting mice, whereas Delta and “Delta Plus” lack a similar biomolecular basis to do so. This paper identifies 11 countries (Brazil, Chile, Djibouti, Haiti, Malawi, Mozambique, Reunion, Suriname, Trinidad and Tobago, Uruguay, and Venezuela) where targeted local field surveillance of mice is encouraged because they may have come in contact with humans who had the virus with adaptive mutation(s). It also provides a systematic methodology to analyze the potential for other animal reservoirs and their likely locations.
Cloxy. Fraunholz, Daniel; Reti, Daniel; Duque Anton, Simon ...
Proceedings of the 5th ACM Workshop on Moving Target Defense,
01/2018
Conference Proceeding
Legacy software, outdated applications and fast-changing technologies pose a serious threat to information security. Several domains, such as long-life industrial control systems and Internet of Things devices, suffer from this. In many cases, system updates and new acquisitions are not an option. In this paper, a framework that combines a reverse proxy with various deception-based defense mechanisms is presented. It is designed to autonomously provide deception methods to web applications. Context awareness and minimal configuration overhead make it well suited to work as a service. The framework is built modularly to provide flexibility and adaptability to the application use case. It is evaluated with common web-based applications, such as content management systems, and several frequent attack vectors against them. Furthermore, the security and performance implications of the additional security layer are quantified and discussed. It is found that, given a sound implementation, no further attack vectors are introduced to the web application. The prototypical framework increases the delay of communication with the underlying web application, but this delay is within tolerable boundaries and can be further reduced by a more efficient implementation.
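One deception mechanism a reverse proxy like this could apply can be sketched in a few lines: injecting a hidden decoy link into HTML responses and treating any request for the decoy path as an attack signal. The path, names, and logic below are illustrative assumptions, not the framework's actual configuration.

```python
# Hedged sketch of a proxy-level deception mechanism: decoy-link injection
# plus detection. DECOY_PATH and all names are hypothetical examples.

DECOY_PATH = "/wp-admin-backup/"  # a path no legitimate client should visit

def inject_decoy(html: str) -> str:
    """Add an invisible decoy link just before </body> in a proxied response."""
    decoy = f'<a href="{DECOY_PATH}" style="display:none">admin</a>'
    return html.replace("</body>", decoy + "</body>")

def is_attack(request_path: str) -> bool:
    """Any request for the decoy path is a high-confidence attack indicator."""
    return request_path.startswith(DECOY_PATH)

page = inject_decoy("<html><body><h1>Shop</h1></body></html>")
```

Because the decoy is invisible to regular users but visible to crawlers and scanners, hits on it carry a very low false-positive rate, which is what makes such mechanisms attractive as a transparent proxy-level service.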