Raman spectra are examples of high-dimensional data that are often limited in the number of samples. This is a primary concern when Deep Learning frameworks are developed for tasks such as chemical species identification, quantification, and diagnostics. Open-source data are difficult to obtain and often sparse; furthermore, collecting and curating new spectra require expertise and resources. Deep generative modeling utilizes Deep Learning architectures to approximate high-dimensional distributions and aims to generate realistic synthetic data. The evaluation of the data and the performance of the deep models are usually conducted on a per-task basis and provide no indication of an increase in robustness, or generalization, on a wider scale. In this study, we compare the benefits and limitations of a standard statistical approach to data synthesis (weighted blending) with a popular deep generative model, the Variational Autoencoder. Two binary data sets are split into three folds to simulate small, limited samples. Synthetic data distributions are created per fold using the two methods and then augmented into the training of two Deep Learning algorithms, a Convolutional Neural Network and a Fully-Connected Neural Network. The goal of this study is to observe the trends in learning as synthetic data are continually augmented into the training data in increasing batches. To determine the impact of each synthetic method, Principal Component Analysis and the discrete Fréchet distance are used to visualize and measure the distance between the source and synthetic distributions, alongside the Machine Learning metric balanced accuracy for evaluating performance on imbalanced data.
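The two ingredients named in this abstract, weighted blending for data synthesis and the discrete Fréchet distance for comparing source and synthetic distributions, can be sketched briefly. This is an illustrative reading, not the paper's implementation; it assumes spectra are 1-D intensity arrays on a shared wavenumber axis, and curves are point sequences:

```python
import numpy as np

def blend_spectra(a, b, w):
    """Weighted blending: a convex combination of two same-class spectra."""
    return w * a + (1.0 - w) * b

def discrete_frechet(p, q):
    """Discrete Fréchet distance between two curves (n x d point arrays),
    computed by the standard dynamic-programming recurrence."""
    n, m = len(p), len(q)
    ca = np.empty((n, m))
    ca[0, 0] = np.linalg.norm(p[0] - q[0])
    for i in range(1, n):
        ca[i, 0] = max(ca[i - 1, 0], np.linalg.norm(p[i] - q[0]))
    for j in range(1, m):
        ca[0, j] = max(ca[0, j - 1], np.linalg.norm(p[0] - q[j]))
    for i in range(1, n):
        for j in range(1, m):
            ca[i, j] = max(min(ca[i - 1, j], ca[i - 1, j - 1], ca[i, j - 1]),
                           np.linalg.norm(p[i] - q[j]))
    return ca[-1, -1]
```

A small distance here indicates the synthetic distribution closely tracks the source curve through PCA space; identical curves yield a distance of zero.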
The intentional targeting of components in a cloud-based application, in order to artificially inflate usage bills, is an issue application owners have faced for many years. This has occurred under many guises, such as Economic Denial of Sustainability (EDoS), click fraud, and even secondary effects of Denial of Service (DoS) attacks. With the advent of commercial offerings of serverless computing circa 2015, a variant of the EDoS attack has emerged, termed Denial-of-Wallet (DoW). We describe our development of a simulation tool as a safe means to research these attacks as well as to generate datasets for the training of future mitigation systems to combat DoW. We believe that DoW may become increasingly prevalent as applications further utilise services based on a pay-per-invocation cost model. Given that the damage caused is purely financial, such attacks may not be disclosed, as application users are not directly affected. As such, we believe that the development of an attack simulator and specific testing of security measures against this niche attack will provide previously unavailable data and insights for the research community. We have developed a prototype DoW simulator that can emulate multiple months' worth of API calls in a matter of hours for ease of training data generation. Our aspiration for the future of this work is to provide a system and starting point for research on this form of attack. We present our work on such a system, the Denial-of-Wallet Test Simulator (DoWTS): a system that allows for safe testing of theorised DoW attacks against serverless applications via synthetic data generation. We also expand upon prior research on DoW and provide an analysis of the lack of specific safety measures against DoW.
This paper presents a new approach to classification of high-dimensional spectroscopy data and demonstrates that it outperforms other current state-of-the-art approaches. The specific task we consider is identifying whether samples contain chlorinated solvents or not, based on their Raman spectra. We also examine robustness to classification of outlier samples that are not represented in the training set (negative outliers). A novel application of a locally connected neural network (NN) for the binary classification of spectroscopy data is proposed and demonstrated to yield improved accuracy over traditionally popular algorithms. Additionally, we present the ability to further increase the accuracy of the locally connected NN algorithm through the use of synthetic training spectra, and we investigate the use of autoencoder-based one-class classifiers and outlier detectors. Finally, a two-step classification process is presented as an alternative to the binary and one-class classification paradigms. This process combines the locally connected NN classifier, the use of synthetic training data, and an autoencoder-based outlier detector to produce a model which is shown both to produce high classification accuracy and to be robust in the presence of negative outliers.
Serverless computing is an ever-growing programming paradigm being adopted by developers all over the world. Its high scalability, automatic load balancing, and pay-for-what-you-use design make it a powerful tool that can also greatly reduce operational costs. However, these advantages also leave serverless computing open to a unique threat, Denial-of-Wallet (DoW): the intentional targeting of serverless function endpoints with request traffic in order to artificially raise the usage bills for the application owner. A subset of these attacks, termed leeches, perform DoW at a rate that could go undetected, as it is not a sudden violent influx of requests. We devise a means of detecting such attacks by utilizing a novel approach of representing request traffic as heat maps and training an image classification algorithm to distinguish between normal and malicious traffic behaviour. Our classifier utilizes convolutional neural networks and achieves 97.98% accuracy. We then design a system for the implementation of this model that would allow application owners to monitor their traffic in real time for suspicious behaviour.
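The heat-map representation described here can be illustrated with a minimal sketch. The binning scheme (days × hours, peak-normalised counts) is an assumption for illustration, not the paper's exact pipeline; the resulting grid is the kind of image-style input a CNN classifier would consume:

```python
import numpy as np

def traffic_heatmap(timestamps_s, n_days=7):
    """Bin request timestamps (seconds from the start of the window) into a
    n_days x 24 grid of counts, normalised to [0, 1] for image-style input."""
    grid = np.zeros((n_days, 24))
    for t in timestamps_s:
        day = int(t // 86400) % n_days     # which day of the window
        hour = int(t % 86400) // 3600      # hour of that day
        grid[day, hour] += 1
    peak = grid.max()
    return grid / peak if peak > 0 else grid
```

A slow "leech" attack would show up not as a bright spike but as a subtle, persistent brightening across many cells, which is the kind of pattern an image classifier can pick out where threshold-based rate limiting cannot.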
Classical homocystinuria (HCU), an inborn error of homocysteine metabolism, has previously been estimated to affect approximately 1 in 100,000-200,000 people in the United States (US). HCU is poorly detected by newborn screening, resulting in underestimates of its prevalence. This study compared characteristics, healthcare use and costs, and projected prevalence between patients with diagnosed HCU, elevated total homocysteine (tHcy), and diagnosed phenylketonuria (PKU).
Patients in the MarketScan® Research Databases were identified with strictly-defined HCU (≥ 2 diagnoses, including ≥ 1 ICD-10), broadly-defined HCU (≥ 1 ICD-10), elevated tHcy (> 20 μmol/L) without an HCU diagnosis, or ≥ 1 ICD-9/ICD-10 PKU diagnosis during 1/1/2010-12/31/2016 (first qualifying claim = index). Demographics and healthcare utilization and costs per patient per month (PPPM) were compared between all cohorts, frequencies of comorbidities and medications were compared between HCU and elevated tHcy patients, and healthcare provider types were assessed among HCU patients. The prevalence of patients meeting each cohort definition was projected to the United States (US) population.
Patients with strictly-defined (N = 2450) and broadly-defined (N = 6613) HCU, and with elevated tHcy (N = 2017), were significantly older than PKU patients (N = 5120) (57 vs. 56 vs. 53 vs. 18 years; p < 0.05). Vitamin D deficiency, hyperlipidemia, folic acid/B vitamins, and lipid-lowering medications, among others, were more common among diagnosed HCU patients vs. those with elevated tHcy (all p < 0.05). Rates of healthcare utilization were generally higher among HCU and elevated tHcy patients, compared to PKU, though total healthcare costs were similar between groups. HCU patients most commonly received their index diagnosis from a primary care physician (~ 38%); very few (~ 1%) had any claim from a geneticist during their enrollment. The age-adjusted national prevalence of HCU was projected at 31,162 (95% CI: 30,411 - 31,913; ~ 1 in 10,000 of the US population) using the broad definition.
The actual prevalence of HCU may be > 10 times prior estimates, at 1 in 10,000 in the US, and this study suggests that HCU is not being diagnosed until later in life. Improvements to newborn screening, detection in young children, and physician education regarding HCU may be necessary to alleviate the burden of this genetic disease.
The purpose of this publication is twofold. Firstly, we would like to present an over-arching research proposal idea concerning the development of an interactive and adaptive learning framework that harnesses powerful tools from Artificial Intelligence, such as Data Mining, Natural Language Processing and Dynamic Difficulty Adjustment, in order to adapt the learning experience to the individual user's needs. Secondly, we would like to outline some of the preliminary work that we have carried out to date. This involved developing a prototype system to interactively teach Java programming, developed as a Final Year Project by Michael Gavin, a fourth-year undergraduate computer science student at NUI Galway. This prototype system, built using the Spring Boot framework with a REST API and MySQL backend, acts as an early "proof of concept" for developing these ideas further.
Albeit effective, methionine/protein restriction in the management of classical homocystinuria (HCU) is suboptimal and hard to follow. To address this unmet need, we developed an enzyme therapy (OT-58), which effectively corrected disease symptoms in various mouse models of HCU in the absence of methionine restriction. Here we evaluated short- and long-term efficacy of OT-58 on the background of current dietary management of HCU. Methionine restriction resulted in the lowering of total homocysteine (tHcy) by 38-63%, directly proportional to the decrease in methionine intake (50-12.5% of normal). Supplemental betaine resulted in additional lowering of tHcy. OT-58 successfully competed with betaine and normalized tHcy on the background of reduced methionine intake, while substantially lowering tHcy in mice on normal methionine intake. Betaine was less effective in lowering tHcy on the background of normal or increased methionine intake, while exacerbating hypermethioninemia. OT-58 markedly reduced both the hyperhomocysteinemia and the hypermethioninemia caused by the diets and betaine in HCU mice. Withdrawal of betaine did not affect the improved metabolic balance, which was established and solely maintained by OT-58 during periods of fluctuating dietary methionine intake. Taken together, OT-58 may represent a novel, highly effective enzyme therapy for HCU that performs optimally in the presence or absence of dietary management of HCU.
The Fourth Industrial Revolution calls for smart and automated industrial solutions that incorporate Artificial Intelligence. Today, the world of technology is highly dependent on Machine Learning (ML) and Deep Learning (DL) and their applications. All these ML/DL models, which bring huge benefits and provide Industry 4.0 solutions, require a bulk of data, extensive computational power, and storage for enhanced performance and accuracy. With the current jurisdictions on privacy all over the world, it is hard to access the required amount of data without giving data ownership to centralized silos. Taking the model to the data source is the idea that makes Federated Learning (FL) a unique and better-suited solution in this situation. In this paper, we present a review of FL, its learning models, aggregation algorithms, frameworks, and the challenges faced by this new paradigm of decentralized and distributed Machine Learning. We discuss the potential applications of FL in various domains that can help improve the efficiency and flexibility of industrial processes. We also discuss their impact on changing model training trends altogether in terms of data privacy, decentralization, security, and resource management. The main contribution of this work is to provide a comprehensive and concise review and comparative analysis of various frameworks and aggregation algorithms, followed by a discussion of the challenges currently faced by FL.
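Among the aggregation algorithms such a review covers, FedAvg is the canonical baseline: the server averages client model parameters, weighted by each client's local sample count, so no raw data leaves the client. A minimal sketch (the function name and data shapes are our own, not from the paper):

```python
import numpy as np

def fedavg(client_weights, client_sizes):
    """FedAvg aggregation step.

    client_weights: one list of parameter arrays per client (same shapes).
    client_sizes:   number of local training samples per client.
    Returns the sample-weighted average of each parameter tensor.
    """
    total = float(sum(client_sizes))
    aggregated = []
    for layer in zip(*client_weights):  # iterate layer-by-layer across clients
        aggregated.append(sum(w * (n / total) for w, n in zip(layer, client_sizes)))
    return aggregated
```

In a full FL round, the server broadcasts the aggregated weights back to clients, each client trains locally for a few epochs, and the cycle repeats.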
Serverless computing is the latest paradigm in cloud computing, offering a framework for the development of event-driven, pay-as-you-go functions in a highly scalable environment. While these traits offer a powerful new development paradigm, they have also given rise to a new form of cyber-attack known as Denial of Wallet (forced financial exhaustion). In this work, we define and identify the threat of Denial of Wallet and its potential attack patterns. We also demonstrate how this new form of attack can potentially circumvent existing mitigation systems developed for a similar style of attack, Denial of Service. Our goal is twofold. Firstly, we provide a concise and informative overview of this emerging attack paradigm. Secondly, we propose this paper as a starting point to enable researchers and service providers to create effective mitigation strategies. We include some simulated experiments to highlight the potential financial damage that such attacks can cause, along with the creation of an isolated test bed for continued safe research on these attacks.
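The financial-damage argument behind Denial of Wallet rests on simple arithmetic: under a pay-per-invocation model, sustained request traffic translates directly into billed cost. A back-of-the-envelope sketch, using purely illustrative prices (not any provider's actual rates, which also include free tiers and memory-dependent pricing):

```python
def dow_cost(requests_per_s, duration_h,
             price_per_million=0.20,       # assumed per-million-invocation price
             gb_seconds_per_call=0.1,      # assumed compute billed per call
             price_per_gb_s=0.0000167):    # assumed per-GB-second price
    """Estimated bill from sustained attack traffic on a serverless endpoint."""
    calls = requests_per_s * duration_h * 3600
    invocation_cost = calls / 1e6 * price_per_million
    compute_cost = calls * gb_seconds_per_call * price_per_gb_s
    return invocation_cost + compute_cost
```

Even at modest rates that a DoS-oriented rate limiter would not flag, costs accumulate linearly with attack duration, which is precisely why the purely financial damage can go unnoticed by application users.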