Human falls are a global public health issue resulting in over 37.3 million severe injuries and 646,000 deaths yearly. Falls impose direct financial costs on health systems and indirect costs on society through lost productivity. Unsurprisingly, human fall detection and prevention are a major focus of health research. In this article, we consider deep learning for fall detection in an IoT and fog computing environment. We propose a Convolutional Neural Network composed of three convolutional layers, two max-pooling layers, and three fully-connected layers as our deep learning model. We evaluate its performance using three open data sets and against extant research. Our approach for resolving dimensionality and modelling simplicity issues is outlined. Accuracy, precision, sensitivity, specificity, and the Matthews Correlation Coefficient are used to evaluate performance. The best results are achieved when using data augmentation during the training process. The paper concludes with a discussion of challenges and future directions for research in this domain.
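The dimensionality issue the abstract mentions arises from tracing tensor sizes through the three-conv, two-maxpool, three-FC stack. A minimal sketch of that calculation follows; the kernel sizes, channel counts, and 64x64 input are illustrative assumptions, not the authors' actual configuration:

```python
# Sketch: tracing spatial dimensions through a CNN of the shape described
# in the abstract (3 conv layers, 2 max-pool layers, 3 fully-connected).
# Kernel sizes, channel count, and the 64x64 input are assumptions.

def conv_out(size, kernel, stride=1, padding=0):
    """Output spatial size of a convolution on a square input."""
    return (size + 2 * padding - kernel) // stride + 1

def pool_out(size, kernel=2, stride=2):
    """Output spatial size of a max-pooling layer."""
    return (size - kernel) // stride + 1

size = 64                        # assumed square input
size = conv_out(size, kernel=3)  # conv1 -> 62
size = pool_out(size)            # pool1 -> 31
size = conv_out(size, kernel=3)  # conv2 -> 29
size = pool_out(size)            # pool2 -> 14
size = conv_out(size, kernel=3)  # conv3 -> 12

channels = 32                    # assumed channels after conv3
flat = channels * size * size    # width of the first fully-connected layer
print(flat)                      # 4608 units feed fc1 -> fc2 -> fc3
```

Getting `flat` wrong is the classic shape-mismatch error when connecting the last convolutional layer to the first fully-connected one, which is why resolving dimensionality up front simplifies the model.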
Software-defined networking and network functions virtualisation are making networks programmable and consequently much more flexible and agile. To meet service-level agreements, achieve greater utilisation of legacy networks, accelerate service deployment, and reduce expenditure, telecommunications operators are deploying increasingly complex service function chains (SFCs). Notwithstanding the benefits of SFCs, increasing heterogeneity and dynamism from the cloud to the edge introduce significant SFC placement challenges, not least adding or removing network functions while maintaining availability and quality of service and minimising cost. In this paper, an availability- and energy-aware solution based on reinforcement learning (RL) is proposed for dynamic SFC placement. Two policy-aware RL algorithms, Advantage Actor-Critic (A2C) and Proximal Policy Optimisation (PPO), are compared using simulations of a ground truth network topology based on the Rede Nacional de Ensino e Pesquisa Network, Brazil's National Teaching and Research Network backbone. The simulation results show that PPO generally outperformed A2C and a greedy approach in terms of both acceptance rate and energy consumption. The biggest difference between PPO and the other algorithms relates to the SFC availability requirement of 99.965%: the PPO algorithm's median acceptance rate is 67.34% better than the A2C algorithm's. A2C outperforms PPO only in the scenario where network servers had a greater number of computing resources; in this case, A2C is 1% better than PPO.
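The 99.965% availability requirement is demanding because an SFC is a series of functions: the chain's availability is the product of the availabilities of the nodes hosting each function, so even good per-node figures compound downward. A short sketch, with per-node availabilities that are illustrative assumptions rather than values from the paper:

```python
# Sketch: why a 99.965% SFC availability requirement is hard to meet.
# A chain of VNFs is a series system, so chain availability is the
# product of per-node availabilities. The 0.9995 figures below are
# illustrative assumptions, not measurements from the paper.

from math import prod

REQUIREMENT = 0.99965  # SFC availability requirement from the abstract

def sfc_availability(node_availabilities):
    """Availability of a chain of VNFs placed in series."""
    return prod(node_availabilities)

def replicated(avail, replicas):
    """Availability of one VNF replicated on independent nodes."""
    return 1.0 - (1.0 - avail) ** replicas

chain = [0.9995, 0.9995, 0.9995]  # three VNFs on ordinary servers
print(sfc_availability(chain) >= REQUIREMENT)  # False: chain is ~0.9985

# Replicating each function across two nodes lifts each hop above the
# bar, so the hardened placement satisfies the requirement.
hardened = [replicated(a, 2) for a in chain]
print(sfc_availability(hardened) >= REQUIREMENT)  # True
```

This compounding is why a placement algorithm must weigh redundancy (more energy, more resources) against acceptance rate, the exact trade-off the RL agents learn.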
Over the last decade, the amount of data exchanged on the Internet increased by a staggering factor of more than 100, and was expected to exceed 500 exabytes by 2020. This phenomenon is mainly due to the evolution of high-speed broadband Internet and, more specifically, the popularisation and widespread use of smartphones and associated accessible data plans. Although 4G with its long-term evolution (LTE) technology is seen as a mature technology, its radio technology and architecture continue to improve, such as within the scope of the LTE Advanced standard, a major enhancement of LTE. In the long run, however, the next generation of telecommunications (5G) is gaining considerable momentum from both industry and researchers. In addition, with the deployment of Internet of Things (IoT) applications, smart cities, vehicular networks, e-health systems, and Industry 4.0, a plethora of new 5G services has emerged with widely diverging and technologically challenging design requirements. These include high mobile data volume per area, a high number of connected devices per area, high data rates, longer battery life for low-power devices, and reduced end-to-end latency. Several technologies are being developed to meet these new requirements, and each brings its own design issues and challenges. In this context, deep learning (DL) models can be seen as one of the main tools for processing monitoring data and automating decisions. As these models are able to extract relevant features from raw data (images, texts, and other types of unstructured data), the integration of 5G and DL looks promising and is one that requires exploring. As its main contribution, this paper presents a systematic review of how DL is being applied to solve 5G issues.
Unlike the current literature, we examine work from the last decade that addresses diverse 5G-specific problems, such as physical-medium state estimation, network traffic prediction, user device location prediction, and self-network management, among others. We also discuss the main research challenges in using deep learning models in 5G scenarios and identify several issues that deserve further consideration.
Minimizing and Managing Cloud Failures
Endo, Patricia Takako; Leoni Santos, Guto; Rosendo, Daniel ...
Computer (Long Beach, Calif.), 11/2017, Volume 50, Issue 11
Journal Article, Peer reviewed
Guaranteeing high levels of availability is a huge challenge for cloud providers. The authors look at the causes of cloud failures and recommend ways to prevent them and to minimize their effects when they occur.
The term sportswashing has been discussed and analysed within academic circles, as well as in the mainstream media. However, the majority of existing research has focused on one-off, event-based sportswashing strategies (such as autocratic states hosting major international sports events) rather than longer-term, investment-based strategies (such as state actors purchasing sports clubs and teams). Furthermore, little has been written about the impact of this latter strategy on the existing fanbase of the purchased team and on their relationship with sportswashing and the discourses surrounding it. This paper addresses this lacuna through analysis of a popular Manchester City online fan forum, which illustrates the manner in which this community of dedicated City fans has legitimated the actions of the club's ownership regime, the Abu Dhabi United Group – a private equity group operated by Abu Dhabi royalty and UAE politicians. The discursive strategies of the City fans are discussed, in addition to the wider significance of these strategies for the issue of sportswashing and its coverage by the media.
The network function virtualization (NFV) paradigm is an emerging technology that provides network flexibility by allowing network functions to be allocated on commodity hardware, such as legacy servers in an IT infrastructure. In comparison with traditional network functions implemented in dedicated hardware, NFV reduces operating and capital expenses and improves service deployment. In some scenarios, a complete network service is composed of several functions in a specific order, known as a service function chain (SFC). SFC placement is a complex task, already proven to be NP-hard. Moreover, in highly distributed scenarios, network performance can also be impacted by other factors, such as traffic oscillations and high delays. Therefore, an SFC placement strategy must be carefully developed to meet the network operator's service constraints. In this paper, we present a systematic review of SFC placement advances in distributed scenarios. Unlike the current literature, we examine works from the last 10 years that address this problem with a focus on distributed scenarios. We then discuss the main scenarios where SFC placement has been deployed, as well as the several techniques used to create placement strategies. We also present the main goals considered when creating SFC placement strategies and highlight the metrics used to evaluate them.
Over 2.8 million people die each year from being overweight or obese, a largely preventable condition. Social media has fundamentally changed the way we communicate, collaborate, consume, and create content. The ease with which content can be shared has resulted in a rapid increase in the number of individuals and organisations that seek to influence opinion, and in the volume of content they generate. The nutrition and diet domain is not immune to this phenomenon. Unfortunately, from a public health perspective, many of these 'influencers' may be poorly qualified to provide nutritional or dietary guidance, and the advice given may lack accepted scientific evidence and run contrary to public health policy. In this preliminary study, we analyse the 'healthy diet' discourse on Twitter. Using a multi-component analytical approach, we analyse more than 1.2 million English-language tweets over a 16-month period to identify and characterise the influential actors and discover topics of interest in the discourse. Our analysis suggests that the discourse is dominated by non-health professionals. There is widespread use of bots that pollute the discourse and seek to create a false equivalence on the efficacy of particular nutritional strategies or diets. Topic modelling suggests a significant focus on diet, nutrition, exercise, weight, disease, and quality of life. Public health policy makers and professional nutritionists need to consider what interventions can counteract the influence of non-professional and bad actors on social media.
The Internet of Things has the potential to transform health systems through the collection and analysis of patient physiological data via wearable devices and sensor networks. Such systems can offer assisted-living services in real time and a range of multimedia-based health services. However, service downtime, particularly in emergencies, can lead to adverse outcomes and, in the worst case, death. In this paper, we propose a sensor-based e-health monitoring architecture that relies on cloud and fog infrastructures to handle and store patient data. Furthermore, we propose stochastic models to analyze the availability and performance of such systems, including models to understand how failures across the Cloud-to-Thing continuum impact e-health system availability and to identify potential bottlenecks. To feed our models with real data, we design and build a prototype and execute performance experiments. Our results identify the sensors and fog devices as the components with the most significant impact on the availability of the e-health monitoring system as a whole in the scenarios analyzed. Our findings suggest that, in order to identify the best architecture to host the e-health monitoring system, a trade-off between performance and delay must be resolved.
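The bottleneck analysis the abstract describes can be illustrated with the standard steady-state availability formula, A = MTTF / (MTTF + MTTR), composed in series across the sensor, fog, and cloud tiers. The MTTF/MTTR figures below are illustrative assumptions, not data from the paper's experiments:

```python
# Sketch: steady-state availability of a sensor -> fog -> cloud pipeline,
# the kind of composition the paper's stochastic models analyze.
# The MTTF/MTTR values (in hours) are illustrative assumptions only.

def availability(mttf, mttr):
    """Steady-state availability = MTTF / (MTTF + MTTR)."""
    return mttf / (mttf + mttr)

components = {
    "sensor": availability(mttf=720.0, mttr=12.0),   # battery-powered, fragile
    "fog":    availability(mttf=2880.0, mttr=8.0),
    "cloud":  availability(mttf=8760.0, mttr=1.0),   # highly redundant
}

# Series composition: the monitoring service is up only when every tier is.
system = 1.0
for a in components.values():
    system *= a

bottleneck = min(components, key=components.get)
print(bottleneck, round(system, 5))
```

Under these assumed numbers the sensor tier dominates system downtime, which is consistent with the abstract's finding that sensors and fog devices have the most significant impact on overall availability.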
Summary
To assess the availability of different data center configurations, understand the main root causes of data center failures, and represent their low-level details, such as subsystems' behavior and interconnections, we proposed, in previous works, a set of stochastic models representing different data center architectures (considering three subsystems: power, cooling, and IT) based on the TIA-942 standard. In this paper, we propose Data Center Availability (DCAV), a web-based software system that allows data center operators to evaluate the availability of their data center infrastructure through a friendly interface, without needing to understand the technical details of the stochastic models. DCAV offers an easy step-by-step interface to create and configure a data center model. The main goal of the DCAV system is to abstract away low-level details and modeling complexities, making data center availability analysis a simpler and less time-consuming task.