•Design of a cloud-based infrastructure supporting EOL/RUL prognosis analysis.•Prognosis tool using a nonlinear discrete-time representation of automotive components.•ATOMIC deployment configurations are evaluated in terms of scalability and cost.•The proposed prognostic method has been evaluated on the Delphi DFG1596 fuel pump.
Nowadays, continuous technological advances allow the design of novel Integrated Vehicle Health Management (IVHM) systems that deal with strict safety regulations in the automotive field, with the aim of improving the efficiency and reliability of automotive components. However, a challenging issue arising in this domain is handling the huge amount of data useful for prognostics. To this aim, in this paper we propose a cloud-based infrastructure, namely Automotive predicTOr Maintenance In Cloud (ATOMIC), for prognostic analysis that leverages Big Data technologies and mathematical models of both the nominal and faulty behaviour of automotive components to estimate on-line the End-Of-Life (EOL) and Remaining Useful Life (RUL) indicators for the automotive systems under investigation. A case study based on the Delphi DFG1596 fuel pump is presented to evaluate the proposed prognostic method. Finally, we perform a benchmark analysis of the deployment configurations of the ATOMIC architecture in terms of scalability and cost.
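The EOL/RUL estimation idea behind such a prognostic tool can be illustrated with a minimal sketch: propagate a discrete-time health state until it crosses a failure threshold. The function name, the degradation law, and all parameter values below are illustrative assumptions, not the paper's actual component models.

```python
# Hypothetical sketch of EOL/RUL estimation from a nonlinear
# discrete-time degradation model. The dynamics and thresholds are
# illustrative; the paper's actual automotive models are not shown here.

def estimate_rul(x0, step, threshold, dt=1.0, max_steps=100_000):
    """Propagate the discrete-time health state x[k+1] = step(x[k])
    until it crosses the failure threshold.
    Returns (eol_time, rul_steps)."""
    x, k = x0, 0
    while x < threshold and k < max_steps:
        x = step(x)
        k += 1
    return k * dt, k

# Illustrative nonlinear degradation law (assumed, not from the paper):
# wear grows faster as the component degrades.
step = lambda x: x + 0.01 * (1.0 + x) ** 2

eol_time, rul = estimate_rul(x0=0.0, step=step, threshold=1.0)
```

In an on-line setting, `x0` would be re-estimated from streamed sensor data at each prediction epoch, so the RUL estimate is refreshed as the component ages.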
This paper presents a cloud-based system framework based on Bigtable and MapReduce as the data storage and processing paradigms for providing a web-based service for viewing, storing, and analyzing massive building information models (BIMs). Cloud and Web 3D technologies were utilized to develop a BIM data center that can handle the big data of massive BIMs using multiple servers in a distributed manner and can be accessed by multiple users to concurrently submit and view BIMs online in 3D. Traditional BIMs include only static information, such as the geometric parameters, physical properties, and spatial relations for modeling a physical space. In this study, BIM was extended to dynamic BIM, which includes dynamic data such as historical records from the monitoring of the facility environment and usage. Owing to this extension, a dynamic BIM becomes a parametric model, which can be used to simulate user behaviors. On the client side, this study applied WebGL in the web interface development to achieve the display of BIMs in 3D on browsers. Users can access the services via various online devices anytime and anywhere to view the 3D model online. On the server side, this study used Apache Hadoop, which can utilize multiple servers to provide mass storage space in a distributed manner with Bigtable-like structured storage, to establish the BIM data center. A schema for storing the big data of massive dynamic BIMs in Bigtables was proposed. MapReduce, a Hadoop component for the parallel processing of large data sets, was utilized to process big data from dynamic BIMs. A big data analysis framework was proposed in this study to effectively retrieve and calculate the required information from dynamic BIMs in the data center for various applications via MapReduce distributed computing. We present the principles and architecture of the proposed framework along with its experimental assessment.
The results confirmed that scalable and reliable management of massive BIMs can be achieved using the proposed framework.
•Introducing a cloud-based tool for viewing, storing, and analyzing massive Building Information Models online.•BIM was extended to dynamic BIM, which includes dynamic monitoring data.•WebGL technology was employed to display BIMs in 3D on browsers.•A schema of Bigtables for storing BIMs and dynamic data in HBase was developed.•MapReduce distributed computing was employed to process the big data of dynamic BIMs.
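The MapReduce pattern applied to dynamic BIM data can be sketched without a Hadoop cluster: map each monitoring record to a (key, value) pair, shuffle by key, then reduce per building element. The record fields and the averaging reducer below are hypothetical stand-ins; the paper's actual Bigtable schema and jobs are not reproduced.

```python
# Minimal, Hadoop-free sketch of the MapReduce pattern described above:
# map dynamic-BIM monitoring records to (element_id, value) pairs,
# then reduce per element (here: average sensor reading).

from itertools import groupby
from operator import itemgetter

records = [  # hypothetical dynamic-BIM monitoring records
    {"element_id": "wall-01", "sensor": "temp", "value": 21.5},
    {"element_id": "wall-01", "sensor": "temp", "value": 22.5},
    {"element_id": "door-07", "sensor": "temp", "value": 19.0},
]

def map_phase(record):
    yield record["element_id"], record["value"]

def reduce_phase(key, values):
    vals = list(values)
    return key, sum(vals) / len(vals)

pairs = sorted(kv for r in records for kv in map_phase(r))  # shuffle/sort
result = dict(reduce_phase(k, (v for _, v in g))
              for k, g in groupby(pairs, key=itemgetter(0)))
# result == {"door-07": 19.0, "wall-01": 22.0}
```

On a real cluster the map and reduce functions keep this shape, but Hadoop distributes the records and the shuffle across servers, which is what makes the framework scale to massive BIMs.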
This study aims to examine what makes the image content of fashion brands successful on Instagram, comparing luxury and fast fashion brands. A quantitative analysis of a massive collection of fashion photos posted by notable luxury and fast fashion brands was conducted to identify specific patterns in these images based on four important visual content variables: the use of a brand name, brand logo, text, and hashtag. This study also examined how user engagement levels vary depending on each visual content variable. The study yielded several interesting findings: (1) luxury brand images with logos and brand names had higher user engagement, whereas fast fashion brand images did not show the same trend; (2) the size of the brand name and logo in an image was negatively related to user engagement or had no effect, regardless of the brand category; and (3) the use of embedded text within an image positively influenced user engagement for luxury brands, whereas it negatively influenced user engagement for fast fashion brands.
•The present situation of the adoption of EMRs is reviewed.•Health sensing, medical data analysis, and health cloud computing are surveyed.•Emerging information technologies for new paradigms of healthcare service are documented.
The appropriate collection and consumption of electronic health information about an individual patient or population is the bedrock of modern healthcare, with electronic medical records (EMRs) serving as the main carrier. This paper first introduces the main goal of this special issue and gives a brief guideline. Then, the present situation of the adoption of EMRs is reviewed. After that, the emerging information technologies that have a great impact on healthcare provision are presented. These include health sensing for medical data collection, and medical data analysis and utilization for accurate detection and prediction. Next, cloud computing is discussed, as it may provide scalable and cost-effective delivery of healthcare services. Accordingly, the current state of academic research on emerging information technologies for new paradigms of healthcare service is documented. Finally, conclusions are drawn.
Parallel to the increasing maturity of the field of research on higher education, a growing number of scholarly works aim at synthesising and presenting overviews of the field. We identify three important pitfalls these previous studies struggle with: a limited scope, a lack of content-related analysis, and/or a lack of an inductive approach. We take these limitations into account by analysing the abstracts of 16,928 articles on higher education published between 1991 and 2018. To investigate this huge collection of texts, we apply topic models, a collection of automatic content analysis methods that allow the structure of large text data to be mapped. After an in-depth discussion of the topics differentiated by our model, we study how these topics have evolved over time. In addition, we analyse which topics tend to co-occur in articles. This reveals remarkable gaps in the literature, which provide interesting opportunities for future research. Furthermore, our analysis corroborates the claim that the field of research on higher education consists of isolated ‘islands’. Importantly, we find that these islands drift further apart because of a trend of specialisation. This is a bleak finding, suggesting the (further) disintegration of our field.
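The kind of topic-model pipeline applied to abstracts can be sketched in a few lines; the toy corpus, topic count, and use of scikit-learn's LDA below are assumptions for illustration, not the authors' actual corpus of 16,928 abstracts or their exact model.

```python
# Compact sketch of a topic-model pipeline over article abstracts,
# using scikit-learn's latent Dirichlet allocation on a toy corpus.

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

abstracts = [  # hypothetical mini-corpus standing in for the real one
    "student learning outcomes in higher education curricula",
    "university governance funding policy and institutional reform",
    "teaching quality and student learning in university courses",
    "research funding policy for higher education institutions",
]

vec = CountVectorizer(stop_words="english")
dtm = vec.fit_transform(abstracts)              # document-term matrix

lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topics = lda.fit_transform(dtm)             # per-document topic mixture
# each row of doc_topics is a probability distribution over the 2 topics
```

The per-document topic mixtures are what enable the study's downstream analyses: topic prevalence over time and topic co-occurrence within articles.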
•Practical application of distributed acoustic sensing technology on dark fibers.•Research on existing communication optical cables as detection arrays.•Extracting surrounding ambient features along the highway based on DAS.•Promoting the research and development of DAS in fields with existing communication optical cables.
Distributed optical fiber acoustic sensing (DAS) technology can transform existing communication optical cables into spatially continuous sensing arrays, and has strong application potential in geophysics and natural disaster monitoring. This paper proposes a DAS big-data analysis method to obtain the surrounding ambient features along an existing communication optical cable. After being synthetically filtered and enhanced, DAS signals are analyzed for feature extraction, including traffic information and geographical features. The feasibility is verified on a 36 km section of a communication optical cable running along a road network. It is believed that the proposed method will assist researchers in determining the actual paths of existing optical cables and in geophysical signal analysis, and will greatly promote the application and development of DAS in fields with existing cables.
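A typical filtering step in such a DAS pipeline is a band-pass that keeps the traffic-induced vibration band and rejects out-of-band noise. The sampling rate, passband, and synthetic trace below are illustrative assumptions, not the paper's actual acquisition parameters.

```python
# Hypothetical DAS preprocessing sketch: zero-phase band-pass filtering
# of a single trace to enhance traffic-induced vibrations before
# feature extraction. All parameters are assumed for illustration.

import numpy as np
from scipy.signal import butter, sosfiltfilt

fs = 1000.0  # Hz, assumed DAS interrogation rate
sos = butter(4, [5.0, 80.0], btype="bandpass", fs=fs, output="sos")

t = np.arange(0, 1.0, 1 / fs)
trace = (np.sin(2 * np.pi * 30 * t)            # in-band "vehicle" signal
         + 0.5 * np.sin(2 * np.pi * 300 * t)   # out-of-band noise
         + 0.1 * np.random.default_rng(0).standard_normal(t.size))

filtered = sosfiltfilt(sos, trace)             # zero-phase band-pass
```

In practice the same filter would run over thousands of channels along the cable, after which features such as vehicle passage events are extracted from the enhanced traces.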
Our aim was to overcome the low evaluation accuracy of traditional random sampling methods for college students' mental health, and to exploit the value of big data on college students' social network behaviors in the prediction and evaluation of their mental health.
We monitored and evaluated college students' mental health through big data analysis. After generating samples of college students' social network behaviors, a mental health monitoring and evaluation model was established based on a support vector machine (SVM) and a decision tree (DT). Then, the DT model was pruned, and the input data of the model were optimized by a genetic algorithm (GA).
The optimal parameter combination was derived for our model: the maximum number of iterations was 60; the smallest number of samples needed for reclassifying internal nodes was 6; and the minimum number of samples per leaf node was 30. The mental health scores of most students fell in the interval [0, 6], indicating no obvious symptoms of mental crisis. The binary classification results of the models were as follows. On anxiety, all models surpassed an accuracy of 60%, except the traditional SVM; the optimal model, i.e., Model 5, achieved an accuracy of 86.7%. On depression, all models exceeded an accuracy of 60%, and the GA-optimized DT 5 achieved an accuracy as high as 83.1%. On drooping spirit, the optimal model, i.e., the GA-optimized DT 5, reached an accuracy of 89.5%, which is comparable to that of the GA-optimized SVM 4.
The characteristic dimensions extracted by GA are representative. The primary mental states of college students can be estimated quickly and accurately by our model, at a low data storage cost, through feature analysis of social network behaviors.
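A pruned decision-tree classifier with the reported node-size parameters can be sketched as follows. The synthetic features and labels are invented for illustration, and scikit-learn's cost-complexity pruning (`ccp_alpha`) stands in for the paper's GA-driven pruning and feature selection.

```python
# Minimal sketch of a pruned decision-tree classifier for binary
# mental-health screening. Data are synthetic; cost-complexity pruning
# substitutes for the paper's GA optimisation.

import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))                    # 5 synthetic behaviour features
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)    # synthetic binary "at risk" label

tree = DecisionTreeClassifier(
    min_samples_split=6,    # smallest node size for re-splitting (from the paper)
    min_samples_leaf=30,    # minimum samples per leaf (from the paper)
    ccp_alpha=0.01,         # pruning strength (illustrative, replaces GA tuning)
    random_state=0,
).fit(X, y)

accuracy = tree.score(X, y)
```

Constraining `min_samples_leaf` is what keeps the pruned tree small and interpretable, which matters when the model must run cheaply over large volumes of social network behaviour data.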
Environmental exposure assessment is an important step in establishing a list of local priority pollutants and finding the sources of the threats for proposing appropriate protection measures. Exposome targeted and non-targeted analysis as well as suspect screening may be applied to reveal these pollutants. The non-targeted screening is a challenging task and requires the application of the most powerful analytical tools available, assuring wide analytical coverage, sensitivity, identification reliability, and quantitation.
Moscow, Russia, is the largest and most rapidly growing European city. This rapid growth is causing changes in the environment which require periodic clarification of the real environmental situation regarding the presence of classic pollutants and possible new contaminants. Gas chromatography – high resolution time-of-flight mass spectrometry (GC-HR-TOFMS) with electron ionization (EI), positive chemical ionization (PCI), and electron capture negative ionization (ECNI) ion sources was used for the analysis of Moscow snow samples collected in the early spring of 2018 at nine different locations. Collection of snow samples represents an efficient approach for estimating long-term air pollution, owing to the accumulation and preservation of environmental contaminants by snow during the winter period. The high separation power of GC, complementary ionization methods, high mass accuracy, and wide mass range of TOFMS allowed for the identification of several hundred organic compounds belonging to various classes of pollutants, exposure to which could represent a danger to the health of the population. Although quantitative analysis was not a primary aim of the study, targeted analysis revealed that some priority pollutants exceeded the established safe levels. Thus, the dibutyl phthalate concentration was over 10-fold higher than its safe level (0.001 mg/L), while the benzo[a]pyrene concentration exceeded the Russian maximal permissible concentration of 5 ng/L in three samples. The large amount of information generated by the combination of targeted and non-targeted analysis and suspect screening of samples makes it feasible to apply big data analysis to observe trends and tendencies in the pollution exposome across the city.
•Complementary ionization tools increase the scope and reliability of environmental analysis.•Over 500 environmental pollutants were identified in Moscow snow samples.•Electron capture negative ionization helps detect halogenated and nitro compounds.•Accurate mass measurements increase the reliability of non-targeted screening.
The article proposes a method for representing the economic process as a module for closing economic cycles. Such a representation is relevant for the purposes of compiling a digital map of the economic process, which can be further adjusted and supplemented with new parameters depending on the state of the external environment of the economic agent. The proposed model is universal and serves as a framework; that is, it can be used to represent any economic process in digital format without restrictions. The model can be extended by increasing the number of threads in each of the technological cycles, with each thread having its own technological process. The practical result of the study is a methodology that allows visualizing economic processes within a single roadmap for the entire planning horizon of the production process.