Nowadays, gigantic crowd-sourced data from mobile devices have become widely available in social networks, enabling many important data mining applications that improve the quality of our daily lives. While providing tremendous benefits, the release of crowd-sourced social network data to the public poses considerable threats to mobile users' privacy. In this paper, we investigate the problem of real-time spatio-temporal data publishing in social networks with privacy preservation. Specifically, we consider continuous publication of population statistics and design RescueDP, an online aggregate monitoring framework over infinite streams with a w-event privacy guarantee. Its key components, including adaptive sampling, adaptive budget allocation, dynamic grouping, perturbation and filtering, are seamlessly integrated as a whole to provide privacy-preserving statistics publishing over infinite timestamps. Moreover, we propose an enhanced RescueDP with neural networks to accurately predict the values of statistics and improve the utility of released data. Both RescueDP and the enhanced RescueDP are proved to satisfy w-event privacy. We evaluate the proposed schemes on real-world as well as synthetic datasets and compare them with two representative w-event privacy-assured methods. Experimental results show that the proposed schemes outperform the existing methods and improve the utility of real-time data sharing with strong privacy guarantees.
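The perturbation and budget-allocation steps the abstract mentions can be sketched with the standard Laplace mechanism. This is an illustrative sketch, not the authors' RescueDP: it uses a uniform per-timestamp budget ε/w over a window of w timestamps (sensitivity 1), which RescueDP's adaptive allocation refines; all function names are placeholders.

```python
import math
import random

def uniform_budget_allocation(epsilon, w):
    """Split the total budget epsilon uniformly over a sliding window of w timestamps."""
    return [epsilon / w] * w

def laplace_noise(scale, rng):
    # Inverse-CDF sampling of a zero-mean Laplace(scale) variate.
    u = rng.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def perturb_counts(counts, epsilon, w, seed=0):
    """Add Laplace noise with per-timestamp budget eps_t = epsilon / w.

    For count queries (sensitivity 1) the noise scale is 1 / eps_t, the
    textbook Laplace-mechanism choice for epsilon-differential privacy.
    """
    rng = random.Random(seed)
    eps_t = epsilon / w
    return [c + laplace_noise(1.0 / eps_t, rng) for c in counts]
```

Within any window of w timestamps the per-timestamp budgets sum to ε, which is the accounting property behind a w-event guarantee.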
Many recent articles have found that atheoretical forecasting methods using many predictors give better predictions for key macroeconomic variables than various small-model methods. The practical relevance of these results is open to question, however, because these articles generally use ex post revised data not available to forecasters and because no comparison is made to best actual practice. We provide some evidence on both of these points using a new large dataset of vintage data synchronized with the Fed's Greenbook forecast. This dataset consists of a large number of variables as observed at the time of each Greenbook forecast since 1979. We compare real-time, large-dataset predictions to both simple univariate methods and the Greenbook forecast. For inflation, we find that univariate methods are dominated by the best atheoretical large-dataset methods and that these, in turn, are dominated by Greenbook. For GDP growth, in contrast, we find that once one takes account of Greenbook's advantage in evaluating the current state of the economy, neither large-dataset methods nor the Greenbook process offers much advantage over a univariate autoregressive forecast.
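The univariate autoregressive benchmark the abstract refers to can be sketched in a few lines. This is a generic AR(1) fit by least squares without an intercept, not the paper's specification; it only illustrates the kind of simple baseline against which large-dataset methods are compared.

```python
def fit_ar1(series):
    """Least-squares estimate of phi in x_t = phi * x_{t-1} + e_t (no intercept)."""
    num = sum(series[t] * series[t - 1] for t in range(1, len(series)))
    den = sum(series[t - 1] ** 2 for t in range(1, len(series)))
    return num / den

def forecast_ar1(series, horizon):
    """Iterate the fitted AR(1) forward to produce multi-step-ahead forecasts."""
    phi = fit_ar1(series)
    preds, last = [], series[-1]
    for _ in range(horizon):
        last = phi * last
        preds.append(last)
    return preds
```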
Real-time data access based on wireless sensor networks (WSNs) is an indispensable and essential foundation for real-time control of industrial processes, especially in the smart industry. However, the transmission of real-time data between participants faces many security issues in WSNs, such as sensitive data leakage and unauthorized access. Many researchers have tried to solve this problem, but no secure and efficient schemes exist: security schemes based on public key cryptography have excessive computation and communication costs, while non-public-key security schemes are vulnerable to various security attacks. We design a secure and efficient anonymous authenticated key agreement scheme that balances security and efficiency by combining a physical unclonable function (PUF) and elliptic curve cryptography (ECC). Based on the PUF, it uses only simple cryptographic calculations to completely resist identity impersonation attacks. It also achieves the consistency and randomness of the session key with only a small number of ECC operations and can resist attacks on the session key. We present a formal security analysis based on Mao-Boyd logic and demonstrate that our scheme satisfies the security requirements for real-time data access in smart industry networks. Performance analysis shows that our scheme has a higher security level and clear advantages in terms of computation and communication cost.
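The key-agreement principle behind such schemes can be illustrated with classic Diffie-Hellman: each side combines its private value with the peer's public value and both derive the same session key. This is a toy sketch only, not the authors' PUF/ECC protocol; the Python standard library has no ECC, so a multiplicative group with a toy modulus stands in, and real deployments would use standardized groups or curves.

```python
import hashlib
import secrets

# Toy modulus for illustration only: the Mersenne prime 2**127 - 1.
P = (1 << 127) - 1
G = 3

def keypair():
    """Generate a private exponent and the matching public value G^priv mod P."""
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

def session_key(priv, peer_pub):
    """Derive the shared session key: hash of (peer_pub)^priv mod P."""
    shared = pow(peer_pub, priv, P)
    return hashlib.sha256(
        shared.to_bytes((shared.bit_length() + 7) // 8, "big")
    ).hexdigest()
```

Both parties compute G^(ab) mod P, so hashing it yields identical session keys on both sides, which is the consistency property the abstract refers to.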
The paper proposes a modelling framework and evaluation procedure to judge the usefulness, in forecasting, of real-time data sets incorporating past data vintages and survey expectations. The analysis is based on ‘metamodels’ obtained by using model averaging techniques and judged by various statistical and economic criteria, including a novel criterion based on a fair bet. Analysing US output data over 1968Q4–2015Q1, we find that both elements of the real-time data are useful, with their contributions varying over time. Revisions data are particularly valuable for point and density forecasts of growth, but survey expectations are important in forecasting rare recessionary events.
Traditionally, the data in a data warehouse is not updated with every transaction. Retail information systems, however, require the latest data, accessible from anywhere, for business analysis. This study therefore builds a data warehouse model that produces information in near real time and can be accessed from anywhere by end-user applications. The design of a near-real-time data warehouse (NRTDWH) integrated with a service-oriented architecture (SOA) to support a retail information system is carried out in two stages. In the first stage, the NRTDWH is modelled using transaction-log-based Change Data Capture (CDC). In the second stage, the integration of the NRTDWH with SOA-based web services is modelled. Tests were conducted with simulation test applications: a retail information system together with web-based, desktop and mobile web service clients. The results of this study are that (1) the CDC-based ETL captures changes to the source tables and stores them in the NRTDWH database with the help of a scheduler, and (2) the web service middleware exposes 6 services based on the data contained in the NRTDWH database, each of which is accessible to and implemented by the web service clients.
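The CDC step described in result (1) can be sketched as polling a transaction log for entries past the last processed sequence number and applying them to the warehouse tables. This is a minimal in-memory sketch, not the study's implementation; the record layout (`seq`, `table`, `op`, `key`, `row`) is an assumed format for illustration.

```python
def capture_changes(transaction_log, last_seq):
    """Change data capture: return log entries newer than last_seq."""
    return [e for e in transaction_log if e["seq"] > last_seq]

def apply_to_warehouse(warehouse, changes):
    """Apply captured INSERT/UPDATE/DELETE records to the warehouse tables.

    Returns the highest sequence number applied, which a scheduler would
    persist as the new checkpoint for the next polling cycle.
    """
    for e in changes:
        table = warehouse.setdefault(e["table"], {})
        if e["op"] in ("INSERT", "UPDATE"):
            table[e["key"]] = e["row"]
        elif e["op"] == "DELETE":
            table.pop(e["key"], None)
    return max((e["seq"] for e in changes), default=None)
```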
Research on cyborg intelligent insects requires the experiment platform to collect data from distributed sensing modules and process multi-modal signals in real time. To overcome these issues, we present a novel data model, named the Hybrid-Synchronized Real-time (HSR) data model, which synchronizes the hybrid raw data channels through a metadata integration method and processes them with a fixed-priority scheduling algorithm. Real animal experiments show that the insect-machine experiment platform based on the HSR data model fully satisfies the requirements of cyborg intelligent insect research, especially with regard to the technical challenges of data synchronization, real-time processing and hybrid data integration. It provides an efficient approach to implementing the experiment platform and thus aids frontier research on cyborg insects.
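The fixed-priority scheduling idea can be sketched with a heap-backed queue in which each channel has a static priority and samples are drained highest-priority first. This is an illustrative sketch, not the HSR model itself; the channel names and priorities are invented for the example.

```python
import heapq

class FixedPriorityScheduler:
    """Process hybrid data channels in a fixed priority order (lower value = higher priority)."""

    def __init__(self, channel_priority):
        self.channel_priority = channel_priority
        self._queue = []
        self._seq = 0  # tie-breaker keeps FIFO order within one priority level

    def submit(self, channel, sample):
        prio = self.channel_priority[channel]
        heapq.heappush(self._queue, (prio, self._seq, channel, sample))
        self._seq += 1

    def drain(self):
        """Pop all pending samples in priority order."""
        order = []
        while self._queue:
            _, _, channel, sample = heapq.heappop(self._queue)
            order.append((channel, sample))
        return order
```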
Radar technology for health continues to be developed; this research contributes by focusing on respiratory patients. It covers not only FMCW radar analysis for medical use but also flexibility in displaying data that can be integrated with the smartphone devices many people use today. An MQTT server was installed and successfully displays real-time patient respiratory data. In the future, this setup will be installed in a permanent casing that is more fixed and stable in the connectivity between devices, involving a Raspberry Pi 4 B and an OmniPreSense radar. The patient's respiratory data is integrated into the MQTT server and presented as real-time respiratory data graphs.
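In an MQTT pipeline like the one described, the radar host typically publishes small JSON messages that the dashboard subscribes to. The sketch below only builds such a payload; the topic layout, field names and `device_id` are assumptions for illustration, and in practice a client library such as paho-mqtt would publish it to the broker.

```python
import json
import time

def respiration_payload(breaths_per_min, device_id="radar-01"):
    """Build the JSON message a client might publish to a topic like
    'patients/<id>/respiration' (topic scheme and fields are hypothetical)."""
    return json.dumps({
        "device": device_id,
        "breaths_per_min": breaths_per_min,
        "timestamp": int(time.time()),
    })
```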
In this paper, we study important performance issues in using the purging-range query to reclaim old data versions as free blocks in a flash-based multi-version database. To reduce the overhead of using the purging-range query in garbage collection, the physical block labeling (PBL) scheme is proposed to provide a better estimate of the purging version number used for purging old data versions. With the frequency-based placement (FBP) scheme to place data versions within a block, garbage collection can be made more efficient still by increasing the deadspans of data versions and reducing reallocation cost, especially when the flash memory space available to the databases is limited.
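The purging step that PBL and FBP optimize can be sketched as follows: given a purging version number, every version older than it is reclaimable, and a block whose versions are all reclaimable becomes a free block. This is a minimal sketch under that assumption, not the paper's schemes.

```python
def purge_old_versions(blocks, purging_version):
    """Discard versions older than purging_version.

    blocks maps block id -> list of version numbers stored in that block.
    Returns the ids of blocks freed entirely (the payoff of the purging-range
    query in garbage collection).
    """
    freed = []
    for block_id, versions in blocks.items():
        live = [v for v in versions if v >= purging_version]
        if live:
            blocks[block_id] = live
        else:
            freed.append(block_id)
    for block_id in freed:
        del blocks[block_id]
    return freed
```

Grouping versions that die together in the same block (as FBP aims to do) maximizes the number of blocks that fall into the fully-purgeable case.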
Real Time Sales Data Analysis. Preethi, M.; Kanna, A. Rajesh. 2023 9th International Conference on Advanced Computing and Communication Systems (ICACCS), 17 March 2023, Volume 1, Conference Proceeding.
Nowadays the data generated is growing further due to the increase in the information collected. It offers insight into top-performing and under-performing products and services, selling issues and demand opportunities, sales forecasts and revenue-generating sales activities. Performing a sales data analysis for your firm will help you understand which products your customers are buying and why they behave the way they do. We must conduct sales analyses for all the products we sell at regular intervals. Through the use of various machine learning models, sales analysis is carried out efficiently; machine learning therefore plays a vital part in sales data analysis. With a sales analysis it becomes much easier to highlight your most profitable customers, and keeping these customers engaged with your business can be the key to increasing overall profitability.
The aim of this paper is to analyze the performance of alternative methods for forecasting the index of industrial production in Italy from 1 to 3 months ahead. We use twelve different models, from simple ARIMA models to dynamic factor models exploiting the timely information of up to 110 short-term indicators, both qualitative and quantitative. This allows us to assess the relevance for forecasting practice of alternative combinations of data types (real-time and latest available), estimation methods and periods. Out-of-sample predictive ability tests stress the relevance of using more indicators in disaggregate models over sample periods covering a complete business cycle (about 7 years in Italy). Our findings downgrade the emphasis on both the estimation method and data revision issues. In line with the classical “average puzzle”, the use of simple averages of alternative forecasts often improves on the predictive ability of their single components, mainly over short horizons. Finally, selected-indicator and factor-based models always perform significantly better than ARIMA models, suggesting that the short-run indicator signal always dominates the noise component. In this regard, selected-indicator models can further increase the amount of signal extracted, improving the short-run predictive ability of factor-based models by up to 30–40% and forecast-encompassing them.
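The “average puzzle” the abstract mentions - that an equal-weight average of forecasts often beats its components - can be shown in a few lines. This is a generic illustration, not the paper's twelve-model exercise; the toy numbers are invented.

```python
def average_forecast(forecasts):
    """Combine alternative model forecasts by a simple equal-weight average per horizon."""
    return [sum(step) / len(step) for step in zip(*forecasts)]

def mean_absolute_error(pred, actual):
    """Mean absolute error over the forecast horizon."""
    return sum(abs(p - a) for p, a in zip(pred, actual)) / len(pred)
```

With two models whose errors offset each other, the combination's MAE is lower than either component's, which is the essence of the puzzle.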