Influence maximization is a fundamental problem in social network analysis. This problem refers to the identification of a set of influential users as initial spreaders to maximize the spread of a message in a network. When such a message is spread, some users may be influenced by it. A common assumption of existing work is that the impact of a message is essentially binary: a user is either influenced (activated) or not influenced (non-activated). However, how strongly a user is influenced by a message may play an important role in this user's attempt to influence subsequent users and spread the message further; by ignoring this, existing methods may fail to model the spreading process accurately and to identify influential users. In this article, we propose a novel approach to model a social network as a fuzzy graph, where a fuzzy variable represents the extent to which a user is influenced by a message (the user's activation level). By extending a diffusion model to simulate the spreading process in such a fuzzy graph, we formulate the fuzzy influence maximization problem, for which three methods are proposed to identify influential users. Experimental results demonstrate the accuracy of the proposed methods in determining influential users in social networks.
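The idea of continuous activation levels can be illustrated with a minimal sketch of an independent-cascade-style simulation in which a node's probability of influencing its neighbours is scaled by how strongly it was itself influenced. This is an illustrative simplification, not the paper's exact diffusion model; the function name, the scaling rule, and the range from which new activation levels are drawn are all assumptions.

```python
import random

def fuzzy_cascade(graph, seeds, edge_prob, rng=random.Random(0)):
    """One simulation of a fuzzy independent-cascade spread (illustrative).

    graph:     dict mapping node -> list of neighbour nodes
    seeds:     initial spreaders, activated at level 1.0
    edge_prob: base influence probability per edge
    Returns a dict node -> activation level in [0, 1].
    """
    level = {s: 1.0 for s in seeds}
    frontier = list(seeds)
    while frontier:
        nxt = []
        for u in frontier:
            for v in graph.get(u, []):
                if v in level:
                    continue
                # The chance of influencing v is scaled by how strongly
                # u itself was influenced (the fuzzy activation level).
                if rng.random() < edge_prob * level[u]:
                    # New activation level decays with the activator's
                    # level (the [0.5, 1.0] factor is an assumption).
                    level[v] = rng.uniform(0.5, 1.0) * level[u]
                    nxt.append(v)
        frontier = nxt
    return level
```

Averaging such simulations over many runs would estimate the expected total activation of a seed set, which is the quantity a fuzzy influence maximization method would aim to maximize.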
Abstract Purpose Ever since National Health Insurance was introduced in 1995, coverage has increased from 50-60% to over 96% of the population, with a continuous satisfaction rating of about 70%. However, the premium accounted for 5.77% of GDP in 2001 and the Bureau of National Health Insurance faced pressing financial difficulties, so it reformed its payment systems, such as fee-for-service, capitation, case payment and the global budget system, in order to control rising medical costs. Since this change in health insurance policy, most hospitals have attempted to reduce their operating expenses and improve efficiency. Introducing an electronic logistics information system is one way of reducing the costs of the central warehouse and the nursing stations. Hence, the study proposes a technology acceptance research model and examines how nurses' acceptance of the e-logistics information system has been affected in the medical industry. Methods This research combines innovation diffusion theory and the technology acceptance model, adding two research parameters, trust and perceived financial cost, to propose a new hybrid technology acceptance model. Taking Taiwan's medical industry as an experimental example, this paper studies nurses' acceptance of the electronic logistics information system. The structural equation modeling technique was used to evaluate the causal model, and confirmatory factor analysis was performed to examine the reliability and validity of the measurement model. Results The results of the survey strongly support the new hybrid technology acceptance model in predicting nurses' intention to use the electronic logistics information system. Conclusion The study shows that 'compatibility', 'perceived usefulness', 'perceived ease of use', and 'trust' all have a strong positive influence on 'behavioral intention to use'. 'Perceived financial cost', on the other hand, has a strong negative influence on behavioral intention to use.
In this study, we analyse the value creation of Industry 4.0 (I40) technologies in flexible manufacturing (FM) under a sustainability perspective. I40 is a popular strategy that Western manufacturing organizations adopt to face competition from low-cost producers. Organizations adopting I40 use advanced digital technologies to make production processes more flexible and increasingly automated. Considerable evidence confirms that I40 leads to higher productivity and higher-quality products, improving the economic performance of organizations. However, increasing automation may also reduce human labour in the production process, which may contribute to the disappearance of jobs, the reduction of expertise and the loss of know-how in manufacturing organizations. While the literature acknowledges the technical and economic advantages of I40, the sustainability of the value created through these technologies deserves further investigation. To address this gap, we complement IT value theory with the concept of sustainability, including the three dimensions of economic, environmental and social sustainability. We perform a multiple case study analysis of four Italian manufacturing organizations that have successfully implemented I40 technologies in FM. The cases show that I40 technologies support sustainable organizational value when they are deployed with a worker-centric approach. In this condition, the organization leverages workforce activities to continuously fine-tune the technologies and exploits the adaptive features of the technologies to continuously improve processes.
Employees' failure to comply with information systems security policies is a major concern for information technology security managers. In efforts to understand this problem, IS security researchers have traditionally viewed violations of IS security policies through the lens of deterrence theory. In this article, we show that neutralization theory, a theory prominent in criminology but not yet applied in the context of IS, provides a compelling explanation for IS security policy violations and offers new insight into how employees rationalize this behavior. In doing so, we propose a theoretical model in which the effects of neutralization techniques are tested alongside those of the sanctions described by deterrence theory. Our empirical results highlight neutralization as an important factor to take into account when developing and implementing organizational security policies and practices.
Existing systems dealing with the increasing volume of data series cannot guarantee interactive response times, even for fundamental tasks such as similarity search. Therefore, it is necessary to develop analytic approaches that support exploration and decision making by providing progressive results, before the final and exact ones have been computed. Prior works lack both efficiency and accuracy when applied to large-scale data series collections. We present and experimentally evaluate a new probabilistic learning-based method that provides quality guarantees for progressive Nearest Neighbor (NN) query answering. We provide both initial and progressive estimates of the final answer that improve as the similarity search proceeds, as well as suitable stopping criteria for the progressive queries. Experiments with synthetic and diverse real datasets demonstrate that our prediction methods constitute the first practical solution to the problem, significantly outperforming competing approaches.
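The notion of progressive answers with a stopping criterion can be sketched as follows. The sketch below yields each improved best-so-far 1-NN answer during a linear scan and stops early once no improvement has been seen for a fixed number of candidates; this patience heuristic is a stand-in for the paper's learned, probabilistic stopping criteria, and the function name and parameters are assumptions.

```python
import math

def progressive_nn(query, collection, patience=100):
    """Progressive 1-NN scan (illustrative sketch).

    Yields (best_series, best_distance) each time the answer improves,
    and stops early once `patience` consecutive candidates bring no
    improvement -- a heuristic stand-in for learned stopping criteria.
    """
    def dist(a, b):
        # Euclidean distance between two equal-length series.
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    best, best_d, since_improved = None, float("inf"), 0
    for series in collection:
        d = dist(query, series)
        since_improved += 1
        if d < best_d:
            best, best_d, since_improved = series, d, 0
            yield best, best_d          # progressive estimate of the answer
        if since_improved >= patience:
            break                        # heuristic early termination
```

A caller can consume the generator to display intermediate answers, with the final yielded pair serving as the (possibly approximate) result if the scan stops early.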
A fundamental problem in data mining is to effectively build robust classifiers in the presence of skewed data distributions. Class imbalance classifiers are trained specifically for skewed-distribution datasets. Existing methods assume an ample supply of training examples as a fundamental prerequisite for constructing an effective classifier. However, when sufficient data are not readily available, developing a representative classification algorithm becomes even more difficult due to the unequal distribution between classes. We provide a unified framework that takes advantage of auxiliary data through a transfer learning mechanism while simultaneously building a robust classifier to tackle this imbalance issue in the presence of few training samples in a particular target domain of interest. Transfer learning methods use auxiliary data to augment learning when training examples are insufficient, and in this paper we develop a method that is optimized to simultaneously augment the training data and induce balance into skewed datasets. We propose a novel boosting-based instance transfer classifier with a label-dependent update mechanism that simultaneously compensates for class imbalance and incorporates samples from an auxiliary domain to improve classification. We provide theoretical and empirical validation of our method and apply it to healthcare and text classification applications.
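The flavour of a label-dependent, transfer-aware boosting update can be sketched with a single weight-update step in the style of TrAdaBoost: misclassified auxiliary (source) instances are down-weighted as apparently irrelevant to the target task, while misclassified target instances are up-weighted, with an extra factor for the minority class to counter imbalance. The constants and the exact update rule here are illustrative assumptions, not the paper's method.

```python
import math

def update_weights(weights, is_source, y_true, y_pred, error, minority_label=1):
    """One boosting-round weight update in a TrAdaBoost-style transfer
    learner with a label-dependent twist (illustrative sketch).

    weights:   current instance weights
    is_source: True for auxiliary-domain instances, False for target
    error:     weighted error of the current weak learner, in (0, 0.5)
    """
    beta_t = error / (1.0 - error)           # AdaBoost factor for target data
    beta_s = 1.0 / (1.0 + math.sqrt(2.0))    # source decay factor (assumed)
    new = []
    for w, src, yt, yp in zip(weights, is_source, y_true, y_pred):
        wrong = yt != yp
        if src:
            # Fade source instances the weak learner gets wrong:
            # they appear unrepresentative of the target domain.
            w *= beta_s if wrong else 1.0
        elif wrong:
            # Up-weight misclassified target instances, more so for
            # the minority class (the 2.0 factor is an assumption).
            boost = 2.0 if yt == minority_label else 1.0
            w *= (1.0 / beta_t) * boost
        new.append(w)
    total = sum(new)
    return [w / total for w in new]          # renormalise to sum to 1
```

Iterating this update across boosting rounds would simultaneously drain weight from unhelpful auxiliary samples and concentrate it on hard minority-class target samples.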
The aim of process discovery, originating from the area of process mining, is to discover a process model based on business process execution data. The majority of process discovery techniques rely on an event log as input: a static source of historical data capturing the execution of a business process. In this paper, we focus on process discovery relying on online streams of business process execution events. Learning process models from event streams poses both challenges and opportunities, i.e. we need to handle unlimited amounts of data using finite memory and, preferably, constant time. We propose a generic architecture that allows for adopting several classes of existing process discovery techniques in the context of event streams. Moreover, we provide several instantiations of the architecture, accompanied by implementations in the process mining toolkit ProM (http://promtools.org). Using these instantiations, we evaluate several dimensions of stream-based process discovery. The evaluation shows that the proposed architecture allows us to lift process discovery to the streaming domain.
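The finite-memory requirement can be illustrated with a minimal event-stream abstraction: a directly-follows counter over a sliding window of the most recent events, which many discovery algorithms can then consume to produce a model. This is a simplified instance of the kind of abstraction the generic architecture supports, not ProM's implementation; the class name and window policy are assumptions.

```python
from collections import Counter, deque

class StreamingDirectlyFollows:
    """Finite-memory stream abstraction for process discovery (sketch).

    Maintains directly-follows counts over a sliding window of the last
    `window` events, so memory stays bounded no matter how long the
    stream runs. A discovery algorithm can read `df` at any time.
    """

    def __init__(self, window=1000):
        self.window = window
        self.events = deque()   # directly-follows relation per event (or None)
        self.last = {}          # case id -> last activity observed
        self.df = Counter()     # (a, b) -> count of "a directly followed by b"

    def observe(self, case, activity):
        rel = None
        if case in self.last:
            rel = (self.last[case], activity)
            self.df[rel] += 1
        self.last[case] = activity
        self.events.append(rel)
        if len(self.events) > self.window:
            old = self.events.popleft()
            if old is not None:
                self.df[old] -= 1          # forget relations outside the window
                if self.df[old] == 0:
                    del self.df[old]
```

Note that `last` is not evicted here; a production version would also bound the number of tracked cases, e.g. with a lossy-counting scheme.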
An ever-growing variety of smart, connected Internet of Things (IoT) devices poses completely new challenges for businesses regarding security and privacy. In fact, the adoption of smart products may depend on the ability of organizations to offer systems that ensure adequate sensor data integrity while guaranteeing sufficient user privacy. In light of these challenges, previous research indicates that blockchain technology could be a promising means to mitigate issues of data security arising in the IoT. Building upon the existing body of knowledge, we propose a design theory, including requirements, design principles, and features, for a blockchain-based sensor data protection system (SDPS) that leverages data certification. To support this, we designed and developed an instantiation of an SDPS (CertifiCar) in three iterative cycles intended to prevent the fraudulent manipulation of car mileage data. Following the explication of our SDPS, we provide an ex post evaluation of our design theory considering CertifiCar and two additional use cases in the areas of pharmaceutical supply chains and energy microgrids. Our results suggest that the proposed design ensures the tamper-resistant gathering, processing, and exchange of IoT sensor data in a privacy-preserving, scalable, and efficient manner.
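The core tamper-resistance property behind such a system can be sketched with a minimal hash chain over sensor readings: each entry commits to its predecessor's hash, so retroactively altering any reading (e.g. rolling back a mileage value) invalidates every later hash. This is a bare illustration of the certification idea, not CertifiCar's design; the function names and record structure are assumptions.

```python
import hashlib
import json

def append_reading(chain, reading):
    """Append a sensor reading to a tamper-evident hash chain (sketch)."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    # Hash a canonical serialisation of the reading plus the previous hash.
    body = json.dumps({"reading": reading, "prev": prev}, sort_keys=True)
    chain.append({"reading": reading, "prev": prev,
                  "hash": hashlib.sha256(body.encode()).hexdigest()})
    return chain

def verify(chain):
    """Recompute every hash; returns False if any entry was tampered with."""
    prev = "0" * 64
    for entry in chain:
        body = json.dumps({"reading": entry["reading"], "prev": prev},
                          sort_keys=True)
        if entry["prev"] != prev or \
                hashlib.sha256(body.encode()).hexdigest() != entry["hash"]:
            return False
        prev = entry["hash"]
    return True
```

A blockchain-based SDPS adds distribution and consensus on top of this basic structure, so that no single party can rewrite the chain even with access to the storage.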