The rise of machine learning increases current computing capabilities and paves the way for novel disruptive applications. In the current era of big data, the application of image retrieval technology to large-scale data is a popular research area. To ensure the robustness and security of digital image watermarking, we propose a novel algorithm using synergetic neural networks. The algorithm first processes a meaningful gray watermark image, then embeds it as a watermark signal into the block Discrete Cosine Transform (DCT) component. The companion algorithm for detection and extraction of the watermark uses a cooperative neural network, where the suspected watermark signal is used as the input and the output consists of the result of the recognition process. The simulation experiments show that the algorithm can complete certain image processing operations with improved performance, not only simultaneously completing watermark detection and extraction, but also efficiently determining the watermark attribution. Compared with other state-of-the-art models, the proposed model obtains an optimal Peak Signal-to-Noise Ratio (PSNR).
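The PSNR figure used in the comparison above is a standard imperceptibility metric: the log-ratio of the maximum pixel value to the mean squared error between the original and watermarked images. A minimal sketch in plain Python (the nested-list image representation and sample values are illustrative, not from the paper):

```python
import math

def psnr(original, watermarked, max_value=255.0):
    """Peak Signal-to-Noise Ratio between two equally sized 8-bit images.

    Both images are nested lists of pixel intensities (rows of numbers).
    A higher PSNR means the embedded watermark is less perceptible.
    """
    mse = 0.0
    count = 0
    for row_a, row_b in zip(original, watermarked):
        for a, b in zip(row_a, row_b):
            mse += (a - b) ** 2
            count += 1
    mse /= count
    if mse == 0:
        return float("inf")  # identical images: no distortion at all
    return 10.0 * math.log10(max_value ** 2 / mse)

# Example: a tiny 2x2 "image" lightly perturbed by an embedding step.
print(round(psnr([[100, 100], [100, 100]], [[101, 100], [100, 99]]), 2))  # → 51.14
```

Watermarking papers typically treat PSNR above roughly 40 dB as visually transparent, which is why small DCT-coefficient perturbations are preferred over spatial-domain ones.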
A Survey of Deep Active Learning
Ren, Pengzhen; Xiao, Yun; Chang, Xiaojun; et al.
ACM Computing Surveys, December 2022, Volume 54, Issue 9
Journal Article
Peer reviewed
Open access
Active learning (AL) attempts to maximize a model's performance gain while annotating the fewest samples possible. Deep learning (DL) is greedy for data and requires a large amount of data supply to optimize a massive number of parameters if the model is to learn how to extract high-quality features. In recent years, due to the rapid development of internet technology, we have entered an era of information abundance characterized by massive amounts of available data. As a result, DL has attracted significant attention from researchers and has been rapidly developed. Compared with DL, however, researchers have shown relatively low interest in AL. This is mainly because, before the rise of DL, traditional machine learning required relatively few labeled samples, meaning that early AL was rarely accorded the value it deserves. Although DL has made breakthroughs in various fields, most of this success is due to a large number of publicly available annotated datasets. However, the acquisition of a large number of high-quality annotated datasets consumes a lot of manpower, making it unfeasible in fields that require high levels of expertise (such as speech recognition, information extraction, medical imaging, etc.). Therefore, AL is gradually coming to receive the attention it is due.
It is therefore natural to investigate whether AL can be used to reduce the cost of sample annotation while retaining the powerful learning capabilities of DL. As a result of such investigations, deep active learning (DeepAL) has emerged. Although research on this topic is quite abundant, there has not yet been a comprehensive survey of DeepAL-related works; accordingly, this article aims to fill this gap. We provide a formal classification method for the existing work, along with a comprehensive and systematic overview. In addition, we also analyze and summarize the development of DeepAL from an application perspective. Finally, we discuss the confusion and problems associated with DeepAL and provide some possible development directions.
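The pool-based cycle underlying most DeepAL methods (train on the labeled set, score the unlabeled pool by uncertainty, query the oracle for the most uncertain sample, repeat) can be sketched generically. The 1-D data, the threshold "classifier" standing in for a deep model, and the simulated oracle below are all illustrative assumptions, not from the survey:

```python
# Toy pool: 1-D points; hidden labels (1 iff x > 5.0) simulate a human oracle.
pool = [(x / 10.0, int(x / 10.0 > 5.0)) for x in range(101)]
labeled = [pool.pop(0), pool.pop(-1)]  # tiny seed set: one example per class

def train(labeled):
    """'Train' a threshold classifier: the midpoint between the class means.

    A stand-in for retraining a deep model on the current labeled set.
    """
    m0 = [x for x, y in labeled if y == 0]
    m1 = [x for x, y in labeled if y == 1]
    return (sum(m0) / len(m0) + sum(m1) / len(m1)) / 2.0

def uncertainty(x, threshold):
    """Least-confidence scoring: points near the decision boundary score highest."""
    return -abs(x - threshold)

for _ in range(10):  # annotation budget: 10 oracle queries
    threshold = train(labeled)
    # Query the most uncertain unlabeled sample; "ask the oracle" = reveal its label.
    i = max(range(len(pool)), key=lambda j: uncertainty(pool[j][0], threshold))
    labeled.append(pool.pop(i))

threshold = train(labeled)
print(round(threshold, 2))  # converges toward the true boundary at 5.0
```

With only 12 labels out of 101 candidates, the queries cluster around the decision boundary, which is exactly the annotation saving AL promises; DeepAL methods replace the toy scorer with softmax entropy, ensembles, or learned loss prediction.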
The Internet of Things (IoT) has recently emerged as a revolutionary communication paradigm where a large number of objects and devices are closely interconnected to enable smart industrial environments. The tremendous growth of visual sensors can significantly promote traffic situational awareness, traffic safety management, and intelligent vehicle navigation in intelligent transportation systems (ITSs). However, due to the absorption and scattering of light by the turbid medium in the atmosphere, the visual IoT inevitably suffers from imaging quality degradation, e.g., contrast reduction, color distortion, etc. This negative impact can not only reduce imaging quality but also bring challenges for the deployment of several high-level vision tasks (e.g., object detection, tracking, recognition, etc.) in the ITS. To improve imaging quality in hazy environments, we propose a deep network-enabled three-stage dehazing network (termed TSDNet) for promoting the visual IoT-driven ITS. In particular, the proposed TSDNet mainly contains three parts, i.e., a multiscale attention module for estimating the haze distribution in the RGB image domain, a two-branch extraction module for learning the haze features, and a multifeature fusion module for integrating all characteristic information and reconstructing the haze-free image. Numerous experiments have been implemented on synthetic and real-world imaging scenarios. The dehazing results illustrate that our TSDNet remarkably outperforms several state-of-the-art methods in terms of both qualitative and quantitative evaluations. The high-accuracy object detection results have also demonstrated the superior dehazing performance of TSDNet under hazy atmospheric conditions. The source code is available at https://github.com/gy65896/TSDNet.
Smart Grid 2.0 is the energy Internet based on advanced metering infrastructure and distributed systems that require an instantaneous two-way flow of energy information. Edge computing benefits from its proximity to the servers and edge nodes of the smart grid's distributed systems, which can provide efficient and low-latency information transmission to the smart grid. With the massive number of Internet of Things devices in use, the amount of real-time power usage information they generate represents a huge challenge for edge computing. To improve the efficiency of information transmission and processing in power systems, this article combines different deep learning algorithms with edge computing to analyze and process distributed renewable energy generation and consumer power data in smart microgrids. Experiments on two real-world datasets from China and Belgium show that the proposed framework can obtain satisfactory prediction accuracy compared to existing approaches.
Secure integration of IoT and Cloud Computing
Stergiou, Christos; Psannis, Kostas E.; Kim, Byung-Gyu; et al.
Future Generation Computer Systems, January 2018, Volume 78
Journal Article
Peer reviewed
Mobile Cloud Computing is a new technology which refers to an infrastructure where both data storage and data processing operate outside of the mobile device. Another recent technology is the Internet of Things (IoT), which is growing rapidly in the field of telecommunications; more specifically, IoT relates to wireless telecommunications. The main goal of the interaction and cooperation between things and objects, which communicate through wireless networks, is to fulfill the objective set for them as a combined entity. In addition, both technologies, Cloud Computing and the Internet of Things, are developing rapidly in the field of wireless communications. In this paper, we present a survey of IoT and Cloud Computing with a focus on the security issues of both technologies. Specifically, we combine the two aforementioned technologies (i.e., Cloud Computing and IoT) in order to examine their common features and to discover the benefits of their integration. In conclusion, we present the contribution of Cloud Computing to IoT technology, showing how Cloud Computing improves the functioning of the IoT. Finally, we survey the security challenges of the integration of IoT and Cloud Computing.
•Presentation of IoT and Cloud technologies with a focus on security issues.
•Integration benefits of Internet of Things and Cloud Computing technologies.
•Part of AES presented for improvement of the security issues resulting from integration.
•Contribution of the AES and RSA algorithms to the integration of IoT and Cloud technologies.
This work proposes an innovative infrastructure for a secure scenario which operates in a wireless-mobile 6G network for managing big data (BD) in smart buildings (SBs). Owing to the rapid growth of the telecommunication field, new challenges arise. Furthermore, a new type of wireless network infrastructure, the sixth generation (6G), provides all the benefits of its past versions and also improves on some issues its predecessors had. In addition, technologies related to telecommunications, such as the Internet of Things, cloud computing (CC), and edge computing (EC), can operate through a 6G wireless network. Taking all of this into account, we propose a scenario that combines the functions of the Internet of Things with CC, EC, and BD in order to achieve a smart and secure environment. The major purpose of this work is to create a novel and secure cache decision system (CDS) in a wireless network that operates over an SB, which will offer users a safer and more efficient environment for browsing the Internet and for sharing and managing large-scale data in the fog. This CDS consists of two types of servers: one cloud server and one edge server. In order to arrive at our proposal, we study related cache scenario systems, which are listed, presented, and compared in this work.
An optimized curcumin-encapsulated PLGA nanoparticle formulation (nano-CUR6, i.e., NCUR6) enhances cellular internalization and shows improved therapeutic effects in metastatic ovarian (A2780CP) and breast (MDA-MB-231) cancer cells.
Curcumin, a natural polyphenolic compound, has shown promising chemopreventive and chemotherapeutic activities in cancer. Although phase I clinical trials have shown curcumin to be a safe drug even at high doses, poor bioavailability and suboptimal pharmacokinetics have largely moderated its anti-cancer activity in pre-clinical and clinical models. To improve its applicability in cancer therapy, we encapsulated curcumin in poly(lactic-co-glycolide) (PLGA, a biodegradable polymer) nanoparticles, in the presence of poly(vinyl alcohol) and poly(L-lysine) stabilizers, using a nano-precipitation technique. These curcumin nano-formulations were characterized for particle size, zeta potential, drug encapsulation, drug compatibility, and drug release. Encapsulated curcumin existed in a highly dispersed state in the PLGA core of the nanoparticles and exhibited good solid–solid compatibility. An optimized curcumin nano-formulation (nano-CUR6) demonstrated two- and sixfold increases in cellular uptake in cisplatin-resistant A2780CP ovarian and metastatic MDA-MB-231 breast cancer cells, respectively, compared to free curcumin. In these cells, nano-CUR6 showed improved anti-cancer potential in cell proliferation and clonogenic assays compared to free curcumin. This effect was correlated with enhanced apoptosis induced by the nano-CUR6 formulation. Herein, we have also shown the antibody conjugation compatibility of our PLGA-NP formulation. The results of this study suggest that the therapeutic efficacy of curcumin may be enhanced by such PLGA nanoparticle formulations; furthermore, tumor-specific targeted delivery of curcumin is made feasible by coupling an anti-cancer antibody to the NPs.
The integration of Network in Box (NIB) with 6G enables decentralized spatial crowdsourcing in industrial automation, but it threatens the security of tasks and answers and also leads to the leakage of sensing nodes' locations. To address these problems, we propose a secure decentralized spatial crowdsourcing scheme for 6G-enabled Network in Box (DSC-NIB). Using DSC-NIB, the control station and sensing nodes can gather and transmit information on the blockchain using NIB, without depending on a third party. The control station shares an encrypted location-strategy parameter set to negotiate session keys and a group key with sensing nodes whose locations satisfy the location strategy, while ensuring the privacy of the sensing nodes' locations. For the security of tasks and answers, we leverage the Counter with CBC-MAC (CCM) authenticated encryption mechanism to provide confidentiality and integrity. Furthermore, we analyze the security of the proposed DSC-NIB. Compared with existing approaches, the performance is improved by 30% to 50%.
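The CCM mechanism mentioned above produces a single ciphertext that covers both goals: the payload is encrypted, and a tag authenticates it together with any associated data. A minimal sketch using the third-party `cryptography` package's AESCCM primitive; the task payload, associated data, and key handling here are invented for illustration and are not the paper's protocol:

```python
# Sketch of CCM authenticated encryption for protecting tasks/answers.
# Requires the third-party `cryptography` package (pip install cryptography).
import os
from cryptography.hazmat.primitives.ciphers.aead import AESCCM

key = AESCCM.generate_key(bit_length=128)  # in DSC-NIB this would be a negotiated session key
aesccm = AESCCM(key)
nonce = os.urandom(13)                     # CCM nonce: 7-13 bytes, must never repeat per key

task = b"sense PM2.5 at zone 7"            # hypothetical task payload
aad = b"task-id:42"                        # authenticated but not encrypted (e.g., routing info)

ciphertext = aesccm.encrypt(nonce, task, aad)       # confidentiality + integrity tag in one pass
recovered = aesccm.decrypt(nonce, ciphertext, aad)  # raises InvalidTag if anything was tampered with
print(recovered == task)
```

Any modification of the ciphertext, nonce, or associated data makes `decrypt` raise `InvalidTag` rather than return garbled plaintext, which is what makes CCM suitable where both secrecy and integrity of tasks and answers matter.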
With the application of wireless sensor networks (WSNs) in the healthcare field, online sharing of medical data has attracted more and more attention. However, wearable sensor nodes are limited in energy, storage space, and data processing capacity, which largely restricts their deployment in resource-demanding application scenarios. Fortunately, cloud storage services can enrich the capabilities of wearable sensors and provide an effective method for people to share data within a group. However, as medical data directly relates to patients' health and privacy, ensuring the integrity and privacy of medical records stored on cloud servers becomes a key issue to be urgently solved. Many public data auditing schemes have been put forward to address the above issues. Unfortunately, most of them have security vulnerabilities or poor functionality and performance. In this paper, we come up with a secure and efficient certificateless public auditing scheme for cloud-assisted medical WSNs, which not only supports dynamic data sharing and privacy protection but also achieves efficient group user revocation. Security analysis and performance evaluation demonstrate that our scheme significantly reduces the total computation cost while achieving a higher security level. Compared with other related schemes, our new proposal is more suitable for group user data sharing in cloud-assisted medical WSNs.
Distributed denial of service (DDoS) attacks are a common and very severe threat to computing technologies such as Cloud, IoT, and Blockchain because of the disruption they cause to the services provided. There are many different types of DDoS attacks, each with a unique action, making it difficult for network monitoring and control systems to identify and prevent them. The objective of this research work is to explore and select a set of data to represent DDoS attack events and attack traffic information. A pre-processing phase is used to clean and transform the data, after which a machine learning model for multi-class classification is generated. This is carried out to identify the various types of DDoS attacks. We have used the CIC dataset for the experiment, which contains all types of DDoS attacks and a huge number of records. Random Forest, Support Vector Machine, Naive Bayes, Decision Tree, XGBoost, and AdaBoost are the six machine learning algorithms employed in this research. From the results, AdaBoost achieves the best accuracy of 99.87% in 27.4 s of computation time. Naive Bayes has the fastest computing time (3.2 s) with 94.15% accuracy, whereas the Support Vector Machine, a lazy learner, has the slowest time (229 min 26 s for training and 0.2 s for prediction) and low accuracy (95.73%).
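The multi-class pipeline described above (clean features, fit a classifier, predict an attack class) can be illustrated with one of the listed algorithms, Naive Bayes, implemented from scratch. The two flow features, the sample values, and the class names below are invented for illustration and are not taken from the CIC dataset:

```python
import math
from collections import defaultdict

# Toy "flow" samples: (packets_per_sec, mean_packet_size) -> traffic class.
train_data = [
    ((9000.0, 60.0), "udp-flood"), ((9500.0, 64.0), "udp-flood"),
    ((8800.0, 70.0), "udp-flood"),
    ((300.0, 40.0), "syn-flood"), ((350.0, 44.0), "syn-flood"),
    ((280.0, 42.0), "syn-flood"),
    ((50.0, 900.0), "benign"), ((60.0, 1100.0), "benign"),
    ((45.0, 950.0), "benign"),
]

def fit(data):
    """Gaussian Naive Bayes: per-class mean/variance per feature, plus class priors."""
    grouped = defaultdict(list)
    for features, label in data:
        grouped[label].append(features)
    model = {}
    for label, rows in grouped.items():
        stats = []
        for i in range(len(rows[0])):
            col = [r[i] for r in rows]
            mean = sum(col) / len(col)
            var = sum((v - mean) ** 2 for v in col) / len(col) + 1e-6  # variance smoothing
            stats.append((mean, var))
        model[label] = (len(rows) / len(data), stats)
    return model

def predict(model, features):
    """Pick the class maximizing log prior + sum of per-feature log Gaussian likelihoods."""
    best_label, best_score = None, float("-inf")
    for label, (prior, stats) in model.items():
        score = math.log(prior)
        for x, (mean, var) in zip(features, stats):
            score += -0.5 * math.log(2 * math.pi * var) - (x - mean) ** 2 / (2 * var)
        if score > best_score:
            best_label, best_score = label, score
    return best_label

model = fit(train_data)
print(predict(model, (9100.0, 62.0)))  # → udp-flood (a high-rate, small-packet flow)
```

The same shape scales to the real setting: the feature tuples become the CIC flow statistics and the classifier is swapped for AdaBoost or Random Forest, with the speed/accuracy trade-offs the paper reports.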