•An AI algorithm for enhanced data encryption at the end and intermediate nodes of autonomous IoT systems is proposed.
•An AI access strategy is designed to reduce the calculation cost of data encryption.
•A data shared matrix is proposed to share the encrypted data and achieve the (k, n) threshold strategy.
Aiming at the security issues arising during multi-type data storage and transmission in autonomous Internet of Things (IoT) systems, this paper proposes an AI algorithm for enhanced data encryption used at the ends and the intermediate nodes of IoTs. The algorithm first constructs a three-dimensional Arnold transformation matrix for data-unit value encryption at the IoT end, and designs a quantum logic intelligent mapping that effectively diffuses the encrypted data units to reduce the linear correlation of the image data and improve the security of IoT edge data. Furthermore, the algorithm designs an AI access strategy for scrambling-sequence nodes and builds a random-access route for the elements of the scrambling sequence, which reduces the calculation cost and improves the operating efficiency of the IoT system at the ends and intermediate nodes. Finally, a data shared matrix is used to share the encrypted data and achieve the (k, n) threshold strategy. Experimental results show that the algorithm has high plaintext and key sensitivity and can effectively resist brute-force attacks, statistical analysis, and differential attacks. The algorithm provides an AI solution for secure data encryption at the ends and the intermediate nodes of autonomous IoT systems.
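The position-scrambling step can be illustrated, in simplified form, by the classical two-dimensional Arnold cat map; the sketch below is a hedged stand-in for the paper's three-dimensional Arnold matrix, showing only the general idea of a modular, exactly invertible permutation of data-unit positions.

```python
def arnold_scramble(block, rounds=1):
    """Apply the 2-D Arnold cat map (x, y) -> ((x + y) mod N, (x + 2y) mod N)
    to a square N x N block of data units."""
    n = len(block)
    for _ in range(rounds):
        out = [[0] * n for _ in range(n)]
        for x in range(n):
            for y in range(n):
                out[(x + y) % n][(x + 2 * y) % n] = block[x][y]
        block = out
    return block

def arnold_unscramble(block, rounds=1):
    """Invert the map using the inverse matrix: (u, v) -> ((2u - v) mod N, (v - u) mod N)."""
    n = len(block)
    for _ in range(rounds):
        out = [[0] * n for _ in range(n)]
        for x in range(n):
            for y in range(n):
                out[(2 * x - y) % n][(y - x) % n] = block[x][y]
        block = out
    return block
```

Because the transformation matrix has determinant 1, every scrambled block can be recovered exactly by the inverse map with the same number of rounds.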
•A depth-first search (DFS) framework is designed for ABC.
•Two novel search equations are introduced for the employed and onlooker bee phases, respectively.
•Our algorithm outperforms other ABC variants and non-ABC methods on many benchmark functions.
Inspired by the intelligent foraging behavior of honey bees, the artificial bee colony (ABC) algorithm, a swarm-based stochastic optimization method, has been shown to be very effective and efficient for solving optimization problems. However, since its solution search equation is good at exploration but poor at exploitation, ABC often suffers from slow convergence. To better balance the tradeoff between exploration and exploitation, in this paper we propose a depth-first search (DFS) framework. The key feature of the DFS framework is to allocate more computing resources to the food sources that are of better quality and easier to improve during evolution. We apply the DFS framework to ABC, GABC, and CABC, yielding DFSABC, DFSGABC, and DFSCABC, respectively. Experimental results on 22 benchmark functions show that the DFS framework can speed up convergence in most cases. To further improve performance, we introduce two novel solution search equations: the first incorporates the information of elite solutions and is applied in the employed bee phase, while the second not only exploits the information of the elite solutions but also employs the current best solution in the onlooker bee phase. Finally, the two novel search equations are combined with DFSABC to form a new ABC variant, named DFSABC_elite. Comparisons of DFSABC_elite with other ABC variants and some non-ABC methods demonstrate that DFSABC_elite is significantly better than the compared algorithms on most of the test functions in terms of solution quality, robustness, and convergence speed.
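The core allocation idea can be sketched as follows: instead of giving every food source a single trial per cycle, a source keeps receiving trials as long as it keeps improving. This is a hedged simplification, not the paper's DFS framework or its elite-guided search equations; the parameter values and perturbation rule (the basic ABC equation v_ij = x_ij + phi * (x_ij - x_kj)) are illustrative.

```python
import random

def sphere(x):
    """Classic benchmark objective: sum of squares, minimum 0 at the origin."""
    return sum(v * v for v in x)

def abc_dfs(f, dim=5, pop=10, iters=200, max_depth=5, seed=1):
    """ABC-style search with depth-first exploitation: a food source is
    perturbed repeatedly while it improves, then the search moves on."""
    rng = random.Random(seed)
    foods = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop)]
    fits = [f(x) for x in foods]
    for _ in range(iters):
        for i in range(pop):
            depth = 0
            while depth < max_depth:
                k = rng.randrange(pop)
                while k == i:
                    k = rng.randrange(pop)
                j = rng.randrange(dim)
                cand = list(foods[i])
                # basic ABC search equation on one randomly chosen dimension
                cand[j] += rng.uniform(-1, 1) * (foods[i][j] - foods[k][j])
                fc = f(cand)
                if fc < fits[i]:
                    foods[i], fits[i] = cand, fc
                    depth += 1   # improved: go deeper on the same source
                else:
                    break        # no improvement: move to the next source
    return min(fits)
```

The greedy "go deeper while improving" rule is the resource-allocation intuition; the actual framework also weighs source quality when distributing trials.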
This note presents further results based on the recent paper J. Liang, H. Chen, and J. Lam, "An improved criterion for controllability of Boolean control networks," IEEE Trans. Autom. Control, vol. 62, no. 11, pp. 6012-6018, Nov. 2017. After some optimizations, the conventional method can be more efficient than the method used in the above paper. We also propose an improved method, combining the well-known Tarjan's algorithm with a depth-first search technique, for the controllability analysis of Boolean control networks (BCNs). As a result, the computational complexity will not exceed <inline-formula><tex-math notation="LaTeX">O(N^2)</tex-math></inline-formula> with <inline-formula><tex-math notation="LaTeX">N=2^n</tex-math></inline-formula>, where <inline-formula><tex-math notation="LaTeX">n</tex-math></inline-formula> is the number of state-variables in a BCN.
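Since controllability of a BCN amounts to every state being reachable from every other state under admissible controls, i.e., strong connectivity of the state-transition graph, the DFS-based part of the check can be sketched with two linear-time passes, one on the graph and one on its reverse (Tarjan's SCC algorithm itself is not reproduced here; the adjacency representation is an assumption).

```python
def strongly_connected(adj):
    """Check strong connectivity of a state-transition graph given as an
    adjacency list (state u -> list of states reachable in one step under
    some control). Runs two DFS passes in O(N + E) time."""
    n = len(adj)

    def all_reachable(graph):
        # iterative DFS from state 0
        seen = [False] * n
        seen[0] = True
        stack = [0]
        while stack:
            u = stack.pop()
            for v in graph[u]:
                if not seen[v]:
                    seen[v] = True
                    stack.append(v)
        return all(seen)

    radj = [[] for _ in range(n)]          # reversed graph
    for u, vs in enumerate(adj):
        for v in vs:
            radj[v].append(u)
    # strongly connected iff 0 reaches all states and all states reach 0
    return all_reachable(adj) and all_reachable(radj)
```

For a BCN with n state variables the graph has N = 2^n states, which is why even linear-time graph algorithms dominate the overall complexity.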
This paper presents a Cooperative Particle Swarm Optimizer with a Depth-First Search Strategy (DFS-CPSO), which has better search capability than the classical Particle Swarm Optimizer (PSO) in solving multimodal optimization problems. To improve the quality of information exchange, the depth-first search (DFS) strategy is hybridized with Cooperative Particle Swarm Optimization (CPSO), which makes information transfer more effective and generates better-quality solutions. Specifically, the DFS strategy enables different components of the solution vector to exchange information separately with PSO and increases the diversity of the population, so that the information of solution components can be preserved over multiple iterations in CPSO. Confirmatory experiments are performed to prove the effectiveness of applying the DFS strategy to CPSO. The comparative results demonstrate the superior performance of DFS-CPSO over CPSO and other advanced methods in solving high-dimensional multimodal functions.
COVID-19 has now spread all over the world and places a huge burden on public health and the world economy. Drug repositioning has become a promising treatment strategy in the COVID-19 crisis because it can shorten the drug development process, reduce pharmaceutical costs, and reposition approved drugs. Existing computational methods focus only on a single type of information, such as drug and virus similarity or drug–virus network features, which is not sufficient to predict potential drugs. In this paper, a sequence-combined attentive network embedding model, SANE, is proposed for identifying drugs based on sequence features and network features. On the one hand, drug SMILES and virus sequence features are extracted by an encoder–decoder in SANE as initial node embeddings in the drug–virus network. On the other hand, SANE obtains fields for each node by attention-based depth-first search (DFS) to reduce noise and improve efficiency in representation learning, and adopts a bottom-up aggregation strategy to learn node network representations from the selected fields. Finally, a feed-forward neural network is used for classification. Experimental results show that SANE achieves 81.98% accuracy and an AUC value of 0.8961, outperforming state-of-the-art baselines. A further case study on COVID-19 indicates that SANE has strong predictive ability, since 25 of the top 40 (62.5%) drugs are verified by authoritative datasets and the literature. Therefore, SANE is a powerful tool for repositioning drugs for COVID-19 and provides a new perspective for drug repositioning.
•SANE addresses the drug repositioning task from a drug–virus network perspective and is the first to design an attention-based pre-depth-first-search network embedding method to identify potential drugs against COVID-19.
•SANE integrates basic drug and virus sequence information, including drug SMILES sequences and virus RNA sequences, into the attentive network embedding model as node attributes.
•SANE adopts attention-based depth-first search (DFS) to reduce redundancy and noise in the process of learning node representations and to improve model efficiency.
•SANE is an easily trained end-to-end model and can be applied to any molecular network.
A new theoretical model and its computational implementation in two dimensions (2D) for the study of continuum percolation phenomena is presented. The aim was the development of a model with inherent similarity to lattice percolation. The physical medium is simulated as an (infinite) grid comprising representative surface elements (RSEs). Assuming the medium's homogeneity, the RSEs' average propagation probability can be interpreted and generalized as the occupation probability for the infinite medium. The RSEs result from an iterative Monte Carlo process in which relatively small samples are created and their propagation ability is checked individually from top to bottom. Propagation in the actual physical medium takes place when the calculated probability (p) is higher than the critical propagation probability (pc≈0.5927). The proposed method treats the low-dimensional material system as a 2D infinite homogenized medium, which can be further reduced to a mapping onto a square lattice with site occupation. The proposed numerical algorithm considers the particles in the RSE as digitized using sites (pixels) without contacts. Following the digitization procedure, traditional computational methods such as depth-first search are used to detect possible propagation paths in the randomly selected square samples. To validate the theoretical model and the algorithm, problems known from the literature were used, and it was found that, regardless of microstructure, at the critical concentration Φc the percolation probability on the RSE converges to the anticipated pc≈0.5927 value. In addition, the results obtained from the proposed methodology compare very well with available predictions in the literature.
New results are reported covering a wide range of particle geometries (circular, elliptical, rectangular) and surface fractions in matrix-filler or matrix-fillers systems, proving the robustness and applicability of the proposed methodology.
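The top-to-bottom propagation check on a digitized square sample is a standard site-percolation test, and the DFS step can be sketched as follows. This is a generic check on a 4-connected lattice, not the paper's full RSE construction.

```python
def percolates(grid):
    """Depth-first search from every occupied site in the top row; the sample
    propagates if any occupied site in the bottom row is reached. `grid` is a
    square list of lists with truthy entries marking occupied sites-pixels."""
    n = len(grid)
    seen = [[False] * n for _ in range(n)]
    stack = [(0, c) for c in range(n) if grid[0][c]]
    for _, c in stack:
        seen[0][c] = True
    while stack:
        r, c = stack.pop()
        if r == n - 1:           # reached the bottom row: propagation path found
            return True
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if 0 <= rr < n and 0 <= cc < n and grid[rr][cc] and not seen[rr][cc]:
                seen[rr][cc] = True
                stack.append((rr, cc))
    return False
```

Averaging this boolean outcome over many randomly occupied samples estimates the propagation probability, which for the square site lattice crosses 1/2 near the known threshold pc≈0.5927.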
•A new computational method is developed in 2D to study continuum percolation
•Tunnel effect was taken into account in the modeling approach
•The proposed methodology compares well with available predictions in the literature
•The results cover different particles and surface fractions in matrix-filler systems
•The method is computationally affordable, requiring moderate computational resources
Reaction mechanisms are at the core of understanding reaction systems and designing high-performance catalysts. A complex reaction system often involves various species and elementary reactions, posing a great challenge to determining the reaction mechanism. Here, we propose a scheme to automatically generate reaction intermediates and elementary reactions to construct a complete reaction network represented by graph theory, and employ a depth-first search algorithm in the scheme to prune the reaction network and reduce its complexity. With this scheme, microkinetic simulations of CO2 hydrogenation on Pd2Cu, using barriers predicted with the linear thermodynamics–kinetics relations, were performed on the network to determine the mechanism and the rate- and selectivity-controlling steps of CO2 hydrogenation to ethanol and methanol. Analysis shows that the simulated selectivity of ethanol and methanol agrees well with the experimental results. CO2 + H → COOH is the rate-controlling step, and CHOH + H → CH + H2O, CH2OH + H → CH2 + H2O, and CH2OH + H → CH3OH dominate the ethanol selectivity. Both ethanol and methanol are generated via multiple reaction pathway mechanisms. Investigations of the pruned networks show that quantitatively correct results can be obtained from the pruned or pseudocomplete reaction network, as long as the key pathways are embodied in the network. The 94% ethanol selectivity of the complete network can be reproduced with a pruned network composed of 60 elementary steps, compared to 176 steps in the complete network. The present work combines graph-theory representation, a depth-first search algorithm, linear thermodynamics–kinetics relations, and microkinetic simulations to approach complicated heterogeneous reaction systems and exemplifies their comprehensive roles in exploring complex reaction networks.
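How a depth-first search enumerates pathways in a graph-represented reaction network can be sketched minimally as below; the species and connectivity form an illustrative toy fragment, not the paper's actual CO2-hydrogenation network on Pd2Cu.

```python
def find_pathways(network, start, goal, path=None):
    """Depth-first enumeration of simple reaction pathways from `start` to
    `goal`. `network` maps each species to the species it can convert into;
    visited species are excluded to avoid cycling through intermediates."""
    path = (path or []) + [start]
    if start == goal:
        return [path]
    routes = []
    for nxt in network.get(start, []):
        if nxt not in path:
            routes += find_pathways(network, nxt, goal, path)
    return routes

# Toy CO2-to-methanol fragment (hypothetical connectivity, for illustration only).
net = {
    "CO2": ["COOH"],
    "COOH": ["CO"],
    "CO": ["CHO"],
    "CHO": ["CHOH", "CH2O"],
    "CHOH": ["CH3OH"],
    "CH2O": ["CH3OH"],
}
```

Pruning then amounts to discarding pathways (or the steps unique to them) whose predicted barriers make their contribution negligible, while keeping the key pathways that carry the flux.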
Choosing a good computer means matching it to one's needs and available budget, since this is tied to the hardware specification inside it. Many laypeople still lack knowledge of the components of a custom-built computer. This can lead to suboptimal choices of a custom-built computer's specification given the buyer's needs and available budget. A system is therefore needed that can help the public by providing information on the specification (components) of a custom-built computer matched to a prospective buyer's needs and budget. The application was developed by directly observing the purchase process for custom-built computers in computer shops, and then collecting data on the computer hardware to be used as custom-build specifications. A backtracking algorithm with a depth-first search technique and a price bounding function was chosen for composing the computer hardware specification. Testing was carried out on 4 hardware components for the Office package with a budget limit of Rp 5,000,000, yielding an output priced at Rp 4,978,000 with a performance index of 5482; for the Graphics/Gaming package, testing on 5 hardware components with a budget limit of Rp 8,000,000 yielded two output options priced at Rp 7,973,000 and Rp 8,000,000 with indexes of 12749 and 12373, respectively. Both tests produced hardware combinations that did not exceed the entered budget limit.
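The backtracking depth-first search with a price bounding function can be sketched as follows; the component names, prices, and performance indexes below are hypothetical stand-ins, not the study's actual catalog data.

```python
def best_build(components, budget):
    """Backtracking DFS over component choices. `components` is a list of
    option lists, each option a (name, price, performance_index) tuple.
    Branches whose running price exceeds the budget are pruned (the bounding
    function). Returns (price, total index, picks) of the best build found."""
    best = (0, 0, [])
    n = len(components)

    def dfs(i, price, perf, picks):
        nonlocal best
        if price > budget:
            return                      # bounding function: prune this branch
        if i == n:
            if perf > best[1]:          # complete build: keep if best so far
                best = (price, perf, list(picks))
            return
        for name, cost, index in components[i]:
            picks.append(name)
            dfs(i + 1, price + cost, perf + index, picks)
            picks.pop()                 # backtrack

    dfs(0, 0, 0, [])
    return best

# Hypothetical catalog: one option list per component slot.
parts = [
    [("cpu-a", 1500, 900), ("cpu-b", 2500, 1500)],
    [("ram-a", 500, 300), ("ram-b", 900, 500)],
    [("ssd-a", 700, 400), ("ssd-b", 1200, 700)],
]
```

With a budget of 4000 the search skips every over-budget branch and returns the highest-index combination that fits, mirroring how the application stays under the entered budget limit.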
•A novel and efficient semi-external DFS algorithm, EP-DFS, is presented.
•EP-DFS requires simpler CPU calculation and less memory space.
•A novel index is devised to reduce disk random accesses.
•Extensive experiments are conducted on both real and synthetic datasets.
As graphs grow in size, many real-world graphs become difficult to load into the primary memory of a computer. Thus, computing depth-first search (DFS) results (i.e., a depth-first order or DFS-Tree) under the semi-external memory model is important to investigate. Semi-external algorithms assume that the primary memory can at least hold a spanning tree T of a graph G, and gradually restructure T into a DFS-Tree, which is nontrivial. In this paper, we present a comprehensive study of the semi-external DFS problem. Based on a theoretical analysis of the problem, we introduce a new semi-external DFS algorithm called EP-DFS with a lightweight index, N+-index. Unlike traditional algorithms, we focus on addressing this complex problem efficiently with fewer I/Os, simpler CPU calculations (implementation-friendly), and fewer random I/O accesses (key to efficiency). Extensive experimental evaluations are performed on both synthetic and real graphs, and the results confirm that the proposed EP-DFS algorithm markedly outperforms existing algorithms.
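For reference, the target structure that semi-external algorithms maintain, a DFS-Tree (parent pointers) together with a depth-first order, can be computed directly when the whole graph fits in memory; EP-DFS and its N+-index, which avoid loading the full edge set, are not reproduced in this sketch.

```python
def dfs_tree(adj, root=0):
    """Compute a DFS-Tree (node -> parent) and the depth-first visit order of
    the component containing `root`. `adj` maps each node to its neighbors."""
    parent = {root: None}
    order = []

    def visit(u):
        order.append(u)
        for v in adj[u]:
            if v not in parent:     # tree edge: v discovered from u
                parent[v] = u
                visit(v)

    visit(root)
    return parent, order
```

A semi-external algorithm starts from an arbitrary spanning tree held in memory and, by streaming the remaining edges from disk, restructures it until no non-tree edge violates the DFS-Tree property.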
•We limit the gap constraints to match only weak characters, avoiding noise patterns.
•We propose a complete and efficient algorithm, NWP-Miner.
•NWP-Miner employs depth-first search and backtracking strategies to calculate support.
•NWP-Miner has better performance than other competitive algorithms.
•NWP-Miner discovers more meaningful patterns and filters out noise patterns.
Nonoverlapping sequential pattern mining (SPM) is a type of SPM with gap constraints that can mine valuable information in sequences. One of the disadvantages of nonoverlapping SPM is that any character can match within the gap constraints. Hence, there can be a significant difference between the trend of a pattern and the trends of its occurrences. To tackle this issue, we propose nonoverlapping weak-gap sequential pattern (NWP) mining, where characters are divided into two types: weak and strong. This allows discovering frequent patterns more accurately by limiting the gap constraints to match only weak characters. To discover NWPs, we propose NWP-Miner, which involves two key steps: support calculation and candidate pattern generation. To efficiently calculate the support of candidate patterns, depth-first search and backtracking strategies based on a simplified Nettree structure are adopted, which effectively reduce the time and space complexities of the algorithm. Moreover, a pattern join approach is applied to effectively reduce the number of candidate patterns. The experimental results show that NWP-Miner is more efficient than other competitive algorithms. More importantly, the case study on time series shows that NWP-Miner can effectively filter out noise patterns and discover more meaningful patterns. Algorithms and datasets can be downloaded from https://github.com/wuc567/Pattern-Mining/tree/master/NWP-Miner.
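A simplified sketch of gap-constrained support calculation by depth-first search with backtracking is shown below. This greedy nonoverlapping count, and its treatment of gaps without the weak/strong character distinction, is deliberately much simpler than NWP-Miner's Nettree-based procedure; it only illustrates the DFS-with-backtracking core.

```python
def occurrence(seq, pat, gap, used, i=0, prev=None):
    """DFS with backtracking for one occurrence of `pat` in `seq` where the
    number of skipped positions between consecutive matches lies in
    [gap[0], gap[1]] and no position in `used` is reused.
    Returns the list of matched positions, or None if no occurrence exists."""
    if i == len(pat):
        return []
    lo = 0 if prev is None else prev + 1 + gap[0]
    hi = len(seq) - 1 if prev is None else prev + 1 + gap[1]
    for pos in range(lo, min(hi, len(seq) - 1) + 1):
        if pos not in used and seq[pos] == pat[i]:
            rest = occurrence(seq, pat, gap, used, i + 1, pos)
            if rest is not None:
                return [pos] + rest     # this branch succeeded
            # otherwise backtrack and try the next candidate position
    return None

def support(seq, pat, gap=(0, 2)):
    """Nonoverlapping support: repeatedly extract occurrences whose positions
    do not overlap previously used ones."""
    used, count = set(), 0
    while True:
        occ = occurrence(seq, pat, gap, used)
        if occ is None:
            return count
        used.update(occ)
        count += 1
```

Each failed branch backtracks to try the next candidate position, which is the same search discipline the Nettree-based calculation organizes far more efficiently.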