A hanok is a traditional Korean house built with wood as the main structural material. It is constructed using cross- and unidirectional joint techniques without the use of steel. A hanok is composed of vertical and horizontal members, with columns being the most important vertical members and the Daedeulbo being the most important horizontal member. As a cultural heritage structure, a hanok often deforms as its wood is damaged over the years; a building beginning to lean is a typical example. Depending on the extent of the damage, hanoks are repaired through partial or complete dismantling, but the same phenomenon recurs in many hanoks even after repair. In this study, 69 hanoks with well-documented records were selected to build a building column-arrangement DB, a column-movement DB, and a building-attribute DB. The constructed DBs were reduced to two dimensions with the UMAP algorithm, using the features of each element, and then clustered with the DBSCAN algorithm. With this method, the movement of a single column was analyzed individually, and the movements of two, three, and four columns were analyzed in groups, reflecting the characteristics of a hanok. As a result, similar patterns of column movement were found in hanoks with similar shapes. It was also possible to identify vulnerable locations according to the direction of column movement, and it was found that deterioration of the joint strength of horizontal members affects the movement of columns.
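The embedding-and-clustering step can be sketched as follows. This is a minimal illustration, not the paper's pipeline: the feature vectors are synthetic stand-ins for the hanok DBs, PCA stands in for UMAP (which requires the separate `umap-learn` package), and the DBSCAN parameters are illustrative.

```python
import numpy as np
from sklearn.decomposition import PCA   # stand-in for UMAP in this sketch
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(0)
# Synthetic stand-in for per-building feature vectors: two groups of
# "similar-shaped" hanoks with correspondingly similar column movement.
features = np.vstack([
    rng.normal(0.0, 0.05, size=(35, 8)),
    rng.normal(1.0, 0.05, size=(34, 8)),
])

embedding = PCA(n_components=2).fit_transform(features)   # paper uses UMAP
labels = DBSCAN(eps=0.3, min_samples=5).fit_predict(embedding)
print(len(set(labels.tolist()) - {-1}))   # 2 movement-pattern clusters
```

Buildings that land in the same cluster of the 2D embedding are the ones whose column-movement patterns can then be compared group-wise.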
The problem in which multiple Pareto solution sets correspond to the same Pareto front is called a multimodal multi-objective optimization problem. Finding all Pareto solution sets in such a problem can provide decision makers with more convenient and accurate choices. However, traditional multi-objective optimization algorithms often ignore the distribution of solutions in the decision space when solving such problems, resulting in poor diversity of the Pareto solution sets. To address this, a two-stage search algorithm framework is proposed. The framework divides the optimization process into two parts, global search and local search, to balance the search ability of the algorithm. During the global search, the algorithm locates as many approximate locations of optimal solutions as possible, providing a good population distribution for the subsequent local search. During the local search, a DBSCAN clustering method with an adaptive neighborhood radius divides the population into several subpopulations, so as to enhance the local search ability of the algorithm. At the same time, an individual selection mechanism based on the farthest-candidate approach in two spaces is proposed to preserve the diversity of the population in both the objective space and the decision space. The algorithm is compared with five state-of-the-art algorithms on 22 multimodal multi-objective optimization test functions. The experimental results indicate that the proposed algorithm can find more Pareto solution sets while maintaining the diversity of solutions in the objective space.
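The adaptive-radius clustering used to form subpopulations can be sketched as follows. The abstract does not give the paper's exact adaptive rule, so this sketch uses a common k-distance heuristic (the median distance to the `min_samples`-th nearest neighbour, with a small slack factor) on stand-in populations.

```python
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.neighbors import NearestNeighbors

def adaptive_dbscan(points, min_samples=5):
    # Derive the neighbourhood radius from the data itself: the median
    # distance to the min_samples-th nearest neighbour, with a small
    # slack factor. This is a common k-distance heuristic, not
    # necessarily the paper's exact adaptive rule.
    nn = NearestNeighbors(n_neighbors=min_samples).fit(points)
    dists, _ = nn.kneighbors(points)        # column -1: k-th NN distance
    eps = 1.1 * float(np.median(dists[:, -1]))
    return DBSCAN(eps=eps, min_samples=min_samples).fit_predict(points)

# Two well-separated stand-in subpopulations on regular grids.
grid = np.array([(0.1 * i, 0.1 * j) for i in range(5) for j in range(5)])
pop = np.vstack([grid, grid + 5.0])

labels = adaptive_dbscan(pop)
print(sorted(set(labels.tolist())))   # [0, 1]: one label per subpopulation
```

Because the radius is derived from the population itself, no manual `eps` tuning is needed as the population contracts over generations.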
Summary The ability to design smarter, more predictive healthcare solutions for use in the community (at work and at home) and in healthcare facilities has been greatly enhanced by recent developments in the Internet of Health Things (IoHT) and cyber-physical systems (CPS). The data collected by medical sensors are sent in large quantities to the fog gateway of IoHT networks and must be forwarded to the distant cloud for further analysis and processing. Sending all of these data over the IoHT network to the data center therefore imposes a significant burden on the network. This paper proposes a new lossless electroencephalogram (EEG) compression technique (NeLECoT) for fog computing-based IoHT networks. It encodes the patient's data at the fog gateway prior to sending it to the data center, thereby reducing the data's size. In the fog node, NeLECoT combines three efficient techniques: clustering based on density-based spatial clustering of applications with noise (DBSCAN), run-length encoding (RLE), and Huffman encoding (HE). The DBSCAN-based clustering separates a massive volume of captured data into small groups of closely related (or similar) data. RLE encodes the clustered EEG data, and the resulting file is encoded with HE. The fog gateway then transmits the encoded file to the cloud. Numerous simulation experiments were carried out, and the findings demonstrate that the proposed NeLECoT outperforms competing techniques in terms of transmitted data size, compression ratio, compression power, compression time, decompression time, and average compression power.
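The RLE and HE stages of the pipeline can be sketched in plain Python. The DBSCAN clustering stage is omitted, and the input below is a hypothetical, already-clustered EEG sample sequence; note that a complete codec would also encode the run lengths, while this sketch codes only the run symbols.

```python
import heapq
from collections import Counter
from itertools import groupby

def rle(data):
    # Run-length encode: (symbol, run length) pairs.
    return [(sym, len(list(run))) for sym, run in groupby(data)]

def huffman_codes(symbols):
    # Build a Huffman code table from symbol frequencies.
    freq = Counter(symbols)
    if len(freq) == 1:                      # degenerate single-symbol case
        return {next(iter(freq)): "0"}
    heap = [[w, i, {s: ""}] for i, (s, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    next_id = len(heap)
    while len(heap) > 1:                    # merge the two lightest subtrees
        lo = heapq.heappop(heap)
        hi = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in lo[2].items()}
        merged.update({s: "1" + c for s, c in hi[2].items()})
        heapq.heappush(heap, [lo[0] + hi[0], next_id, merged])
        next_id += 1
    return heap[0][2]

# Hypothetical pre-clustered EEG samples (the DBSCAN stage is omitted).
eeg = [12, 12, 12, 12, 13, 13, 13, 12, 12, 14]
runs = rle(eeg)                  # [(12, 4), (13, 3), (12, 2), (14, 1)]
codes = huffman_codes([sym for sym, _ in runs])
bitstream = "".join(codes[sym] for sym, n in runs)
print(runs, codes)
```

Clustering similar samples first lengthens the runs that RLE sees, which is what makes the subsequent Huffman pass effective.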
Correct individual tree segmentation of a forest is necessary for extracting additional tree information, such as tree height, crown width, and other tree parameters. With the development of LiDAR technology, individual tree segmentation based on point cloud data has become a focus of the research community. In this work, the research area is located in an underground coal mine in Shenmu City, Shaanxi Province, China. Vegetation information with and without leaves in this coal mining area was obtained with airborne LiDAR to conduct the research. In this study, we propose a hybrid clustering technique, combining DBSCAN and K-means, for segmenting individual trees from airborne LiDAR point cloud data. First, the point cloud data are denoised and filtered. Then, the pre-processed data are projected onto the XOY plane for DBSCAN clustering. The number and coordinates of the cluster centers are obtained and used as input for the K-means clustering algorithm. Finally, the results of individual tree segmentation of the forest in the mining area are obtained. The simulation results and analysis show that the new method proposed in this paper outperforms other methods for forest segmentation in the mining area. This provides effective technical support and a data reference for the study of forests in mining areas.
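The handoff from DBSCAN to K-means can be sketched as follows, with synthetic 2D points standing in for LiDAR returns projected onto the XOY plane; the `eps` and `min_samples` values are illustrative, not the paper's.

```python
import numpy as np
from sklearn.cluster import DBSCAN, KMeans

rng = np.random.default_rng(0)
# Synthetic stand-in for crown points projected onto the XOY plane:
# three tree crowns of 60 points each.
trees = np.vstack([rng.normal(c, 0.2, size=(60, 2))
                   for c in [(0, 0), (4, 0), (2, 4)]])

# DBSCAN supplies both the number of clusters and their centers ...
db = DBSCAN(eps=0.5, min_samples=5).fit(trees)
centers = np.array([trees[db.labels_ == k].mean(axis=0)
                    for k in sorted(set(db.labels_.tolist()) - {-1})])

# ... which seed the K-means refinement step.
km = KMeans(n_clusters=len(centers), init=centers, n_init=1).fit(trees)
print(len(centers))   # 3 individual trees segmented
```

Seeding K-means this way removes its usual need to guess the number of trees in advance, which is the weak point of applying K-means alone.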
With the increasing use of mobile GPS (global positioning system) devices, a large volume of user trajectory data can be produced. In most existing work, trajectories are divided into a set of stops and moves. Stops represent the most important and meaningful part of a trajectory, and many data mining methods have been proposed to extract these locations. DBSCAN (density-based spatial clustering of applications with noise) is a classical density-based algorithm used to find high-density areas in space, and various derivatives of this algorithm have been proposed to find the stops in trajectories. However, most of these methods require a manually set threshold, such as a speed threshold, for each feature variable. In our research, we first define a new concept of move ability. Second, by introducing the theory of data fields and taking move ability into consideration, we construct a new, comprehensive, hybrid feature-based density measurement method that considers both temporal and spatial properties. Finally, an improved DBSCAN algorithm is proposed using this new density measurement. In the experimental section, the effectiveness and efficiency of our method are validated on real datasets. Compared with classical density-based clustering algorithms, our experimental results show the efficiency of the proposed method.
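One plausible reading of the data-field density idea can be sketched as follows. This is an illustration of the concept only, not the paper's actual formula, and the kernel widths `sigma_s` and `sigma_t` are hypothetical.

```python
import numpy as np

def data_field_potential(points, times, sigma_s=30.0, sigma_t=60.0):
    # Each point contributes a Gaussian potential that decays with both
    # spatial distance and time difference, so points recorded during a
    # stop (close in space AND time) accumulate a high potential while
    # fast-moving points do not.
    d_s = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    d_t = np.abs(times[:, None] - times[None, :])
    return np.exp(-(d_s / sigma_s) ** 2 - (d_t / sigma_t) ** 2).sum(axis=1)

# Toy trajectory: four samples from a stop, then two from a fast move.
pts = np.array([[0, 0], [1, 1], [2, 0], [1, 2], [200, 200], [400, 400]], float)
ts = np.array([0.0, 10.0, 20.0, 30.0, 300.0, 600.0])
pot = data_field_potential(pts, ts)
print(pot.argmax() < 4)   # True: a stop point has the highest potential
```

A density defined this way replaces per-feature thresholds (e.g. a speed cutoff) with a single continuous measure that DBSCAN-style core-point tests can use directly.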
Nowadays, communication networks are becoming increasingly complex. This paper demonstrates an effective method for the intelligent planning of network base stations (BSs). Various parameters, such as the BS coordinates (x, y), the collaboration of multiple types of BS, and the density of BS construction, are taken as design parameters for BS placement. We construct the objective function from the lowest total cost and a minimum coverage of 90% of the total BS workload. To solve the site-planning problem with a large data volume and a mixed placement of multiple BS types, we propose a new, practical three-step model for BS site planning: (I) roughly selecting alternative BS coordinates using the DBSCAN algorithm; (II) correcting and further refining the alternative BS coordinates using the K-means algorithm; (III) determining the optimal BS construction solution that meets the requirements using a simulated annealing algorithm (SAA). Real data from a 2500 × 2500 area were used for the simulation test. The simulation results show that the BS placement covers 90.03% of the workload, confirming that the proposed method can handle site planning for data of large orders of magnitude and use a mix of BS types to achieve the best economics for the demand. This paper provides basic support for future research on network site optimization.
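Steps I and II of the three-step model can be sketched as follows (step III, the simulated-annealing selection, is omitted from this sketch). The demand points, `eps`, and `min_samples` below are hypothetical.

```python
import numpy as np
from sklearn.cluster import DBSCAN, KMeans

rng = np.random.default_rng(2)
# Hypothetical user-demand points in the 2500 x 2500 area, concentrated
# around four hotspots that base stations should serve.
hotspots = np.array([(500, 500), (2000, 500), (500, 2000), (2000, 2000)])
demand = np.vstack([rng.normal(h, 60, size=(100, 2)) for h in hotspots])

# Step I: DBSCAN roughly selects alternative BS coordinates.
db = DBSCAN(eps=100, min_samples=5).fit(demand)
rough = np.array([demand[db.labels_ == k].mean(axis=0)
                  for k in sorted(set(db.labels_.tolist()) - {-1})])

# Step II: K-means corrects and refines the rough coordinates.
refined = KMeans(n_clusters=len(rough), init=rough, n_init=1).fit(demand)
print(len(rough))   # 4 candidate BS sites
```

Step III would then search over subsets and types of these candidate sites, accepting or rejecting moves by the annealing criterion until the cost and 90%-workload constraints are met.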
The widespread deployment of convolutional neural networks is hindered by their intricate structure and voluminous parameters, which consume significant processing resources during both training and inference. This study proposes a novel approach that treats convolutional kernels as tensors, uses suitable a priori metrics to gauge their effectiveness, and employs a clustering algorithm to eliminate redundant convolutional kernels for network pruning. Specifically, we measure the convolutional kernels with appropriate a priori metrics and cluster them with density-based spatial clustering of applications with noise (DBSCAN), so that the cluster centroids become the retained convolution kernels. The experimental results show that the method can effectively lighten the network while maintaining high accuracy. At the same time, the method improves the efficiency of convolutional kernel utilization, thus reducing the computational resource consumption of the model. Empirical analyses conducted on datasets indicate that, in certain instances, the proposed pruning method outperforms established state-of-the-art methods.
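The pruning idea can be sketched as follows, with random vectors standing in for flattened 3×3 kernels and illustrative DBSCAN parameters; the paper's a priori effectiveness metrics are not specified in the abstract and are not modeled here.

```python
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(3)
# Hypothetical 3x3 conv kernels (flattened to 9-vectors): two groups of
# near-duplicates plus one distinctive kernel that should survive alone.
base_a, base_b = rng.normal(size=(2, 9))
kernels = np.vstack([
    base_a + rng.normal(0, 0.01, size=(6, 9)),
    base_b + rng.normal(0, 0.01, size=(6, 9)),
    rng.normal(size=(1, 9)) * 5,
])

labels = DBSCAN(eps=0.2, min_samples=3).fit_predict(kernels)

kept = []
for k in set(labels.tolist()):
    members = np.flatnonzero(labels == k)
    if k == -1:                      # noise kernels are distinctive: keep all
        kept.extend(members)
    else:                            # keep the member closest to the centroid
        centroid = kernels[members].mean(axis=0)
        d = np.linalg.norm(kernels[members] - centroid, axis=1)
        kept.append(members[np.argmin(d)])
print(len(kept))   # 13 kernels pruned down to 3
```

Each cluster of near-duplicate kernels is collapsed to a single representative, which is what reduces the parameter count without discarding any distinct feature detector.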