Abstract
Cigarette butts are a toxic residue with a serious impact on the environment. Recycling cigarette butts offers great advantages in conserving natural resources and solving environmental problems. Cellulose diacetate was obtained by the extraction, purification, and acid hydrolysis of cigarette butts and then blended with low-molecular-weight, partially hydrolyzed polyvinyl alcohol. The blend of cellulose diacetate and partially hydrolyzed polyvinyl alcohol was characterized by Fourier transform infrared spectroscopy and differential scanning calorimetry. Dynamic mechanical analysis was performed to evaluate the viscoelastic properties of the blends. The lap shear strength and the 180° peel strength of the adhesive were evaluated as a function of blend composition, and biodegradability in water was confirmed. The experimental results showed that the use of cellulose diacetate obtained from cigarette butts in biodegradable hot-melt adhesives can be of great help in solving the environmental problems caused by petroleum-based polymers and waste.
Convolutional neural network-based image processing research is being actively conducted for pathology image analysis. Because a convolutional neural network model requires a large amount of image data for training, active learning (AL) has been developed to enable efficient learning with a small amount of training data. However, existing studies have not specifically considered the characteristics of pathological data collected in the workplace. For various reasons, noisy patches can be selected instead of clean patches during AL, reducing its efficiency. This study proposes an effective AL method for cancer pathology that works robustly on noisy datasets.
Our proposed method to develop a robust AL approach for noisy histopathology datasets consists of the following three steps: 1) training a loss prediction module, 2) collecting predicted loss values, and 3) sampling data for labeling. This proposed method calculates the amount of information in unlabeled data as predicted loss values and removes noisy data based on predicted loss values to reduce the rate at which noisy data are selected from the unlabeled dataset. We identified a suitable threshold for optimizing the efficiency of AL through sensitivity analysis.
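The sampling step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name, the fixed noise threshold, and the use of a plain array of predicted losses (which would in practice come from the trained loss prediction module) are all assumptions.

```python
import numpy as np

def select_for_labeling(pred_losses, budget, noise_threshold):
    """Pick the most informative unlabeled samples while filtering likely noise.

    pred_losses:     predicted loss per unlabeled sample (from the loss module)
    budget:          number of samples to send for labeling
    noise_threshold: samples with predicted loss above this value fall in the
                     interval where noisy data are likely distributed and are
                     excluded from selection
    """
    pred_losses = np.asarray(pred_losses, dtype=float)
    # Drop samples whose predicted loss falls in the likely-noisy interval.
    clean_idx = np.flatnonzero(pred_losses <= noise_threshold)
    # Among the remaining pool, label the highest-loss (most informative) samples.
    order = clean_idx[np.argsort(pred_losses[clean_idx])[::-1]]
    return order[:budget].tolist()

picked = select_for_labeling([0.2, 0.9, 0.5, 3.1, 0.7],
                             budget=2, noise_threshold=1.0)
# The sample with loss 3.1 is skipped as likely noise; indices 1 and 4 win.
```

In the paper, the threshold itself is chosen by sensitivity analysis rather than fixed a priori.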
We compared the results obtained with the identified threshold with those of existing representative AL methods. In the final iteration, the proposed method achieved a performance of 91.7% on the noisy dataset and 92.4% on the clean dataset, resulting in a performance reduction of less than 1%. Concomitantly, the noise selection ratio averaged only 2.93% on each iteration.
The proposed AL method showed robust performance on datasets containing noisy data by avoiding data selection in predictive loss intervals where noisy data are likely to be distributed. The proposed method contributes to medical image analysis by screening data and producing a robust and effective classification model tailored for cancer pathology image processing in the workplace.
Nowcasting is an important technique for weather forecasting because sudden weather changes significantly affect human life. The encoding-forecasting model, a state-of-the-art architecture in the field of data-driven radar extrapolation, does not particularly focus on the latest data when forecasting natural phenomena. This paper proposes a weighted broadcasting method that emphasizes the latest time steps to improve nowcasting performance. This weighted broadcasting method allows the most recent rainfall patterns to have a greater impact on the forecasting network by extending the architecture of the existing encoding-forecasting model. Experimental results show that the proposed model is 1.74% and 2.20% better than the existing encoding-forecasting model in terms of mean absolute error and critical success index, respectively. In the case of heavy rainfall with an intensity of 30 mm/h or higher, the proposed model was more than 30% superior to the existing encoding-forecasting model. Therefore, applying the weighted broadcasting method, which explicitly places a high emphasis on the latest information, to the encoding-forecasting model is considered an improvement applicable to state-of-the-art implementations of data-driven radar-based precipitation nowcasting.
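The core idea of emphasizing recent time steps can be sketched as a simple weighting of the input frames before they are broadcast to the forecasting network. The geometric decay schedule below is an illustrative assumption; the paper's actual weighting scheme inside the encoding-forecasting architecture may differ.

```python
import numpy as np

def weighted_broadcast(frames, decay=0.8):
    """Emphasize the latest radar frames before feeding the forecasting network.

    frames: array of shape (T, H, W), oldest frame first
    decay:  per-step geometric decay; the most recent frame gets the largest weight
    """
    t = frames.shape[0]
    # weight[i] = decay ** (T-1-i), so the newest frame is weighted highest
    weights = decay ** np.arange(t - 1, -1, -1, dtype=float)
    weights /= weights.sum()                 # normalize the weights to sum to 1
    return frames * weights[:, None, None]   # broadcast the weights over H and W
```

With decay below 1, the newest frame dominates the weighted input, which is the intended effect of focusing the forecaster on the most recent rainfall pattern.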
This paper proposes practical weapon target assignment (WTA) algorithms for a defense system to counter multiple targets concentrated within a narrow area, such as low-altitude rocket threats or drone swarms. Since the probability of kill (PK) is greatly affected by heading errors between launchers and targets in this type of engagement, WTA problems must first be formulated considering heading error, to reflect more realistic engagement situations. Two WTA algorithms, a rotation-fixed strategy and a rotation strategy, are proposed based on this formulation. Moreover, we propose a method for determining launchers' initial orientation angles, based on a clustering approach, to further improve the engagement performance of the two algorithms. Numerical simulations were performed to demonstrate the effectiveness of the proposed methods.
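The clustering-based choice of initial orientation angles can be illustrated with a small k-means sketch: cluster the target positions and aim each launcher at a cluster centroid. Everything here is an assumption for illustration; in particular, the launchers are placed at the origin and the paper's actual clustering procedure and geometry may differ.

```python
import math
import random

def initial_orientations(targets, n_launchers, iters=20, seed=0):
    """Clustering sketch for launcher initial angles: run k-means over the
    2-D target positions, then point each launcher at a cluster centroid.
    Returns one heading angle (radians) per launcher."""
    rng = random.Random(seed)
    centers = rng.sample(targets, n_launchers)   # random initial centroids
    for _ in range(iters):
        # Assign each target to its nearest centroid.
        clusters = [[] for _ in centers]
        for x, y in targets:
            j = min(range(len(centers)),
                    key=lambda i: (x - centers[i][0]) ** 2 + (y - centers[i][1]) ** 2)
            clusters[j].append((x, y))
        # Move each centroid to the mean of its assigned targets.
        centers = [(sum(p[0] for p in c) / len(c), sum(p[1] for p in c) / len(c))
                   if c else centers[i] for i, c in enumerate(clusters)]
    # Heading from the origin (assumed launcher position) toward each centroid.
    return [math.atan2(cy, cx) for cx, cy in centers]
```

For two well-separated target clusters, the two returned headings point at the two cluster centroids, giving each launcher a small initial heading error to its assigned group.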
CNN-based image processing has been actively applied to histopathological analysis to detect and classify cancerous tumors automatically. However, CNN-based classifiers generally predict a label with overconfidence, which becomes a serious problem in the medical domain. The objective of this study is to propose a new training method, called MixPatch, designed to improve a CNN-based classifier by specifically addressing the prediction uncertainty problem, and to examine its effectiveness in improving diagnosis performance in the context of histopathological image analysis. MixPatch generates and uses a new sub-training dataset, which consists of mixed-patches and their predefined ground-truth labels, for every single mini-batch. Mixed-patches are generated from small clean patches confirmed by pathologists, while their ground-truth labels are defined using a proportion-based soft labeling method. Our results obtained using a large histopathological image dataset show that the proposed method performs better and alleviates overconfidence more effectively than any other method examined in the study. More specifically, our model achieved 97.06% accuracy, an increase of 1.6% to 12.18%, while achieving an expected calibration error of 0.76%, a decrease of 0.6% to 6.3%, over the other models. By specifically considering the mixed-region variation characteristics of histopathology images, MixPatch augments the extant mixed image methods for medical image analysis in which prediction uncertainty is a crucial issue. The proposed method provides a new way to systematically alleviate the overconfidence problem of CNN-based classifiers and improve their prediction accuracy, contributing toward more calibrated and reliable histopathology image analysis.
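The mixed-patch construction and its proportion-based soft label can be sketched as below. The grid layout and patch sizes are illustrative assumptions; the paper's actual mixing configuration may differ.

```python
import numpy as np

def mix_patch(patches, labels, n_classes, grid=2):
    """Tile grid*grid small clean patches into one mixed patch and build its
    proportion-based soft label.

    patches: list of grid*grid arrays, each of shape (h, w, c)
    labels:  integer class of each small patch
    """
    # Tile the small patches row by row into one large mixed patch.
    rows = [np.concatenate(patches[i * grid:(i + 1) * grid], axis=1)
            for i in range(grid)]
    mixed = np.concatenate(rows, axis=0)
    # Soft label = fraction of sub-patches belonging to each class.
    soft = np.bincount(labels, minlength=n_classes) / len(labels)
    return mixed, soft
```

For example, mixing two benign and two differently labeled tumor sub-patches yields the soft label (0.5, 0.25, 0.25), which directly encodes the class proportions of the mixed region instead of a single overconfident hard label.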
Life cycle assessment (LCA) and life cycle cost (LCC) are two primary methods used to assess the environmental and economic feasibility of building construction. An estimation of the building's life span is essential to carrying out these methods. However, given the diverse factors that affect a building's life span, it has typically been estimated based on the main structural type. Yet different buildings have different life spans, and simply assuming that all buildings with the same structural type follow an identical life span can cause serious estimation errors. In this study, we collected 1,812,700 records describing buildings built and demolished in South Korea, analysed the actual life span of each building, and developed a building life-span prediction model using deep learning and traditional machine learning. The prediction models examined in this study produced root mean square errors of 3.72–4.6 and coefficients of determination of 0.932–0.955. Among these models, a deep-learning-based prediction model was found to be the most powerful. As anticipated, the conventional method of determining a building's life expectancy using a discrete set of specific factors and associated assumptions of life span did not yield realistic results. This study demonstrates that an application of deep learning to the LCA and LCC of a building is a promising direction, effectively guiding business planning and critical decision making throughout the construction process.
•Actual life spans of buildings are vastly different from main-structure-based life spans.
•Computational models were trained to predict building life span using big data.
•The proposed computational approach is superior to the main-structure-based approach.
The Industry 4.0 era has begun with the development of artificial intelligence technology, and the realization of smart farms incorporating ICT is receiving great attention in the livestock industry. In particular, computer-vision-based quality management of livestock products and livestock operations represents a key technology of smart livestock farming. However, the insufficient number of livestock images available for model training and the severely imbalanced distribution of labels for recognizing specific defective states are major obstacles to related research and technology development. To overcome these problems, this study proposes combining oversampling and adversarial example generation techniques to effectively utilize scarce labeled data for successful defect detection. In addition, experiments comparing the performance and time cost of the applicable techniques were conducted. Through these experiments, we confirm the validity of the proposed methods and derive utilization strategies from the results.
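The two techniques combined in this study can be sketched minimally as follows. Both functions are illustrative assumptions: the FGSM-style perturbation stands in for the adversarial case generation (the gradient would come from the trained detector), and the oversampler duplicates minority-class samples by resampling with replacement.

```python
import numpy as np

def fgsm_perturb(x, grad, eps=0.03):
    """Adversarial-example sketch (FGSM-style): nudge the image along the sign
    of the loss gradient, keeping pixel values in [0, 1]."""
    return np.clip(x + eps * np.sign(grad), 0.0, 1.0)

def oversample_minority(images, labels, target_count, rng=None):
    """Random-oversampling sketch: resample each under-represented class with
    replacement until it reaches target_count samples."""
    rng = rng or np.random.default_rng(0)
    images, labels = np.asarray(images), np.asarray(labels)
    out_x, out_y = [images], [labels]
    for cls in np.unique(labels):
        idx = np.flatnonzero(labels == cls)
        deficit = target_count - idx.size
        if deficit > 0:
            pick = rng.choice(idx, size=deficit, replace=True)
            out_x.append(images[pick])
            out_y.append(labels[pick])
    return np.concatenate(out_x), np.concatenate(out_y)
```

In the combined strategy the study examines, oversampled minority (defect) images could then be perturbed with `fgsm_perturb` to add diversity beyond exact duplicates.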
Going beyond the user–item rating information, recent studies have utilized additional information to improve the performance of recommender systems. Graph neural network (GNN) based approaches are among the most common. However, existing models that utilize text data require substantial computing resources and have complex structures that make them difficult to deploy in real-world applications. In this research, we propose a new method, keyword-enhanced graph matrix completion (KGMC), which utilizes keyword-sharing relationships in user–item graphs. Our model has a simpler structure and requires fewer computing resources than existing models that utilize text data, yet it offers cross-domain transferability while providing an intuitive understanding of the inference results. KGMC consists of three steps: (1) keyword extraction from the review text, (2) subgraph extraction and keyword-enhanced subgraph construction, and (3) GNN-based rating prediction. We have conducted extensive experiments over eight benchmark datasets to examine the relative superiority of the proposed KGMC method compared to state-of-the-art baselines. Additional experiments and case studies have also been conducted to demonstrate the transferability as well as the keyword-based explainability of KGMC. Our findings highlight the practical advantages of our model for recommender systems and support its effectiveness in inductive graph-based link prediction.
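The first two steps can be sketched very roughly as below. The frequency-based keyword extractor and the tiny stopword list are illustrative stand-ins; KGMC's actual extraction and subgraph construction are more sophisticated.

```python
from collections import Counter

STOPWORDS = {"the", "a", "an", "and", "is", "it", "this", "of", "to"}

def extract_keywords(review, top_k=3):
    """Step 1 (sketch): take the most frequent non-stopword tokens as keywords."""
    tokens = [w.strip(".,!?").lower() for w in review.split()]
    counts = Counter(w for w in tokens if w and w not in STOPWORDS)
    return [w for w, _ in counts.most_common(top_k)]

def keyword_edges(reviews):
    """Step 2 (sketch): add an edge between user-item pairs whose reviews
    share at least one extracted keyword, enriching the rating subgraph.

    reviews: dict mapping (user, item) pairs to review text
    """
    kw = {pair: set(extract_keywords(text)) for pair, text in reviews.items()}
    pairs = list(kw)
    return [(a, b) for i, a in enumerate(pairs) for b in pairs[i + 1:]
            if kw[a] & kw[b]]
```

The keyword-sharing edges produced here are what augment the extracted rating subgraph before the GNN-based rating prediction in step 3.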
The Trimeresurus complex consists of diverse medically important venomous pit vipers that cause snakebite envenomation. Antivenoms, however, are in limited supply and are specific to only two of the many species across Asia. This study thus investigated the immunoreactivities of regional pit viper antivenoms toward selected Trimeresurus pit viper venoms and examined the neutralization of their hemotoxic activities. Trimeresurus albolabris Monovalent Antivenom (TaMAV, Thailand) exhibited a higher immunoreactivity than Hemato Bivalent Antivenom (HBAV, raised against Trimeresurus stejnegeri and Protobothrops mucrosquamatus, Taiwan) and Gloydius brevicaudus Monovalent Antivenom (GbMAV, China), attributed to its monovalent nature and conserved antigens in the Trimeresurus pit viper venoms. The venoms showed moderate-to-strong in vitro procoagulant and in vivo hemorrhagic effects consistent with hemotoxic envenomation, except for the Sri Lankan Trimeresurus trigonocephalus venom, which lacked hemorrhagic activity. TaMAV was able to differentially neutralize both the in vitro and in vivo hemotoxic effects of the venoms, with the lowest efficacy shown against the procoagulant effect of T. trigonocephalus venom. The findings suggest that TaMAV is a potentially useful treatment for envenomation caused by hetero-specific Trimeresurus pit vipers, in particular those in Southeast Asia and East Asia. Clinical studies are warranted to establish its spectrum of para-specific effectiveness, and dosages need to be tailored to the different species in the respective regions.
•Hemotoxic activities of ten Trimeresurus pit viper venoms were studied.
•Trimeresurus albolabris Monovalent Antivenom (TaMAV, Thailand) was examined.
•It was immunoreactive toward the venoms, suggesting conserved protein antigens.
•TaMAV also exhibited para-specific neutralization activity.
•In vitro procoagulant and in vivo hemorrhagic effects of venoms were neutralized.
In this paper, we propose a novel procedure for applying the comparable sales method to the automated price estimation of real estate, in particular apartments. Apartments are the most popular residential housing type in Korea. The price of a single apartment is influenced by many factors, making it hard to estimate accurately. Moreover, as an apartment is purchased as a residence, with a sizable amount of money, it is mostly traded infrequently; its past transaction price may therefore not be particularly helpful to the estimation after a certain period of time. For these reasons, the up-to-date price of an apartment is commonly estimated by certified appraisers, who typically rely on the comparable sales method (CSM). CSM requires comparable properties to be identified and used as references in estimating the current price of the property in question. In this research, we develop a procedure to systematically apply this method to the automated estimation of apartment prices and assess its applicability using nine years' real transaction data from the capital city and the most-populated province in South Korea, with multiple scenarios designed to reflect conditions of low and high fluctuations in housing prices. The results from extensive evaluations show that the proposed approach is superior to the traditional approach of relying on real estate professionals and also to a baseline machine learning approach.
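The comparable sales idea can be sketched as a nearest-neighbor estimate over recent transactions. The feature names, the Euclidean similarity metric, and the unit-price averaging below are illustrative assumptions; the paper's procedure for identifying and adjusting comparables is more elaborate.

```python
import math

def estimate_price(subject, comparables, k=3):
    """Comparable sales sketch: average the unit prices of the k most similar
    recent transactions, then scale by the subject's floor area.

    subject / comparables: dicts with 'area' (m2), 'age' (years), 'floor';
    each comparable also carries its transaction 'price'.
    """
    def distance(a, b):
        # Illustrative similarity: Euclidean distance over three raw features.
        return math.sqrt((a["area"] - b["area"]) ** 2
                         + (a["age"] - b["age"]) ** 2
                         + (a["floor"] - b["floor"]) ** 2)
    nearest = sorted(comparables, key=lambda c: distance(subject, c))[:k]
    unit = sum(c["price"] / c["area"] for c in nearest) / len(nearest)
    return unit * subject["area"]
```

In practice the features would be normalized and the comparables restricted to recent transactions, since, as noted above, stale prices lose their usefulness over time.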