Historical ciphers, a special type of manuscript, contain encrypted information that is important for the interpretation of our history. The first step towards decipherment is to transcribe the images, either manually or by automatic image processing techniques. Despite the improvements in handwritten text recognition (HTR) brought by deep learning methodologies, the need for labelled training data remains an important limitation. Given that ciphers often use symbol sets drawn from various alphabets, as well as unique symbols without any transcription scheme available, these supervised HTR techniques are not suitable for transcribing ciphers. In this paper we propose an unsupervised method for transcribing encrypted manuscripts based on clustering and label propagation, a technique that has been successfully applied to community detection in networks. We analyze its performance on ciphers with various symbol sets, and discuss the advantages and drawbacks compared to supervised HTR methods.
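The community-detection idea behind the transcription method can be sketched generically: treat symbol images as nodes, link visually similar ones, and let each node repeatedly adopt the most frequent label among its neighbours. The graph, node names, and tie-breaking rule below are illustrative assumptions, not the authors' implementation.

```python
# A minimal, self-contained sketch of label propagation for community
# detection. Nodes stand in for cipher symbols; edges for visual
# similarity. This is a generic illustration, not the paper's pipeline.
from collections import Counter

def label_propagation(edges, n_iter=10):
    # Build adjacency sets from an undirected edge list.
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    # Every node starts in its own community.
    labels = {n: n for n in adj}
    for _ in range(n_iter):
        changed = False
        for n in sorted(adj):  # fixed visit order keeps the sketch deterministic
            counts = Counter(labels[m] for m in adj[n])
            # Adopt the most frequent neighbour label; break ties lexicographically.
            best = max(counts, key=lambda lab: (counts[lab], lab))
            if labels[n] != best:
                labels[n] = best
                changed = True
        if not changed:
            break
    return labels

# Two tight clusters of similar symbols joined by one weak link.
edges = [("a1", "a2"), ("a2", "a3"), ("a1", "a3"),
         ("b1", "b2"), ("b2", "b3"), ("b1", "b3"), ("a3", "b1")]
labels = label_propagation(edges)
```

After convergence, each cluster of mutually similar symbols shares one label, which is what allows a single manual annotation per cluster to propagate to all its members.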
Gallium oxide (Ga2O3) field-effect transistors (FETs) have high potential for future RF and power devices due to their superior power switching capabilities, high breakdown field, and the opportunity for single-crystal substrate fabrication. However, high heat dissipation and inefficient heat removal from the channel area of these FETs can limit device performance and cause reliability issues. In this work, we investigate the thermal characteristics of Ga2O3-based FETs by performing transient temperature measurements with a thermo-reflectance imaging system. The transient temperature distribution in the channel and metallic contacts is obtained at different voltages. We analyze the temperature distribution and the location of hot spots inside the channel, and establish the dependence of the hot spot temperature on the gate voltage. A high temperature rise at hot spots is observed in the channel area even at low power levels, a consequence of the low thermal conductivity of Ga2O3. Investigating the thermal characteristics of these FETs is crucial to developing efficient thermal management solutions and improving the reliability of Ga2O3 devices. It will also support better device and package design with thermal aspects in mind.
Recently, there has been significant growth of interest in applying software engineering techniques to the quality assurance of deep learning (DL) systems. One popular direction is deep learning testing, where adversarial examples (a.k.a. bugs) of DL systems are found either by fuzzing or by guided search with the help of certain testing metrics. However, recent studies have revealed that the neuron coverage metrics commonly used by existing DL testing approaches are not correlated with model robustness, nor are they an effective measure of confidence in model robustness after testing. In this work, we address this gap by proposing a novel testing framework called Robustness-Oriented Testing (RobOT). A key part of RobOT is a quantitative measurement of 1) the value of each test case in improving model robustness (often via retraining), and 2) the convergence quality of the model robustness improvement. RobOT utilizes the proposed metric to automatically generate test cases valuable for improving model robustness. The proposed metric is also a strong indicator of how well robustness improvement has converged through testing. Experiments on multiple benchmark datasets confirm the effectiveness and efficiency of RobOT in improving DL model robustness, with a 67.02% increase in adversarial robustness, 50.65% higher than that of the state-of-the-art work DeepGini.
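The notion of valuing test cases for retraining can be illustrated with a simple loss-guided proxy: score each candidate input by the model's loss on it and keep the highest-loss ones. This is a generic hedged sketch of the idea, not RobOT's actual metric; the toy predictions below are invented.

```python
# A generic sketch of loss-guided test selection: candidates on which the
# model incurs high loss are treated as more valuable for robustness
# retraining. Illustrative only; not RobOT's proposed measurement.
import numpy as np

def cross_entropy(probs, labels):
    # Per-example cross-entropy loss given predicted class probabilities.
    return -np.log(probs[np.arange(len(labels)), labels] + 1e-12)

def select_valuable_tests(probs, labels, k):
    """Return indices of the k highest-loss candidates."""
    losses = cross_entropy(probs, labels)
    return np.argsort(losses)[::-1][:k]

# Toy predictions for 4 candidates over 3 classes; true class is 0.
probs = np.array([[0.90, 0.05, 0.05],   # confident and correct: low value
                  [0.40, 0.30, 0.30],   # uncertain: higher value
                  [0.10, 0.80, 0.10],   # confidently wrong: highest value
                  [0.34, 0.33, 0.33]])  # near the decision boundary
labels = np.array([0, 0, 0, 0])
picked = select_valuable_tests(probs, labels, 2)
```

Under this proxy the confidently wrong and near-boundary inputs are selected, while the confidently correct input is discarded as uninformative.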
Deep learning (DL) models, especially large-scale, high-performance ones, can be very costly to train, demanding a great amount of data and computational resources. Unauthorized reproduction of DL models can lead to copyright infringement and cause huge economic losses to model owners. Existing copyright protection techniques are mostly based on watermarking, which embeds an owner-specified watermark into the model. While able to provide exact ownership verification, these techniques are 1) invasive, as they need to tamper with the training process, which may affect model utility or introduce new security risks; 2) prone to adaptive attacks that attempt to remove the watermark; and 3) not robust to the emerging model extraction attacks. The latest fingerprinting work, though non-invasive, also falls short when facing diverse and ever-growing attack scenarios. In this paper, we propose a novel testing framework for DL copyright protection: DEEPJUDGE. DEEPJUDGE quantitatively tests the similarities between two DL models: a victim model and a suspect model. It leverages a diverse set of testing metrics and test case generation methods to produce a chain of supporting evidence that helps determine whether a suspect model is a copy of the victim model. The advantages of DEEPJUDGE include: 1) non-invasiveness, as it works directly on the model and does not tamper with the training process; 2) efficiency, as it only needs a small set of test cases and a quick scan of the models; 3) flexibility, as it can easily incorporate new metrics or generation methods to obtain a more confident judgement; and 4) fair robustness to model extraction and adaptive attacks. We verify the effectiveness of DEEPJUDGE under typical copyright infringement scenarios, including model finetuning, pruning and extraction, via extensive experiments on both image and speech datasets with a variety of model architectures.
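The core black-box intuition of testing-based similarity can be sketched in a few lines: probe both models with the same test cases and measure how close their output distributions are; a derived copy stays much closer to the victim than an independently trained model. The toy linear "models", the L1 distance, and the seeds below are illustrative assumptions, not DEEPJUDGE's actual metrics or test generation methods.

```python
# A hedged sketch of output-similarity testing between a victim model and
# a suspect model. Tiny linear softmax "models" stand in for real networks.
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def output_distance(model_a, model_b, probes):
    """Mean L1 distance between the two models' output probabilities."""
    return float(np.abs(softmax(probes @ model_a) -
                        softmax(probes @ model_b)).mean())

rng = np.random.default_rng(0)
victim = rng.normal(size=(16, 4))
copy_ = victim + rng.normal(scale=0.01, size=victim.shape)  # lightly fine-tuned copy
independent = rng.normal(size=(16, 4))                      # trained independently
probes = rng.normal(size=(32, 16))                          # shared test cases

d_copy = output_distance(victim, copy_, probes)
d_indep = output_distance(victim, independent, probes)
```

In this sketch `d_copy` is far smaller than `d_indep`, which is the kind of evidence a similarity-testing framework aggregates across multiple metrics before rendering a judgement.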
This project explores the feasibility of remote patient monitoring based on the analysis of 3D movements captured with smartwatches. We base our analysis on the Kinematic Theory of Rapid Human Movement. We have validated our research in a real case scenario for stroke rehabilitation at the Guttmann Institute (a neurorehabilitation hospital), showing promising results. Our work could have a great impact on remote healthcare applications, improving medical efficiency and reducing healthcare costs. Future steps include further clinical validation, developing multi-modal analysis architectures (analysing data from sensors, images, audio, etc.), and exploring the application of our technology to monitor other neurodegenerative diseases.
The Barcelona Historical Marriage Database (BHMD) gathers records of the more than 600,000 marriages celebrated in the Diocese of Barcelona and their taxation registered in Barcelona Cathedral's so-called Marriage Licenses Books over the long period 1451–1905. The Baix Llobregat Demographic Database (BALL) brings together the individual information recorded in the population registers, censuses and fiscal censuses of the main municipalities of the county of Baix Llobregat (Barcelona). As of December 2020, this ongoing collection had assembled 263,786 individual observations dating from the period between 1828 and 1965. The two databases started as part of different interdisciplinary research projects.
Nowadays, many handwritten historical documents in archives are still waiting to be transcribed and indexed. Since manual transcription is tedious and time consuming, automatic transcription seems the path to follow. However, the performance of current handwriting recognition techniques is not perfect, so manual validation is mandatory. Crowdsourcing is a good strategy for manual validation; however, it is a tedious task. In this paper we analyze experiences based on gamification in order to propose and design a gamesourcing framework that increases the interest of users. Then, we describe and analyze our experience when validating the automatic transcription using the gamesourcing application. Moreover, thanks to the combination of clustering and handwriting recognition techniques, we can speed up the validation while maintaining the performance.
The ammonia decomposition reactor is an important component in ammonia–hydrogen energy conversion technology, where ammonia serves as the storage and transportation medium for hydrogen. In this research, a tubular ammonia decomposition reactor is modeled according to finite-time thermodynamics. With a fixed hydrogen yield, the heat transfer rate and power consumption are taken as optimization targets, and the corresponding optimal temperature distributions outside the tube, that is, the optimal configurations, are obtained through a nonlinear programming method. In addition, the optimized reactor is analyzed with respect to three parameters: reactant initial temperature, reactant initial pressure, and reaction tube length. The results indicate that the heat transfer rate of the optimal reactor with minimum heat transfer rate and the power consumption of the optimal reactor with minimum power consumption are reduced by 10.5% and 17.26%, respectively, compared to the reference reactor. The optimum parameters of the reactor are a tube length of 8 m, a reactant inlet temperature of 450 K, and a reactant inlet pressure of 8 bar. The findings of this research are instructive for the optimal design and operation of ammonia decomposition reactors.
•A tubular ammonia decomposition reactor is modeled using finite-time thermodynamics.
•Optimal temperature distributions outside the tube are obtained by a nonlinear programming method.
•Heat transfer rate and power consumption are optimization targets with a fixed hydrogen yield.
•Heat transfer rate of the optimal reactor with minimum heat transfer rate is reduced by 10.5%.
•Power consumption of the optimal reactor with minimum power consumption is reduced by 17.26%.
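The structure of the fixed-yield optimization can be illustrated with a toy constrained search: choose a wall temperature profile that meets a hydrogen-yield target while minimizing the heat supplied. The two-segment profile, the placeholder yield and heat models, and the target value below are all invented for illustration; they are not the paper's finite-time-thermodynamics model.

```python
# A toy sketch of "minimize heat transfer subject to a fixed hydrogen
# yield" via exhaustive search over a two-segment wall temperature
# profile (T1, T2). All functional forms here are placeholders.
import math

T_IN = 450.0          # reactant inlet temperature, K (value from the abstract)
YIELD_TARGET = 40.0   # arbitrary units, illustrative constraint

def hydrogen_yield(t1, t2):
    # Diminishing returns with wall temperature (placeholder kinetics).
    return math.sqrt(t1 - T_IN) + math.sqrt(t2 - T_IN)

def heat_supplied(t1, t2):
    # Heat input grows with the wall/inlet temperature difference.
    return (t1 - T_IN) + (t2 - T_IN)

best = None
for t1 in range(451, 1200):
    for t2 in range(451, 1200):
        if hydrogen_yield(t1, t2) >= YIELD_TARGET:
            q = heat_supplied(t1, t2)
            if best is None or q < best[0]:
                best = (q, t1, t2)

q_opt, t1_opt, t2_opt = best
```

With a concave yield and a linear heat cost, the minimum-heat configuration is the symmetric profile, mirroring how the nonlinear programming step in the paper selects an optimal temperature distribution rather than an arbitrary feasible one.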
Efficient degradation of polycyclic aromatic hydrocarbons (PAHs) in petroleum-contaminated soil is challenging, as it requires ample PAH-degrading flora and nutrients. In this study, we investigated the effects of ‘natural attenuation’, ‘bioaugmentation’, ‘compost only’ (raw compost materials included pig manure and rice husk mixed at a 1:2 proportion, supplemented with 2.5% charcoal), and ‘compost with bioaugmentation’ treatments on the degradation of PAHs and on microbial community shifts during the remediation of petroleum-contaminated soil. After sixteen weeks of incubation, the removal efficiencies of PAHs were 0.52 ± 0.04%, 6.92 ± 0.32%, 9.53 ± 0.29%, and 18.2 ± 0.64% in the four treatments, respectively. ‘Compost with bioaugmentation’ was the most effective treatment for PAH removal. Illumina sequencing analysis suggested that both the ‘compost only’ and ‘compost with bioaugmentation’ treatments changed soil microbial community structure and enhanced microbial biodiversity. Some of the microorganisms affiliated with the compost, including Azomonas, Luteimonas, Pseudosphingobacterium, and Parapedobacter, were able to survive and become dominant in the contaminated soil. The ‘bioaugmentation’ and ‘natural attenuation’ treatments had no significant effect on soil microbial community structure. Inoculation of the PAH degraders, including Bacillus, Pseudomonas, and Acinetobacter, directly into the contaminated soil led to lower biodiversity under natural conditions. These results suggest that compost addition increased the α-diversity of both the bacterial and fungal communities in petroleum-contaminated soil, leading to higher PAH degradation efficiency.
•Degradation of PAHs was studied using compost inoculated with PAH-degraders.
•Removal of PAHs was improved by 18.2% using bioaugmented compost.
•Compost with bioaugmentation enhanced the abundance of Penicillium sp. to 6.02%.
•Compost addition increased alpha-diversity indices compared to the initial soil.
•PAH degradation was correlated with the α-diversity and the abundance of Penicillium.