Sportswear comfort is an important aspect of the evaluation of sportswear design and development. A review of the relevant domestic and international literature shows that the evaluation methods commonly used in sportswear comfort research fall into three categories: subjective evaluation, objective evaluation, and combined subjective-objective evaluation. The implementation requirements, advantages, and disadvantages of each method are analyzed. The review shows that in practice, subjective and objective evaluation methods are usually used in combination: experimental studies should be designed with clear targets, the validity of test results should be improved as far as possible, and the different test results should be analyzed together, so as to reach comprehensive, scientific, appropriate, and accurate evaluation conclusions.
In today's highly industrialised and modernised world, China's economy is still growing, and its demand for energy is increasing daily. It is crucial to examine the connection between energy consumption, carbon emissions, and economic growth in order to promote economic growth based on energy conservation and emission reduction. Using Dezhou City in Shandong Province as an example, the study builds a VAR model of carbon emissions, energy consumption, and economic growth in Dezhou City based on simplified macroeconomic, energy, and environmental sub-models. It then determines the correlation and influence mechanisms among the three using tests such as the ADF unit-root test and Granger causality test. The pertinent factors affecting Dezhou's carbon emissions were then investigated using grey correlation analysis. Finally, based on the study's findings, policy suggestions are made regarding energy use, carbon emissions, and economic growth. It is necessary not only to restrain high-energy-consumption industries and fundamentally optimize the energy consumption structure, but also to find new economic growth points and broaden economic growth channels, so as to optimize the industrial structure. In this process, increasing the proportion of the tertiary industry is a key measure. In addition, the government needs to encourage citizens to adopt a low-carbon lifestyle, so that the concept of low-carbon environmental protection becomes deeply rooted among the public. This study provides suggestions and theoretical guidance for China's energy consumption and carbon emissions, helping to achieve high-quality growth of the Chinese and even the global economy.
With the development of remote sensing technology, the application of hyperspectral images is becoming more and more widespread. The accurate classification of ground features from hyperspectral images is an important research topic and has attracted widespread attention. Many methods have achieved good results in hyperspectral image classification. This paper reviews hyperspectral image classification methods from three aspects: supervised classification, semi-supervised classification, and unsupervised classification.
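A minimal example of the supervised branch: pixel-wise classification with an RBF-kernel support vector machine on synthetic spectra. The band count, the Gaussian-bump class signatures, and the classifier choice are illustrative assumptions, not taken from any particular surveyed method.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

# Each sample is one pixel's spectrum (50 synthetic bands); the label is
# its ground-cover class. Real data would come from a hyperspectral cube.
rng = np.random.default_rng(5)
bands = np.linspace(0, 1, 50)

def spectrum(center, n):
    # Toy class signature: a Gaussian reflectance bump plus sensor noise.
    return np.exp(-((bands - center) ** 2) / 0.02) + rng.normal(0, 0.1, (n, 50))

X = np.vstack([spectrum(0.3, 100), spectrum(0.5, 100), spectrum(0.7, 100)])
y = np.repeat([0, 1, 2], 100)     # three land-cover classes

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = SVC(kernel="rbf").fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
print(f"test accuracy: {acc:.2f}")
```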
Effective calibration of miniature air quality monitor measurements is an important task to ensure accurate measurements and guarantee sustainable air quality. The aim of this study is to calibrate the measurement data of miniature air quality monitors using a combined Stepwise Regression Analysis and Support Vector Regression (SRA-SVR) model. Firstly, a stepwise regression analysis model is used to find a linear relationship between the measured data from the miniature air quality monitor and the air pollutant concentration. Secondly, support vector regression is used to extract the non-linear relationships affecting the pollutant concentrations that are hidden in the residuals of the stepwise regression analysis model. Finally, the residual calibration values output by the SVR model are added to the SRA model outputs to obtain the final outputs of the SRA-SVR combined model for the pollutants. Mean absolute error, relative mean absolute percent error, and root mean square error are used to compare the effectiveness of the SRA-SVR combined model with some other commonly used statistical models for the calibration of miniature air quality monitors. The results show that the SRA-SVR combined model performs best on both the training and test sets, for every pollutant and every indicator. The SRA-SVR combined model not only retains the advantages of the SRA model's strong interpretability and the SVR model's high accuracy, but also achieves higher accuracy than either single model. By using this model to calibrate the measurements of the miniature air quality monitor, accuracy can be improved by 61.33%–87.43%.
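The three-stage residual-stacking recipe can be sketched as follows. The paper's exact stepwise regression procedure and monitor data are not available here, so `SequentialFeatureSelector` over a linear model stands in for stepwise regression and the data are synthetic; everything else follows the scheme described above (linear stage, SVR on its residuals, summed outputs).

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
# Synthetic sensor readings (e.g. raw sensor value, temperature, humidity, ...)
X = rng.normal(size=(300, 5))
# "True" pollutant concentration: a linear part plus a non-linear effect.
y = 2.0 * X[:, 0] - 1.0 * X[:, 1] + 0.5 * np.sin(3 * X[:, 2]) + rng.normal(0, 0.1, 300)
X_train, X_test, y_train, y_test = X[:200], X[200:], y[:200], y[200:]

# Stage 1 (SRA stand-in): stepwise feature selection + linear regression.
lin = LinearRegression()
sfs = SequentialFeatureSelector(lin, n_features_to_select=3, direction="forward")
sfs.fit(X_train, y_train)
lin.fit(sfs.transform(X_train), y_train)
sra_train = lin.predict(sfs.transform(X_train))
sra_test = lin.predict(sfs.transform(X_test))

# Stage 2: SVR learns the non-linear structure left in the SRA residuals.
svr = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
svr.fit(X_train, y_train - sra_train)

# Stage 3: combined output = linear prediction + residual correction.
y_hat = sra_test + svr.predict(X_test)
mae = np.mean(np.abs(y_test - y_hat))
print(f"combined-model MAE: {mae:.3f}")
```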
Ubiquitous sensors and smart devices in factories and communities are generating massive amounts of data, and ever-increasing computing power is driving the core of computation and services from the cloud to the edge of the network. As an important enabler broadly changing people's lives, from face recognition to ambitious smart factories and cities, the development of applications and services based on artificial intelligence (especially deep learning, DL) is thriving. However, due to efficiency and latency issues, the current cloud computing service architecture hinders the vision of "providing artificial intelligence for every person and every organization at everywhere". Thus, unleashing DL services using resources at the network edge near the data sources has emerged as a desirable solution. Therefore, edge intelligence, aiming to facilitate the deployment of DL services by edge computing, has received significant attention. In addition, DL, as the representative technique of artificial intelligence, can be integrated into edge computing frameworks to build an intelligent edge for dynamic, adaptive edge maintenance and management. With regard to mutually beneficial edge intelligence and intelligent edge, this paper introduces and discusses: 1) the application scenarios of both; 2) the practical implementation methods and enabling technologies, namely DL training and inference in the customized edge computing framework; 3) challenges and future trends of more pervasive and fine-grained intelligence. We believe that by consolidating information scattered across the communication, networking, and DL areas, this survey can help readers understand the connections between enabling technologies while promoting further discussions on the fusion of edge intelligence and intelligent edge, i.e., Edge DL.
Glaucoma is one of the leading causes of irreversible vision loss. Many approaches have recently been proposed for automatic glaucoma detection based on fundus images. However, none of the existing approaches can efficiently remove the high redundancy in fundus images, which may reduce the reliability and accuracy of glaucoma detection. To avoid this disadvantage, this paper proposes an attention-based convolutional neural network (CNN) for glaucoma detection, called AG-CNN. Specifically, we first establish a large-scale attention-based glaucoma (LAG) database, which includes 11,760 fundus images labeled as either positive glaucoma (4,878) or negative glaucoma (6,882). Among the 11,760 fundus images, attention maps for 5,824 images are further obtained from ophthalmologists through a simulated eye-tracking experiment. Then, a new AG-CNN structure is designed, including an attention prediction subnet, a pathological area localization subnet, and a glaucoma classification subnet. The attention maps are predicted in the attention prediction subnet to highlight the salient regions for glaucoma detection, under a weakly supervised training manner. In contrast to other attention-based CNN methods, the features of the localized pathological area are also visualized and further incorporated into our AG-CNN structure to enhance glaucoma detection performance. Finally, experimental results from testing over our LAG database and another public glaucoma database show that the proposed AG-CNN approach significantly advances the state of the art in glaucoma detection.
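The attention-gating idea, a predicted attention map reweighting feature maps so that salient regions dominate before classification, can be illustrated in a few lines of NumPy. The array shapes and the sigmoid gating below are assumptions for illustration only, not the AG-CNN architecture itself.

```python
import numpy as np

rng = np.random.default_rng(4)
features = rng.normal(size=(8, 32, 32))   # 8 feature maps from a conv stage
att_logits = rng.normal(size=(32, 32))    # output of an attention subnet (logits)

att = 1 / (1 + np.exp(-att_logits))       # sigmoid -> attention map in [0, 1]
gated = features * att                    # broadcast the map over all channels
pooled = gated.mean(axis=(1, 2))          # global average pool per channel,
                                          # fed to the classification head
print(pooled.shape)
```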
The photoexcited-state lifetimes of iron complexes are typically much shorter than those of iridium and ruthenium complexes. For that reason, iron complexes find less application in photochemical organic synthesis. Through iron photocatalysis, a mild and effective protocol for decarboxylative C–C and C–N bond formation has been achieved. Carboxylic acids readily undergo radical decarboxylation in the presence of Fe2(SO4)3 and di-(2-picolyl)amine under visible-light irradiation. The resulting alkyl radicals then react with Michael acceptors or azodicarboxylates to furnish the adducts.
A mild, practical protocol for decarboxylative C–C and C–N bond formation has been accomplished. The method proceeds through an intramolecular charge-transfer pathway of iron–substrate complexes under visible-light irradiation.
Jiang, Xinghua; Rotily, Lucas; Villermaux, Emmanuel. "Submicron drops from flapping bursting bubbles." Proceedings of the National Academy of Sciences (PNAS), vol. 119, issue 1, January 2022. Journal article, peer reviewed, open access.
Tiny water drops produced from bubble bursting play a critical role in forming clouds, scattering sunlight, and transporting pathogens from water to the air. Bubbles burst by nucleating a hole at their cap foot and may produce jets or film drops. The latter originate from the fragmentation of liquid ligaments formed by the centripetal destabilization of the opening hole rim. They constitute a major fraction of the aerosols produced from bubbles whose cap radius of curvature exceeds ∼0.4 × the capillary length. However, our present understanding of the corresponding mechanisms does not explain the production of most submicron film drops, which represent the main number fraction of sea spray aerosols. In this study, we report observations showing that bursting bubbles with cap radius of curvature below ∼0.4 × the capillary length are actually mainly responsible for submicron film drop production, through a mechanism involving the flapping shear instability of the cap with the outer environment. With this proposed pathway, the complex relations between bubble size and the number of drops produced per bubble can be better explained, providing a fundamental framework for understanding the production flux of aerosols, the transfer of substances mediated by bubble bursting through the air-water interface, and the sensitivity of the process to the nature of the environment.
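For a sense of scale, the capillary length that sets the ∼0.4 threshold quoted above is straightforward to compute for clean water at room temperature (the property values below are standard approximations, not figures from the paper):

```python
import math

# Capillary length l_c = sqrt(sigma / (rho * g)) for clean water at ~20 C:
sigma = 0.072   # surface tension, N/m (approximate)
rho = 998.0     # density, kg/m^3 (approximate)
g = 9.81        # gravitational acceleration, m/s^2

l_c = math.sqrt(sigma / (rho * g))   # ~2.7 mm
threshold = 0.4 * l_c                # ~1.1 mm cap radius of curvature
print(f"capillary length: {l_c * 1e3:.2f} mm, threshold: {threshold * 1e3:.2f} mm")
```

So, under this reading, bubbles with cap radius of curvature below roughly a millimetre are the ones dominating submicron film drop production in water.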
As 5G communication networks are being widely deployed worldwide, both industry and academia have started to move beyond 5G and explore 6G communications. It is generally believed that 6G will be established on ubiquitous Artificial Intelligence (AI) to achieve data-driven Machine Learning (ML) solutions in heterogeneous and massive-scale networks. However, traditional ML techniques require centralized data collection and processing by a central server, which is becoming a bottleneck for large-scale implementation in daily life due to significantly increasing privacy concerns. Federated learning, as an emerging distributed AI approach with a privacy-preserving nature, is particularly attractive for various wireless applications, and is regarded as one of the vital solutions to achieve ubiquitous AI in 6G. In this article, we first introduce the integration of 6G and federated learning and provide potential federated learning applications for 6G. We then describe key technical challenges, the corresponding federated learning methods, and open problems for future research on federated learning in the context of 6G communications.
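The core of federated averaging, local training on private data plus server-side weight averaging, can be sketched in plain NumPy. The client count, learning rates, and the linear model below are illustrative choices, not a 6G-specific design.

```python
import numpy as np

# Minimal FedAvg sketch: K clients fit a shared linear model without ever
# sending raw data to the server -- only model weights are exchanged.
rng = np.random.default_rng(2)
true_w = np.array([1.5, -2.0, 0.5])
clients = []
for _ in range(5):                       # 5 clients, each with private data
    X = rng.normal(size=(50, 3))
    y = X @ true_w + rng.normal(0, 0.05, 50)
    clients.append((X, y))

w = np.zeros(3)                          # global model held by the server
for rnd in range(30):                    # communication rounds
    local_ws = []
    for X, y in clients:
        w_local = w.copy()
        for _ in range(5):               # a few local epochs per round
            grad = 2 * X.T @ (X @ w_local - y) / len(y)
            w_local -= 0.05 * grad
        local_ws.append(w_local)
    # Server aggregates by (equally weighted) averaging -- the "Avg" in FedAvg.
    w = np.mean(local_ws, axis=0)

print("recovered weights:", np.round(w, 2))
```

In real deployments the averaging is weighted by client dataset size and combined with compression and privacy mechanisms, but the data-stays-local structure is the same.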
Discovering dynamic characteristics in traffic flow is a key step in designing effective traffic management and control strategies for relieving traffic congestion in urban cities. A new method based on complex network theory is proposed to study multivariate traffic flow time series. The data were collected from loop detectors on a freeway over one year. To construct a complex network from the original traffic flow, a weighted Frobenius norm is adopted to estimate the similarity between multivariate time series, and Principal Component Analysis is implemented to determine the weights. We discuss how to select the optimal critical threshold for networks at different hours in terms of the cumulative probability distribution of degree. Furthermore, two statistical properties of the networks, normalized network structure entropy and cumulative probability of degree, are utilized to explore hourly variation in traffic flow. The results demonstrate that these two statistical quantities exhibit patterns similar to those of traffic flow parameters, with morning and evening peak hours. Accordingly, we identify three traffic states (trough, peak, and transitional hours) according to the correlation between the two aforementioned properties. The resulting state classification accurately represents hourly fluctuation in traffic flow, as shown by analysis of annual average hourly values of traffic volume, occupancy, and speed in the corresponding hours.
•We construct networks from multivariate traffic flow time series.
•A weighted Frobenius norm is adopted to estimate similarity between multivariate time series.
•Principal Component Analysis is implemented to determine the weights.
•We analyze normalized network structure entropy and cumulative probability of degree.
•We classify traffic states according to the above two properties.
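The construction described in this abstract can be sketched end-to-end on synthetic data. The abstract does not specify the exact PCA weighting scheme, so the squared loadings of the leading principal component are assumed here for illustration; the thresholding quantile and entropy normalization are likewise illustrative.

```python
import numpy as np

# Each "node" is one hourly multivariate traffic snapshot (e.g. volume,
# occupancy, speed); edges connect hours whose snapshots are similar under
# a PCA-weighted Frobenius norm.
rng = np.random.default_rng(3)
T, m = 100, 3
series = rng.normal(size=(T, m))          # synthetic stand-in for detector data

# PCA on the variables; assumed weighting: squared loadings of the leading
# principal component (these sum to 1).
Xc = series - series.mean(axis=0)
cov = Xc.T @ Xc / (T - 1)
eigvals, eigvecs = np.linalg.eigh(cov)
weights = eigvecs[:, -1] ** 2

# Weighted Frobenius norm between snapshots; for single-row snapshots this
# reduces to a weighted Euclidean distance.
diff = series[:, None, :] - series[None, :, :]
dist = np.sqrt((weights * diff ** 2).sum(axis=2))

# Threshold the distance matrix to obtain an unweighted network: here the
# closest 10% of pairs are connected.
theta = np.quantile(dist[np.triu_indices(T, 1)], 0.1)
A = (dist <= theta) & ~np.eye(T, dtype=bool)

# Normalized network structure entropy from the degree sequence
# (normalized by ln N, the maximum-entropy value).
deg = A.sum(axis=1)
p = deg / deg.sum()
entropy = -(p[p > 0] * np.log(p[p > 0])).sum() / np.log(T)
print(f"normalized structure entropy: {entropy:.3f}")
```

Varying the threshold per hour and comparing the entropy across hours is what lets the method separate trough, peak, and transitional traffic states.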