Images captured in hazy or foggy weather conditions are seriously degraded by the scattering of atmospheric particles, which directly affects the performance of outdoor computer vision systems. In this paper, a fast algorithm for single image dehazing is proposed based on a linear transformation, under the assumption that a linear relationship exists between the minimum channels of the hazy image and the haze-free image. First, the principle of the linear transformation is analyzed. Accordingly, the method for estimating the medium transmission map is detailed, and weakening strategies are introduced to address distortion in the brightest areas. To estimate the atmospheric light accurately, an additional channel method based on quad-tree subdivision is proposed, in which the average gray level and gradient within each region serve as the assessment criteria. Finally, the haze-free image is recovered using the atmospheric scattering model. Extensive experimental results show that the algorithm recovers images clearly and naturally, especially at edges where the depth of field changes abruptly, and thus achieves a good dehazing effect on single images. Furthermore, the time complexity of the algorithm is linear in the image size, which gives it a clear advantage in running time while maintaining a balance between speed and processing quality.
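For reference, the recovery step relies on the standard atmospheric scattering model; a common formulation (with a conventional lower bound t_0 on the transmission, which may not be the exact convention used in the paper) is:

```latex
I(x) = J(x)\,t(x) + A\bigl(1 - t(x)\bigr),
\qquad
J(x) = \frac{I(x) - A}{\max\bigl(t(x),\, t_0\bigr)} + A,
```

where I is the observed hazy image, J the haze-free scene radiance, A the atmospheric light, t the medium transmission (estimated here via the linear transformation), and t_0 a small constant (commonly around 0.1) that prevents division by near-zero transmission in dense haze.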
Images captured in hazy or foggy weather conditions can be seriously degraded by the scattering of atmospheric particles, which reduces contrast, distorts color, and makes object features difficult to identify, both for human vision and for outdoor computer vision systems. Image dehazing is therefore an important problem and has been widely researched in the field of computer vision. The role of image dehazing is to remove the influence of weather factors in order to improve the visual quality of the image and to benefit subsequent processing. This paper reviews the main image dehazing techniques developed over the past decade. First, we propose a new classification that divides the approaches into three categories: image enhancement based methods, image fusion based methods, and image restoration based methods. All methods are analyzed, and the corresponding sub-categories are introduced according to their principles and characteristics. Various quality evaluation methods are then described, sorted, and discussed in detail. Finally, research progress is summarized and future research directions are suggested.
Images captured under poor illumination conditions often exhibit characteristics such as low brightness, low contrast, a narrow gray range, and color distortion, as well as considerable noise, which seriously affect the subjective visual effect on human eyes and greatly limit the performance of various machine vision systems. The role of low-light image enhancement is to improve the visual effect of such images for the benefit of subsequent processing. This paper reviews the main techniques of low-light image enhancement developed over the past decades. First, we present a new classification of these algorithms, dividing them into seven categories: gray transformation methods, histogram equalization methods, Retinex methods, frequency-domain methods, image fusion methods, defogging model methods, and machine learning methods. Then, all categories of methods, including their subcategories, are introduced in accordance with their principles and characteristics. In addition, various quality evaluation methods for enhanced images are detailed, and comparisons of different algorithms are discussed. Finally, the current research progress is summarized, and future research directions are suggested.
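To make one of the listed categories concrete, here is a minimal Python sketch of global histogram equalization, the textbook instance of the histogram equalization category; this is illustrative only and is not code from any surveyed method:

```python
import numpy as np

def histogram_equalization(gray):
    """Global histogram equalization for an 8-bit grayscale image:
    remap intensities through the normalized cumulative histogram so that
    the output levels spread across the full [0, 255] range.
    Assumes the image is not constant (otherwise the denominator is zero)."""
    hist = np.bincount(gray.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]            # first nonzero value of the CDF
    lut = np.clip(np.round((cdf - cdf_min) / (cdf[-1] - cdf_min) * 255.0),
                  0, 255).astype(np.uint8)
    return lut[gray]

# Usage, assuming `img` is a uint8 grayscale array:
# enhanced = histogram_equalization(img)
```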
Texture filtering depends on high-quality texture measurement to separate structures from textures. However, existing methods employ axis-aligned box windows for texture measurement, which may cover different texture regions and thereby lower the measurement quality, because structure edges are not always parallel to the axes. In addition, existing texture measurements consider intensity contrast at the pixel level and do not account for the linear characteristics of structure edges in filtering windows; thus, their measurement effectiveness is limited. This results in a dilemma for texture filtering: large-scale textures are not smoothed with smaller windows, while small structures are removed with larger windows. In this paper, we present edge-aware measures to improve texture measurement. Edge-aware windows are constructed such that each window lies inside a single texture region to the greatest extent possible, and the linear characteristics of structure edges are accounted for in the texture measurement. Furthermore, we use large box windows for texture filtering and long, narrow edge-aware small windows for texture measurement, so that large-scale textures are filtered out while small structures are preserved. The experimental results show improved texture filtering with our method compared with existing methods.
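As a schematic illustration of what a window-based texture measure looks like (a generic example, not the specific measure proposed in the paper), one can average a pixel-level contrast such as the gradient magnitude over a window W(p) centered at pixel p:

```latex
T(p) = \frac{1}{|W(p)|} \sum_{q \in W(p)} \lVert \nabla I(q) \rVert
```

With an axis-aligned box window, W(p) near a slanted structure edge inevitably straddles regions on both sides of the edge, corrupting T(p); the edge-aware windows proposed here instead shape W(p) so that it stays within one texture region as far as possible.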
Texture filtering seeks to smooth out textured details in order to present structures prominently. To filter out multi-scale textured details while preserving structures, some methods adjust the size of the filtering windows, handling the pixels near structures with small windows and the other pixels with large windows. Unfortunately, their adjustment measures are not very effective in complex situations: they may handle the pixels inside small structures with large windows, which overly smooths them, or they may be incapable of smoothing out large-scale textures. To address this, we present a novel method that adjusts the window size for a pixel by checking the similar pixels in its neighborhood. In general, pixels nearer structures have fewer similar pixels in their neighborhoods than pixels farther away from structures. Thus, our method can adaptively adjust window sizes according to the distances from pixels to structures, which improves its ability to handle complex situations. The experimental results show that our method filters out very large-scale textures while preserving small structures with high quality, in a manner superior to state-of-the-art methods. In addition, our method is very simple and efficient, and can be used easily in applications.
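The core adjustment rule can be sketched in Python as follows; the neighborhood radius, similarity tolerance, and the linear count-to-size mapping below are all illustrative assumptions, not the paper's actual parameters:

```python
import numpy as np

def adaptive_window_sizes(image, radius=7, tau=0.1, w_min=3, w_max=15):
    """For each pixel, count neighbors whose intensity lies within tau of
    the center pixel, then map that count to a smoothing-window size.
    Pixels near structure edges have few similar neighbors -> small windows;
    pixels deep inside texture or flat regions get large windows.
    All parameter values here are hypothetical, for illustration only."""
    h, w = image.shape
    pad = np.pad(image, radius, mode='reflect')
    counts = np.zeros((h, w))
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            shifted = pad[radius + dy: radius + dy + h,
                          radius + dx: radius + dx + w]
            counts += (np.abs(shifted - image) <= tau)
    # Normalize the similar-pixel count to [0, 1] and map it linearly
    # onto the allowed range of window sizes.
    frac = counts / float((2 * radius + 1) ** 2)
    sizes = w_min + frac * (w_max - w_min)
    return np.round(sizes).astype(int) | 1   # force odd window sizes

# Usage: sizes = adaptive_window_sizes(gray.astype(float) / 255.0)
```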
To address hexavalent chromium (Cr(VI)) contamination in water bodies, blue coke powder (LC) was chemically modified with potassium hydroxide to create a modified material (GLC), which was then used to treat Cr(VI)-containing wastewater. The differences between the adsorption characteristics of the modified and unmodified blue coke for Cr(VI) were studied, and the effects of pH, initial solution concentration, and adsorption time on the adsorption performance of GLC were investigated. The adsorption behavior of GLC was analyzed using isothermal adsorption models, kinetic models, and adsorption thermodynamic analysis. The mechanism of Cr(VI) adsorption by GLC was investigated using characterization techniques such as Fourier transform infrared spectroscopy (FTIR), field emission scanning electron microscopy (FE-SEM), X-ray diffraction (XRD), and X-ray photoelectron spectroscopy (XPS). Batch adsorption experiments revealed that, under the same adsorption conditions, GLC always outperformed LC, with the largest difference in removal rate at pH = 2, where the removal rate of GLC was 2.42 times that of LC. GLC had a more porous structure than LC, with a specific surface area three times that of LC and an average pore diameter 0.67 times that of LC. The modification significantly increased the number of hydroxyl groups on the surface of GLC by altering the structural makeup of LC. The optimal pH for removing Cr(VI) was 2, and the optimal GLC adsorbent dosage was 2.0 g/L. The pseudo-second-order (PSO) kinetic model and the Redlich-Peterson (RP) isotherm model effectively describe the adsorption behavior of GLC for Cr(VI). The removal of Cr(VI) by GLC involves both physical and chemical adsorption in a spontaneous, exothermic, and entropy-increasing process, with oxidation-reduction reactions playing a key role. GLC is thus a potent adsorbent for removing Cr(VI) from aqueous solutions.
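For reference, the two named models take their standard forms (the fitted parameter values are reported in the paper and are not reproduced here):

```latex
\frac{t}{q_t} = \frac{1}{k_2\, q_e^{2}} + \frac{t}{q_e}
\quad \text{(pseudo-second-order, linearized)},
\qquad
q_e = \frac{K_R\, C_e}{1 + a_R\, C_e^{\beta}}, \;\; 0 < \beta \le 1
\quad \text{(Redlich-Peterson)},
```

where q_t and q_e (mg/g) are the amounts adsorbed at time t and at equilibrium, k_2 (g·mg⁻¹·min⁻¹) is the PSO rate constant, C_e (mg/L) is the equilibrium Cr(VI) concentration, and K_R, a_R, and β are the RP constants.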
China's rapid urbanization and high traffic accident frequency have received much attention from researchers. It is important to reveal how urban infrastructure and other risk factors affect traffic accident frequency. A growing body of research has examined the impact of local risk factors on traffic accident frequency at a given time. Some studies have considered spatial influences but overlooked the temporal correlation/heterogeneity of traffic accidents and related risk factors. This study explores the influence of risk factors on urban traffic accident frequency while considering both the spatial and temporal correlation/heterogeneity of traffic accidents. The study area is split into 100 equally sized rectangular traffic analysis zones (TAZs), and the urban traffic accident frequency and attributes in each TAZ are extracted. A linear regression model, a spatial lag model (SLM), a spatial error model (SEM), and a time-fixed effects error model (T-FEEM) are established and compared. The proposed methodologies are illustrated using ten months of traffic accident data from the urban area of Guiyang City, China. The results reveal that the time-fixed effects error model, which considers both the spatial and temporal correlation/heterogeneity of traffic accidents, is superior to the other models. More traffic accidents occur in TAZs with more hospitals or schools, and hospitals have a greater influence on traffic accidents than schools. TAZs containing passenger stations, which are located at the margins of the city, also experience more traffic accidents. This study provides policy makers with a more detailed characterization of the impact of related risk factors on traffic accident frequencies, and it suggests that not only the spatial but also the temporal correlation/heterogeneity should be taken into account in guiding traffic accident control in urban areas.
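For readers unfamiliar with the two spatial specifications, their standard forms are (the exact covariates X used in the study are described in the paper):

```latex
\text{SLM:}\quad y = \rho W y + X\beta + \varepsilon,
\qquad
\text{SEM:}\quad y = X\beta + u, \;\; u = \lambda W u + \varepsilon,
```

where y is the vector of TAZ-level accident frequencies, W the spatial weight matrix, ρ the spatial autoregressive coefficient, λ the spatial error coefficient, and ε an i.i.d. disturbance. Under a common formulation (an assumption here, as the abstract does not spell it out), the T-FEEM can be read as the SEM augmented with time-period fixed effects.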
•The impact of container reshuffling is integrated with the space allocation problem.
•A simulation model is developed to capture the relations between the reshuffling factors.
•The impact of reshuffling is approximated by a high-dimensional interpolation method.
•Empirical approaches are explored to solve the problem.
•Experiment results show that the potential losses from ignoring reshuffling activities can be significant.
The space allocation problem (SAP) in container terminals is one of the yard management challenges in short-term planning; however, the impact of container reshuffling on the SAP is neglected in much of the literature as well as in real terminal planning. This study focuses on a SAP that accounts for container reshuffling, where the macro-level impact of reshuffling is derived from discrete event simulation and integrated into a mixed integer programming model. Empirical approaches are developed to achieve a trade-off between fast computation and good solutions. The results show that ignoring reshuffling activities during planning leads to an overestimation of yard capacity.
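The abstract does not give the model itself; as a purely schematic illustration of how a simulation-derived reshuffling penalty can enter a space allocation MIP, consider the following hypothetical formulation (all symbols are assumptions for illustration, not the paper's notation):

```latex
\min_{x} \;\; \sum_{i}\sum_{k} c_{ik}\, x_{ik} \;+\; \sum_{k} \hat{r}\!\Bigl(\sum_{i} v_i\, x_{ik}\Bigr)
\quad \text{s.t.} \quad
\sum_{k} x_{ik} = 1 \;\; \forall i, \qquad
\sum_{i} v_i\, x_{ik} \le C_k \;\; \forall k, \qquad
x_{ik} \in \{0,1\},
```

where x_{ik} assigns container group i to yard block k, v_i is the group volume, C_k the block capacity, c_{ik} a handling-cost coefficient, and \hat{r}(\cdot) a reshuffling-cost function fitted to discrete event simulation output, in the spirit of the high-dimensional interpolation mentioned in the highlights above.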
Apocynum venetum L. belongs to the Apocynaceae family and is a plant that is highly resistant to stress; it is important in the fields of ecology, feeding, industry, and medicine. However, the molecular mechanism underlying its salt tolerance has not been elucidated. In this study, RNA-seq based transcriptome sequencing of A. venetum leaves after 0, 2, 6, 12, 24, and 48 h of treatment with 300 mM NaCl was performed. We conducted a comprehensive analysis of the transcriptome expression profiles of A. venetum under salt stress using the WGCNA method and identified the red, black, and brown modules as the core modules regulating the salt tolerance of A. venetum. A co-expression regulatory network was constructed to identify the core genes in each module according to the correlations between genes. The genes TRINITY_DN102_c0_g1 (serine carboxypeptidase), TRINITY_DN3073_c0_g1 (SOS signaling pathway) and TRINITY_DN6732_c0_g1 (heat shock transcription factor) in the red module were determined to be the core genes. Two core genes in the black module, TRINITY_DN9926_c0_g1 and TRINITY_DN7962_c0_g1, are pioneer candidate salt tolerance-associated genes in A. venetum. The genes in the brown module were mainly enriched in two pathways, namely photosynthesis and osmotic balance. Among them, the TRINITY_DN6321_c0_g2 and TRINITY_DN244_c0_g1 genes encode aquaporins, which help maintain the cell water balance and play a protective role in defending A. venetum against abiotic stress. Our findings contribute to the identification of core genes involved in the response of A. venetum to salt stress.
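For context, WGCNA builds its modules from a soft-thresholded correlation network; the standard definitions are as follows (the soft-thresholding power β is normally chosen by the scale-free topology criterion, and the study's exact settings are not stated in the abstract):

```latex
a_{ij} = \bigl|\operatorname{cor}(x_i, x_j)\bigr|^{\beta},
\qquad
\mathrm{TOM}_{ij} = \frac{\sum_{u \ne i,j} a_{iu}\, a_{uj} + a_{ij}}{\min(k_i, k_j) + 1 - a_{ij}},
\qquad
k_i = \sum_{u \ne i} a_{iu},
```

where x_i is the expression profile of gene i, a_{ij} the network adjacency, and the topological overlap matrix (TOM) is the gene-gene similarity that is hierarchically clustered into modules such as the red, black, and brown ones reported here.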