A decline in imaging quality occurs when the sampling data of ghost imaging are recorded by binarization. Based on Otsu binarization, a method named Otsu binary ghost imaging (OBGI) is proposed to enhance imaging quality. Both theoretical and experimental results show that, with an appropriate threshold value, OBGI enhances imaging quality significantly compared with ordinary binary ghost imaging, and can even provide better imaging quality than traditional ghost imaging. It is also shown that the method is more applicable under low-light conditions and with complicated objects.
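Otsu's method, which OBGI builds on, picks the histogram threshold that maximizes the between-class variance. A minimal NumPy sketch under our own variable names (the synthetic bimodal signal stands in for recorded sampling data; it is not the paper's setup):

```python
import numpy as np

def otsu_threshold(values, bins=256):
    """Return the threshold maximizing the between-class variance (Otsu)."""
    hist, edges = np.histogram(values, bins=bins)
    p = hist.astype(float) / hist.sum()       # probability mass per bin
    centers = (edges[:-1] + edges[1:]) / 2
    w0 = np.cumsum(p)                         # weight of the "below" class
    w1 = 1.0 - w0                             # weight of the "above" class
    mu0 = np.cumsum(p * centers)              # unnormalized cumulative mean
    mu_t = mu0[-1]                            # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * w0 - mu0) ** 2 / (w0 * w1)
    sigma_b[np.isnan(sigma_b)] = 0.0
    return centers[np.argmax(sigma_b)]

# Bimodal signal: dark mode around 50, bright mode around 200
rng = np.random.default_rng(0)
signal = np.concatenate([rng.normal(50, 5, 1000), rng.normal(200, 5, 1000)])
t = otsu_threshold(signal)
binary = (signal > t).astype(np.uint8)   # one-bit record of the signal
```

The threshold lands in the gap between the two modes, so the one-bit record still separates the classes cleanly.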
Fisheries management is generally based on age-structured models. Thus, fish ageing data are collected by experts who analyze and interpret calcified structures (scales, vertebrae, fin rays, otoliths, etc.) through a visual process. The otolith, located in the inner ear of the fish, is the most commonly used calcified structure because it is metabolically inert and historically one of the first proxies developed. It contains information throughout the whole life of the fish and provides age structure data for stock assessments of all commercial species. The traditional human reading method for age determination is very time-consuming. Automated image analysis can be a low-cost alternative; however, the first step is the transformation of routinely taken otolith images into standardized images within a database so that machine learning techniques can be applied to the ageing data. Otolith shape, resulting from the synthesis of genetic heritage and environmental effects, is a useful tool for identifying stock units, so a database of standardized images could also serve this aim. Using the routinely measured otolith data of plaice (Pleuronectes platessa Linnaeus, 1758) and striped red mullet (Mullus surmuletus Linnaeus, 1758) in the eastern English Channel and of north-east Arctic cod (Gadus morhua Linnaeus, 1758), a matrix of greyscale images was generated from the raw images in different formats. Contour detection was then applied to identify broken otoliths, the orientation of each otolith, and the number of otoliths per image. To finalize this standardization process, all images were resized and binarized. Several mathematical morphology tools were developed from these new images to align and orient the images, placing the otoliths in the same layout for each image. For this study, we used three databases from two different laboratories covering three species (cod, plaice and striped red mullet).
This method was validated on these three species and could be applied to other species for age determination and stock identification.
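The pipeline described above (greyscale conversion, binarization, counting the otoliths in each image) can be sketched roughly as follows. The luma weights, the mean-intensity threshold, and the flood-fill object counter are illustrative stand-ins, not the authors' protocol, which relies on contour detection:

```python
import numpy as np

def to_greyscale(rgb):
    """Collapse an RGB image (H, W, 3) to greyscale with standard luma weights."""
    return rgb @ np.array([0.299, 0.587, 0.114])

def binarize(grey, threshold=None):
    """Binarize a greyscale image; the mean-intensity threshold is a
    placeholder for whatever rule the real protocol uses."""
    if threshold is None:
        threshold = grey.mean()
    return (grey > threshold).astype(np.uint8)

def count_objects(binary):
    """Count 4-connected foreground components by iterative flood fill,
    a crude stand-in for contour detection to count otoliths per image."""
    visited = np.zeros(binary.shape, dtype=bool)
    h, w = binary.shape
    count = 0
    for i in range(h):
        for j in range(w):
            if binary[i, j] and not visited[i, j]:
                count += 1
                stack = [(i, j)]
                while stack:
                    y, x = stack.pop()
                    if 0 <= y < h and 0 <= x < w and binary[y, x] and not visited[y, x]:
                        visited[y, x] = True
                        stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
    return count

# Synthetic image: two bright "otoliths" on a dark background
img = np.zeros((40, 60, 3))
img[5:15, 5:20] = [180, 200, 190]
img[20:35, 30:55] = [210, 220, 215]
n = count_objects(binarize(to_greyscale(img)))
```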
Fisheries management is generally based on age-structured models. Fish ageing data are therefore collected by experts who analyze and interpret calcified structures (scales, vertebrae, fin rays, otoliths, etc.). The otolith, located in the inner ear of the fish, is the main calcified structure used, because it is the only one that is metabolically inert and is historically one of the first data proxies developed. The otolith also contains information covering the whole life history of the fish and provides age data for all stock assessments of commercial species. This traditional method of age estimation through an interpretation process carried out by an expert scientist is therefore very time-consuming. Image analysis can be a low-cost alternative method. However, the first step is to transform the routinely taken otolith images into standardized images within a database in order to apply machine learning techniques to the images. Otolith shape, resulting from the synthesis of genetic heritage and environmental effects, is a useful tool for identifying populations, so a database of standardized images could also be used for this purpose. Using otolith data for plaice (Pleuronectes platessa Linnaeus, 1758) and striped red mullet (Mullus surmuletus Linnaeus, 1758) in the eastern English Channel, together with north-east Arctic cod (Gadus morhua Linnaeus, 1758), a data standardization protocol was proposed. All the methodological steps were developed in an R environment. A matrix of greyscale images was generated from the raw images in different formats. Contour detection was applied to identify broken otoliths, the orientation of each otolith, and the number of otoliths per image.
To finalize this standardization process, all images were resized and binarized. Several mathematical morphology tools were applied to align and orient the images, placing the otoliths in the same layout for each image. For this study, we used three databases from two different laboratories covering three species (cod, plaice and striped red mullet). This method was validated on these three species and could be applied to many other species for age determination and stock identification.
•Gradient optimization alleviates gradient mismatch brought by forward binarization.•Quantization causes manifold distortion for feature maps of point cloud samples.•Manifold-preserving learnable scaling benefits representation fidelity more.•Pooling correction based on manifold preserving alleviates severe feature homogeneity.
With the significant progress of deep learning on 3D point clouds, the demand for deploying point cloud neural networks on edge devices is growing. Binary neural networks, a type of quantization-based compression with extremely low bit-width and fast inference speed, are attracting increasing attention; they are more challenging to train, but have greater potential. Most research on binary networks focuses on images rather than point clouds. Considering the particularity of point cloud neural networks, this paper presents a novel binarization framework with two main contributions. First, a gradient optimization method is proposed to overcome the shortcomings of the Straight-Through Estimator (STE) commonly used in the back-propagation of binary network training. Second, based on an analysis of the manifold distortion caused by binary convolution and pooling operations, we propose an optimized scaling recovery method to restore the manifold of convolved features, as well as a pooling correction method to improve the fidelity of pooled features. Manifold distortion leads to severe feature homogeneity, which hinders the generation of features with sufficient discrimination for classification and segmentation. The manifold-preserving optimizations are designed to introduce minimal extra parameters, balancing accuracy against computation and storage consumption. Experiments show that the proposed method outperforms the state of the art in accuracy with negligible overhead, and also scales well.
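The STE shortcoming addressed here comes from sign() having zero gradient almost everywhere; the standard STE simply copies the upstream gradient through, clipped to |w| ≤ 1. A toy sketch of that baseline (not the paper's improved estimator):

```python
import numpy as np

def binarize_forward(w):
    """Forward pass: sign binarization to +1 / -1."""
    return np.where(w >= 0, 1.0, -1.0)

def ste_backward(w, grad_out, clip=1.0):
    """Straight-Through Estimator: copy the upstream gradient through the
    non-differentiable sign(), but zero it outside the clip range, where
    sign() saturates."""
    return grad_out * (np.abs(w) <= clip)

w = np.array([-2.0, -0.4, 0.3, 1.5])
upstream = np.ones_like(w)
b = binarize_forward(w)           # [-1., -1., 1., 1.]
gw = ste_backward(w, upstream)    # [ 0.,  1., 1., 0.]
```

The gradient mismatch the paper targets is visible here: weights outside the clip range receive no learning signal at all, even though they still affect the forward pass.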
Full text
Available for:
GEOZS, IJS, IMTLJ, KILJ, KISLJ, NLZOH, NUK, OILJ, PNG, SAZU, SBCE, SBJE, UILJ, UL, UM, UPCLJ, UPUK, ZAGLJ, ZRSKP
•Network binarization inevitably leads to feature binarization residual.•We construct a baseline-auxiliary topology to boost the binary model's capability.•We devise a hybrid performance estimation indicator to guide the search phase.
While network binarization is a promising method for memory saving and hardware speedup, it inevitably introduces binarization residual in intermediate features, degrading model capability. To alleviate this issue, we focus on network topology design to find structures better suited to the extreme-low-bit scenario. In this paper, we propose a baseline-auxiliary expanding network design method, denoted AuxBranch, which compensates for the binarization residual of features by searching for auxiliary branches. The intermediate feature maps are enhanced by combining baseline and auxiliary features, mimicking the corresponding feature output of the full-precision network. In addition, we devise a hybrid performance estimator (PE) with three elements: preliminary accuracy, feature similarity, and computational complexity. The PE jointly performs an efficient architecture search for the binarization baseline and enables automatic adjustment of computational complexity under diverse constraints. Extensive experiments show that our approach is superior in terms of accuracy and computational performance, and is plug-and-play for different network backbones and binarization policies. Our code is available at https://github.com/VipaiLab/AuxBranch.
Purpose
To evaluate the effects of seasonal allergic conjunctivitis (AC) and its treatment with topical antihistamine agents on choroidal structural parameters such as choroidal thickness (ChT) and choroidal vascularity index (CVI).
Methods
Sixty eyes of 30 patients were included in the AC group, and another 30 patients were included in the control group. The choroid was imaged using an enhanced depth imaging optical coherence tomography (EDI-OCT) instrument without pupillary dilation. ChT was defined as the distance between the outer hyperreflective border of the RPE and the sclerochoroidal border, measured at the fovea and at 750 µm temporal and 750 µm nasal to the fovea. ImageJ was used to measure CVI. From the images obtained with EDI-OCT, the total choroidal area (TCA), luminal area (LA), stromal area (SA), and CVI were calculated using the binarization method.
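The binarization-based CVI computation can be illustrated with a minimal sketch; the stroma/lumen labeling convention, the unit pixel area, and the percentage scaling are assumptions for illustration, not the study's exact ImageJ workflow:

```python
import numpy as np

def choroid_indices(binary_roi, pixel_area_um2=1.0):
    """Compute TCA, LA, SA and CVI from a binarized choroidal ROI.
    Convention assumed here: 1 = stromal (bright) pixel, 0 = luminal (dark)."""
    tca = binary_roi.size * pixel_area_um2   # total choroidal area
    sa = binary_roi.sum() * pixel_area_um2   # stromal area (bright pixels)
    la = tca - sa                            # luminal area (dark pixels)
    cvi = 100.0 * la / tca                   # CVI as a percentage of TCA
    return tca, la, sa, cvi

# Toy ROI: 30 of 100 pixels stromal, so CVI = 70%
roi = np.zeros((10, 10), dtype=np.uint8)
roi[:3, :] = 1
tca, la, sa, cvi = choroid_indices(roi)
```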
Results
The mean ChT in the AC group was 358.5 ± 93.8 µm at baseline and 356.8 ± 86.6 µm after 1 month of treatment, while the mean ChT in the control group was 316.6 ± 60.7 µm. The mean CVI was 66.65 ± 2.98 in the control group, 70.75 ± 3.26 in the AC group at baseline, and 69.50 ± 3.17 after 1 month of treatment. A statistically significant difference was found between the control and AC groups (p = 0.028) and between the control group and posttreatment values (p = 0.031). There was no statistically significant difference between baseline and posttreatment values for any of the measurements.
Conclusion
ChT and CVI can increase in patients with AC due to inflammation and increased vascular permeability. Although the symptoms and signs of AC may disappear after treatment, the effects on the choroid do not normalize immediately.
High-density coherent fiber bundles, mainly used for high-resolution fiberscopes, consist of non-circular cores and exhibit various optical noises, such as auto-fluorescence, crosstalk, and honeycomb artifacts, that may require extensive image processing. Binarization was applied in post-image processing to eliminate the calibration step used as pre-image processing in traditional fiberscope image reconstruction. This significantly reduced the complexity of implementing core peak detection and effectively reduced overall processing time. For binarization, the global thresholding proposed by Otsu and the local thresholdings proposed by Bernsen, Niblack, and White and Rohrer were used. By applying a dynamic method, the process can be further simplified and applied to image sequences. Evaluating the results comprehensively, the local thresholding proposed by Bernsen was best suited for interpolating our fiberscope images. By introducing binarization combined with local thresholding in post-image processing, we successfully eliminated the preliminary calibration process and reduced the burden of overall image processing for image reconstruction in a high-resolution fiberscope.
•Binarization is applied to eliminate a calibration used as pre-image processing.•Binarization can reduce the complexity of implementing core peak detection.•Binarization reduces overall image processing time.•Binarization with local thresholding proposed by Bernsen is best for our fiberscope images.•Dynamic method can be applied to image sequencing with further simplification of the process.
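Bernsen's rule, which the study found best suited to its fiberscope images, thresholds each pixel against the mid-range of its neighborhood and falls back to a global threshold in low-contrast regions. A rough sketch with illustrative window size and contrast limit:

```python
import numpy as np

def bernsen_threshold(img, win=3, contrast_min=15, global_t=128):
    """Bernsen local thresholding: compare each pixel with the mid-range of
    its neighborhood; neighborhoods with too little contrast fall back to a
    global threshold. Window size and limits here are illustrative."""
    h, w = img.shape
    r = win // 2
    padded = np.pad(img, r, mode="edge")
    out = np.zeros((h, w), dtype=np.uint8)
    for i in range(h):
        for j in range(w):
            patch = padded[i:i + win, j:j + win]
            lo, hi = int(patch.min()), int(patch.max())
            mid = (lo + hi) // 2
            if hi - lo < contrast_min:
                out[i, j] = 1 if mid >= global_t else 0   # flat region
            else:
                out[i, j] = 1 if img[i, j] >= mid else 0  # local decision
    return out

# One bright fiber core on a dim background: only the core survives
img = np.full((5, 5), 10, dtype=np.uint8)
img[2, 2] = 200
out = bernsen_threshold(img)
```

Because the decision is local, a bright core peak is kept even when the background brightness varies across the bundle, which is what makes core peak detection simpler after binarization.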
Classic adaptive binarization methodologies threshold pixel intensity with respect to adjacent pixels by exploiting integral images, which are generally computed optimally using the summed-area table (SAT) algorithm. This document presents a new adaptive binarization technique based on fuzzy integral images, supported by an efficient design of a modified SAT for generalized Sugeno fuzzy integrals. We call this methodology FLAT (Fuzzy Local Adaptive Thresholding). Experimental results show that the proposed methodology produces better-quality thresholding than well-known global and local thresholding algorithms. We propose new generalizations of different fuzzy integrals that improve existing results, reaching an accuracy of ≈0.94 on a wide dataset. Moreover, owing to their high performance, these new generalized Sugeno fuzzy integrals, created ad hoc for adaptive binarization, can be used as tools for grayscale processing and more complex real-time thresholding applications.
•Generalized Sugeno fuzzy thresholding algorithms are used for image binarization.•Fuzzy integrals methods outperform traditional adaptive thresholding.•Fuzzy integral adaptive algorithms are used for fast image binarization.
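The non-fuzzy starting point of this line of work is local-mean thresholding backed by a summed-area table; FLAT replaces the plain SAT with a modified one for generalized Sugeno fuzzy integrals. A sketch of that plain baseline only:

```python
import numpy as np

def summed_area_table(img):
    """Classic SAT: sat[i, j] holds the sum of img[:i, :j]."""
    return np.pad(img, ((1, 0), (1, 0))).cumsum(axis=0).cumsum(axis=1)

def local_mean_threshold(img, win=3, bias=0.0):
    """Adaptive binarization against the local mean, fetched in O(1) per
    pixel from the SAT. This is the plain (non-fuzzy) baseline that FLAT
    generalizes with Sugeno fuzzy integrals."""
    h, w = img.shape
    r = win // 2
    sat = summed_area_table(img.astype(np.int64))
    out = np.zeros((h, w), dtype=np.uint8)
    for i in range(h):
        for j in range(w):
            y0, y1 = max(0, i - r), min(h, i + r + 1)
            x0, x1 = max(0, j - r), min(w, j + r + 1)
            s = sat[y1, x1] - sat[y0, x1] - sat[y1, x0] + sat[y0, x0]
            mean = s / ((y1 - y0) * (x1 - x0))
            out[i, j] = 1 if img[i, j] > mean + bias else 0
    return out

# Only the bright center pixel exceeds its local mean
img = np.array([[0, 0, 0], [0, 9, 0], [0, 0, 0]])
out = local_mean_threshold(img)
```

The four-corner SAT lookup is what keeps each local mean constant-time regardless of window size, which is the property the modified fuzzy SAT must preserve.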
We propose a new local-binary ghost imaging scheme based on a point-by-point method. This method compensates for the degradation of imaging quality caused by the loss of information during the binarization process. Numerical and experimental results show that target details can be reconstructed well by this method compared with traditional ghost imaging. By comparing the differences between the speckle patterns produced by different binarization methods, we also give a corresponding explanation. Our results may find applications in areas with high requirements for imaging detail, such as target recognition.
•A new local-binary ghost imaging (GI) method based on a point-by-point approach is presented.•This method chooses the binarization threshold according to each pixel point.•The details of the imaging object can be reconstructed more accurately than with traditional GI.•A qualitative explanation of this phenomenon is given.
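The correlation step of ghost imaging, with a per-pixel (point-by-point) binarization of the speckle patterns, can be sketched as below. The per-pixel temporal-mean threshold is our reading of the scheme, not the authors' exact rule:

```python
import numpy as np

rng = np.random.default_rng(1)

# Object transmission function: a small cross on a 16x16 grid
obj = np.zeros((16, 16))
obj[7:9, :] = 1.0
obj[:, 7:9] = 1.0

# Random speckle patterns, then point-by-point binarization: every pixel is
# thresholded at its own temporal mean (an illustrative per-pixel rule)
n_patterns = 10000
patterns = rng.random((n_patterns, 16, 16))
local_t = patterns.mean(axis=0)
bin_patterns = (patterns > local_t).astype(float)

# Bucket detector: total light transmitted through the object per pattern
bucket = (bin_patterns * obj).sum(axis=(1, 2))

# Correlation reconstruction: G = <I*S> - <I><S>
recon = (bucket[:, None, None] * bin_patterns).mean(axis=0) \
        - bucket.mean() * bin_patterns.mean(axis=0)
# Object pixels correlate positively with the bucket; background stays near 0
```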
•A novel binarization method for degraded document images is presented.•We apply a cGAN strategy to document image binarization and propose a suitable framework for this task.•We solve the core problem of multi-scale information combination using cascaded sub-generators.•Extensive experiments on various datasets show that our method is robust and effective.
Binarization is often the first step in many document analysis tasks and plays a key role in the subsequent steps. In this paper, we formulate binarization as an image-to-image generation task and introduce conditional generative adversarial networks (cGANs) to solve the core problem of multi-scale information combination in binarization. Our generator consists of two stages: in the first stage, sub-generator G1 learns to extract text pixels from an input image; different scales of the input image are processed by G1 and corresponding binary images are generated. In the second stage, sub-generator G2 learns a combination of the results at different scales from the first stage and produces the final binary result. We conduct comprehensive experiments on nine public document image binarization datasets. Experimental results show that, compared with many classical and state-of-the-art approaches, our method achieves promising accuracy and robustness of binarization.
In this paper, we propose an efficient method for reliably detecting road lanes based on spatiotemporal images. In an aligned spatiotemporal image, generated by accumulating the pixels on a scanline along the time axis and aligning consecutive scanlines, the trajectory of the lane points appears smooth and forms a straight line. The aligned spatiotemporal image is binarized, and the two dominant parallel straight lines resulting from the temporal consistency of lane width on a given scanline are detected using a Hough transform, reducing alignment errors. The left and right lane points are then detected near the intersections of these straight lines with the current scanline. Our spatiotemporal-domain approach is more robust to missing or occluded lanes than existing frame-based approaches. Furthermore, experimental results show not only computation times reduced to as little as one-third, but also a slightly improved detection rate.
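The Hough voting step used to find the dominant lines can be sketched as follows; this toy version finds only the single strongest line in a binary image:

```python
import numpy as np

def strongest_line(binary, n_theta=180):
    """Minimal Hough transform: vote in (theta, rho) space over all
    foreground pixels and return the strongest line as (degrees, rho)."""
    ys, xs = np.nonzero(binary)
    thetas = np.deg2rad(np.arange(n_theta))
    diag = int(np.ceil(np.hypot(*binary.shape)))
    acc = np.zeros((n_theta, 2 * diag + 1), dtype=int)
    for k, th in enumerate(thetas):
        # rho = x*cos(theta) + y*sin(theta), shifted so indices are nonnegative
        rhos = np.round(xs * np.cos(th) + ys * np.sin(th)).astype(int) + diag
        np.add.at(acc, (k, rhos), 1)   # one vote per pixel per angle
    k, r = np.unravel_index(acc.argmax(), acc.shape)
    return float(np.rad2deg(thetas[k])), int(r) - diag

# A vertical "lane trajectory" x = 5 in a binarized spatiotemporal image
img = np.zeros((20, 20), dtype=np.uint8)
img[:, 5] = 1
theta, rho = strongest_line(img)
```

In the paper's setting, two near-parallel peaks in the accumulator (one per lane boundary) would be taken instead of a single maximum.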