Change detection of natural lake boundaries is an important task in remote sensing image interpretation. In an ordinary fully connected or convolutional neural network (CNN), signals propagate only forward from one layer to the next, and samples are processed independently at each moment. For time-series data, however, the learned change information needs to be recorded and reused. To solve this problem, we propose a lake boundary change prediction model combining U-Net and LSTM. The LSTM component improves the overall accuracy and robustness of the model by capturing the spatial and temporal nuances in the data, resulting in more precise predictions. This study selected Lake Urmia (Lat: 37°00′ N to 38°15′ N, Lon: 44°50′ E to 46°10′ E) as the research area and used annual panoramic remote sensing images from 1996 to 2014, obtained with Google Earth Pro 7.3, as the research data set. The model uses the U-Net network to extract multi-level change features and analyze the change trend of lake boundaries. The LSTM module, introduced after U-Net, refines the predictive model by storing and forgetting historical information alongside the current input. This design enables the model to automatically fit the trend of the time series and mine deep information about lake boundary changes. Experimental verification shows that the trained model's prediction accuracy for lake boundary changes reaches 89.43%. Comparative experiments with the existing U-Net-STN model show that the U-Net-LSTM model used in this study has higher prediction accuracy and lower mean square error.
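The "storage and forgetting" of historical information that the abstract attributes to the LSTM module can be illustrated with a minimal numpy sketch of a single LSTM step (this is not the paper's implementation; weight shapes, the feature dimension, and the random inputs standing in for yearly U-Net features are all illustrative assumptions):

```python
import numpy as np

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM step: gates decide what to forget, store, and output."""
    n = h_prev.shape[0]
    z = W @ x + U @ h_prev + b          # stacked pre-activations, shape (4n,)
    f = 1 / (1 + np.exp(-z[:n]))        # forget gate: discards old history
    i = 1 / (1 + np.exp(-z[n:2*n]))     # input gate: admits new evidence
    o = 1 / (1 + np.exp(-z[2*n:3*n]))   # output gate
    g = np.tanh(z[3*n:])                # candidate cell update
    c = f * c_prev + i * g              # cell memory: forget + store
    h = o * np.tanh(c)                  # hidden state passed to the next step
    return h, c

rng = np.random.default_rng(0)
d, n = 8, 4                             # hypothetical feature and state dims
W = rng.normal(scale=0.1, size=(4 * n, d))
U = rng.normal(scale=0.1, size=(4 * n, n))
b = np.zeros(4 * n)
h, c = np.zeros(n), np.zeros(n)
for t in range(19):                     # one feature vector per yearly image, 1996-2014
    x = rng.normal(size=d)              # placeholder for a U-Net change feature
    h, c = lstm_step(x, h, c, W, U, b)
print(h.shape)                          # (4,)
```

The forget gate `f` and input gate `i` are exactly the "historical data storage and forgetting" mechanism the abstract describes: `c` carries boundary-change information across years, weighted against each new observation.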
The identification of small land targets in remote sensing imagery has emerged as a significant research objective. Despite significant advancements in deep-learning-based object detection for visible remote sensing images, performance on small, densely distributed targets remains suboptimal. To address this issue, this study introduces an improved model named YOLOV4_CPSBi, based on the YOLOV4 architecture and specifically designed to enhance the detection of small land targets in remote sensing imagery. The proposed model enhances the traditional CSPNet by redefining its channel partitioning and integrates this enhanced structure into the neck of the YOLO network. Additionally, the conventional pyramid fusion structure used in the traditional BiFPN is removed. By integrating a weight-based bidirectional multi-scale mechanism for feature fusion, the model can effectively reason about objects of various sizes, with a particular focus on small land targets, without a significant increase in computational cost. Using the DOTA dataset as research data, this study quantifies the object detection performance of the proposed model. In comparisons against various baseline models, its AP for small-target detection improves by nearly 8% over YOLOV4. With these modifications combined, the proposed model demonstrates promising results in identifying small land targets in visible remote sensing images.
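The weight-based multi-scale fusion mentioned above can be sketched as BiFPN-style fast normalized fusion, where each resized feature map contributes according to a learnable non-negative weight (a minimal numpy sketch under assumed shapes and weights, not the paper's network code):

```python
import numpy as np

def weighted_fusion(features, w, eps=1e-4):
    """BiFPN-style fast normalized fusion: non-negative weights are
    normalized so the inputs' contributions sum to one."""
    w = np.maximum(w, 0)                  # ReLU keeps weights non-negative
    w = w / (w.sum() + eps)               # fast normalization (cheaper than softmax)
    return sum(wi * f for wi, f in zip(w, features))

# three feature maps already resized to a common shape (e.g. P3, P4, P5)
f_small = np.ones((16, 16)) * 1.0
f_mid   = np.ones((16, 16)) * 2.0
f_large = np.ones((16, 16)) * 3.0
fused = weighted_fusion([f_small, f_mid, f_large], np.array([1.0, 1.0, 2.0]))
print(fused[0, 0])   # ≈ (1*1 + 1*2 + 2*3) / 4 ≈ 2.25
```

Because the weights are scalars learned per fusion node, this mechanism adds negligible computation, which is consistent with the abstract's claim of improved small-target reasoning without a significant cost increase.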
The atmospheric density of the thermosphere is a fundamental parameter for spacecraft launch and orbit control. Under magnetic storm conditions, thermospheric density experiences significant fluctuations, which have a negative impact on spacecraft control. Exploring thermospheric density during geomagnetic storms can help to mitigate the effects of such events. Research on the inversion of accelerometer measurements from different satellites and on the variations of atmospheric density under extreme conditions is still in its infancy. In this paper, the distribution of atmospheric density during three geomagnetic storms is investigated from the inversion results of the Swarm-C accelerometer. Three major geomagnetic storms and their recovery phases are selected as case studies. The thermospheric density obtained by Swarm-C is separated into day and night regions, and the empirical orthogonal function (EOF) analysis method is used to study its spatiotemporal distribution during geomagnetic storms. The results indicate that storms have a more significant impact on nighttime thermospheric density, and that their impact on the temporal distribution of thermospheric density is considerable: the first-order EOF time coefficient on the day after a storm is the largest, reaching 2–3 times its pre-storm value. The impact of magnetic storms on atmospheric density is thus mainly reflected in the temporal distribution, while the spatial distribution is less affected and relatively stable in the short term. The impact of magnetic storms on the spatial distribution of nighttime thermospheric density is more significant than in daytime regions, and the daytime response to magnetic storms is slower.
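The EOF analysis used above decomposes the space-time density field into spatial modes and their time coefficients; a standard way to compute it is via SVD of the anomaly matrix (a generic numpy sketch on synthetic data, not the paper's dataset):

```python
import numpy as np

def eof_analysis(X):
    """EOF decomposition of a space-time field X of shape (n_times, n_locations).
    Returns spatial modes (EOFs), their time coefficients (PCs), and the
    explained-variance fraction of each mode."""
    anomaly = X - X.mean(axis=0)                 # remove the time mean per location
    U, s, Vt = np.linalg.svd(anomaly, full_matrices=False)
    pcs = U * s                                  # time coefficients, one column per mode
    eofs = Vt                                    # spatial patterns, one row per mode
    var_frac = s**2 / np.sum(s**2)
    return eofs, pcs, var_frac

# synthetic field: one dominant oscillating spatial pattern plus noise
rng = np.random.default_rng(1)
t = np.linspace(0, 4 * np.pi, 200)
pattern = rng.normal(size=50)
X = np.outer(np.sin(t), pattern) + 0.05 * rng.normal(size=(200, 50))
eofs, pcs, var_frac = eof_analysis(X)
print(round(var_frac[0], 2))   # first mode dominates the variance
```

The "first-order EOF time coefficient" reported in the abstract corresponds to `pcs[:, 0]` here: its magnitude on a given day measures how strongly the leading spatial pattern is expressed at that time.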
The modeling and simulation of biological tissue is the core part of a virtual surgery system. In this study, the geometric and physical methods related to soft tissue modeling were investigated. Regarding geometric modeling, the problem of repeated inverse calculations of control points in the Bézier method was solved via re-parameterization, which improved the calculation speed. A base surface superposition method based on prior information was proposed so that the deformation model retains the advantages of the Bézier method while also being able to fit local irregular deformation surfaces. Regarding physical modeling, the ability of the particle-spring model to fit the anisotropy of soft tissue was improved by optimizing its topological structure. A dynamic elastic coefficient parameter then gave the particle-spring model a more extensive nonlinear fitting ability. Finally, secondary modeling of the elastic coefficient based on a virtual body spring enabled the model to fit the creep and relaxation characteristics of biological tissue according to the elongation of the virtual body spring.
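The dynamic elastic coefficient idea can be sketched as a spring whose stiffness grows with strain, which is one simple way to give a particle-spring model a nonlinear, soft-tissue-like response (the functional form `k(eps) = k0 * (1 + alpha * eps)` and all parameter values here are illustrative assumptions, not the paper's formulation):

```python
import numpy as np

def spring_force(p1, p2, rest_len, k0, alpha):
    """Hooke force between two particles with a strain-dependent
    elastic coefficient k(eps) = k0 * (1 + alpha * eps)."""
    d = p2 - p1
    length = np.linalg.norm(d)
    eps = (length - rest_len) / rest_len          # strain (elongation ratio)
    k = k0 * (1.0 + alpha * eps)                  # dynamic elastic coefficient
    return k * (length - rest_len) * d / length   # force on p1, pulling toward p2 when stretched

p1 = np.array([0.0, 0.0, 0.0])
p2 = np.array([1.2, 0.0, 0.0])                    # stretched 20% past rest length 1.0
f = spring_force(p1, p2, rest_len=1.0, k0=10.0, alpha=2.0)
print(f)   # larger than the linear value 2.0, since stiffness grows with strain
```

With `alpha = 0` this reduces to an ordinary linear spring; a positive `alpha` stiffens the tissue under stretch, reproducing the nonlinear stress-strain behavior the abstract targets.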
In recent years, haze pollution has occurred frequently, seriously affecting daily life and production. The main indicators of haze pollution severity are the concentrations of PM2.5 and PM10, so predicting PM2.5/PM10 concentration is of great significance. Since PM2.5 and PM10 concentration data are time series, their temporal characteristics should be considered in prediction. However, traditional neural networks are limited by their structure and handle time-dependent data poorly. A recurrent neural network is a network designed specifically for sequence modeling, in which the current output of the sequence is correlated with past outputs. In this paper, a haze prediction model is established based on a deep recurrent neural network. We obtained air pollution data for Chengdu from the China Air Quality Online Monitoring and Analysis Platform and conducted experiments on these data. The results show that the new method predicts haze more effectively and accurately and can serve practical social and economic needs.
The Swarm-C satellite, a new instrument for atmospheric study, has been the focus of many studies evaluating its usage and accuracy. This paper takes the Swarm-C satellite as a research object to verify the inversion results of its accelerometer. Density inversion from two-line element (TLE) orbital data is used to verify the accuracy of the atmospheric density derived from the Swarm-C accelerometer. After the accuracy of the satellite data is verified, comparative verification and empirical atmospheric model evaluation experiments are conducted based on the Swarm-C accelerometer's inversion results. Comparison with the inversion results of the Swarm-C semi-major-axis decay method shows that the atmospheric density obtained from the Swarm-C accelerometer is more dynamic and real-time. This suggests that, as more data become available, the Swarm-C satellite could serve as a new high-quality instrument for related studies alongside well-established satellites. Evaluating the JB2008 and NRLMSISE-00 empirical atmospheric models against the Swarm-C accelerometer inversion results shows that, at the altitude of the Swarm-C satellite, the accuracy and real-time performance of the JB2008 model are better than those of the NRLMSISE-00 model.
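At its core, accelerometer-based density inversion rests on inverting the standard drag equation, a = ½ρv²·C_d·A/m, for ρ (a minimal sketch; the numeric inputs below are illustrative placeholders, not actual Swarm-C parameters, and the real processing also removes non-drag accelerations and models winds):

```python
def density_from_drag(a_drag, v, cd, area, mass):
    """Invert the drag equation a = 0.5 * rho * v^2 * Cd * A / m
    for the atmospheric density rho."""
    return 2.0 * mass * a_drag / (cd * area * v**2)

rho = density_from_drag(
    a_drag=2.0e-7,   # measured along-track deceleration, m/s^2 (illustrative)
    v=7.6e3,         # orbital velocity relative to the atmosphere, m/s
    cd=2.2,          # assumed drag coefficient
    area=1.0,        # assumed cross-sectional area, m^2
    mass=370.0,      # assumed spacecraft mass, kg
)
print(f"{rho:.2e} kg/m^3")
```

Because the accelerometer samples `a_drag` continuously along the orbit, each measurement yields a local density estimate, which is why the abstract finds this method more dynamic and real-time than semi-major-axis decay, which only recovers a density averaged over many orbits.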
Most 3D reconstruction requirements for microscopic scenes arise in industrial inspection, which demands real-time object reconstruction and rapid recovery of surface information. This demand is challenging to meet in micro scenarios: the microscope's depth of field is shallow, and the image easily blurs when the object's surface lies outside the focal plane. Under a video microscope, images captured frame by frame are mostly defocused. In conventional 3D reconstruction, a single 2D image or a few of them are used for geometric-optical calculation, and an affine transformation recovers the 3D information of the object. A defocused image, however, needs a whole single-view defocus image sequence to restore its complete information, and cannot support the affine transformation on its own due to the missing information. Restoring 3D information from a defocus image sequence is therefore harder than in ordinary scenes, and real-time performance is more difficult to guarantee. In this paper, the surface reconstruction process based on point-cloud data is studied. A Delaunay triangulation method based on plane projection, together with a synthesis algorithm, is used to complete surface fitting. Finally, a 3D reconstruction experiment on the collected image sequence is completed. The experimental results show that the reconstructed surface conforms to the surface contour information of the selected object.
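The plane-projection Delaunay step can be sketched as follows: project the 3D points onto their best-fit plane, triangulate in 2D there, and reuse the triangle indices for the 3D surface (a generic sketch using `scipy.spatial.Delaunay` on synthetic points, not the paper's algorithm):

```python
import numpy as np
from scipy.spatial import Delaunay

def triangulate_by_projection(points):
    """Project 3D points onto their best-fit plane (via SVD), run 2D
    Delaunay there, and reuse the triangle indices for the 3D surface."""
    centered = points - points.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    uv = centered @ vt[:2].T          # coordinates in the plane's own basis
    tri = Delaunay(uv)
    return tri.simplices              # each row indexes one triangle in `points`

# a small synthetic height-field cloud, like per-pixel depths from defocus
rng = np.random.default_rng(2)
xy = rng.uniform(0, 1, size=(50, 2))
z = 0.1 * np.sin(4 * xy[:, 0])
points = np.column_stack([xy, z])
faces = triangulate_by_projection(points)
print(faces.shape[1])                 # 3 vertices per triangle
```

Projecting to a single best-fit plane works well for roughly height-field-like surfaces, which is the typical case when depth is recovered per pixel from a single microscope view.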
In recent years, with economic development, urbanization has been accelerating. In the past 100 years, the global average temperature has increased by 0.5 °C, and according to the predictions of most global climate models, it will increase by a further 1.5 °C to 3.0 °C in the next 100 years. Under the influence of global warming, the urban heat island effect is becoming more and more serious, bringing much harm, so studying the thermal effect and its influencing factors is of great significance for sustainable urban development. In this study, the seasonal and interannual land surface temperature (LST) changes of the study area were inverted based on the SW-TES algorithm, and the algorithm's LST inversion was briefly verified. A pixel dichotomy model was also established to explore the influencing factors of the urban heat island effect, and the influence of different land use types and the normalized difference vegetation index (NDVI) in the Hangzhou metropolitan area was analyzed. The results showed that the study area exhibited an overall warming trend, that regional construction land expanded relatively quickly, and that the scale of urban land was increasing. Based on the surface temperature changes of areas with different land types, we obtained the temperature trend of each land type from 2005 to 2018 and found a negative correlation between surface temperature and NDVI. This study provides a theoretical basis for evaluating the urban heat island effect and analyzing its impact factors, and would promote sustainable urban development.
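The pixel dichotomy model treats each pixel as a linear mixture of bare soil and full vegetation, so NDVI interpolates between the two endmembers; fractional vegetation cover then follows directly (a minimal sketch; the endmember values `ndvi_soil` and `ndvi_veg` are illustrative assumptions that are normally estimated from the image histogram):

```python
import numpy as np

def fractional_vegetation_cover(ndvi, ndvi_soil=0.05, ndvi_veg=0.80):
    """Pixel dichotomy model: FVC = (NDVI - NDVI_soil) / (NDVI_veg - NDVI_soil),
    clipped to the physically meaningful range [0, 1]."""
    fvc = (ndvi - ndvi_soil) / (ndvi_veg - ndvi_soil)
    return np.clip(fvc, 0.0, 1.0)

ndvi = np.array([0.05, 0.425, 0.80, 0.90])
print(fractional_vegetation_cover(ndvi))   # 0, 0.5, 1, 1
```

The resulting per-pixel vegetation cover is what gets correlated against LST; the negative surface temperature-NDVI correlation reported above is consistent with higher cover meaning more evaporative cooling.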
Air pollution with fluidity can influence a large area for a long time and can be harmful to the ecological environment and human health. Haze, one form of air pollution, has been a critical problem since the industrial revolution. Though the actual causes of haze can be various and complicated, in this paper we found that the distributions of many gases, as well as wind power and temperature, are related to PM2.5/PM10 concentrations. Thus, based on the correlation between PM2.5/PM10 and other gaseous pollutants, and on the temporal continuity of PM2.5/PM10, we propose a multilayer long short-term memory haze prediction model. This model uses the concentrations of O3, CO, NO2, SO2, and PM2.5/PM10 over the last 24 h as inputs to predict future PM2.5/PM10 concentrations. Besides pre-processing the data, the primary approach to boosting prediction performance is stacking layers on top of a single-layer long short-term memory model. We show that doing so lets the network make predictions more accurately and efficiently, and in comparisons we generally obtain more accurate predictions.
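The "last 24 h as inputs" setup above amounts to converting the hourly multivariate series into supervised sliding-window samples before feeding the stacked LSTM (a generic numpy sketch; the column ordering, with PM2.5 assumed first, and the synthetic data are illustrative):

```python
import numpy as np

def make_windows(series, lookback=24):
    """Turn an hourly multivariate series of shape (n_hours, n_features)
    into supervised pairs: the previous `lookback` hours as input, the
    next hour's first feature (assumed here to be PM2.5) as target."""
    X, y = [], []
    for t in range(lookback, len(series)):
        X.append(series[t - lookback:t])
        y.append(series[t, 0])
    return np.array(X), np.array(y)

# columns might be [PM2.5, O3, CO, NO2, SO2] (assumed ordering)
rng = np.random.default_rng(3)
series = rng.random((100, 5))
X, y = make_windows(series, lookback=24)
print(X.shape, y.shape)   # (76, 24, 5) (76,)
```

Each sample `X[i]` is a 24-hour, 5-pollutant window, which is exactly the `(timesteps, features)` shape a stacked LSTM consumes.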
At present, in the application of feature-based medical 3D reconstruction technology, there remain problems such as low matching accuracy of feature points in endoscope images and slow processing of image data. The feature-based 3D reconstruction theory is therefore of great research and application value. This paper proposes a new feature detection method to address these problems, dividing feature detection into two parts: feature extraction and feature description. For feature extraction, the FAST algorithm shows a poor classification effect, so this paper augments traditional FAST with a decision tree based on the C4.5 algorithm. The training data are split to build two decision trees, making feature extraction more stable and feature point extraction more efficient. For feature description, the FREAK descriptor is used, combined with this paper's improved feature extraction algorithm. Feature points are extracted in scale space, and a quadratic function is fitted to the feature points' response scores at different scales, yielding a scale-invariant descriptor of sub-pixel precision. Experimental results on endoscope images show that the feature extraction method achieves higher extraction accuracy and faster extraction speed, and the feature description algorithm has higher calculation efficiency.
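The quadratic fitting of response scores across scales can be sketched as fitting a parabola through a keypoint's responses at neighboring scales and taking its vertex as the refined, sub-scale position of the extremum (a generic numpy sketch with made-up response values, not the paper's code):

```python
import numpy as np

def refine_scale(scales, responses):
    """Fit a quadratic a*s^2 + b*s + c to response scores at neighboring
    scales and return its vertex -b / (2a): the refined extremum position."""
    a, b, c = np.polyfit(scales, responses, 2)
    return -b / (2.0 * a)

# the response peaks somewhere between the second and third sampled scales
scales = np.array([1.0, 2.0, 3.0])
responses = np.array([10.0, 14.0, 12.0])
print(refine_scale(scales, responses))   # ≈ 2.17, between scales 2 and 3
```

The same vertex formula applied along the image axes gives sub-pixel localization, which is where the descriptor's sub-pixel precision claimed in the abstract comes from.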