In recent years, haze pollution has occurred frequently, seriously affecting daily life and production. The main indicators of the severity of haze pollution are the concentrations of PM2.5 and PM10, so predicting PM2.5/PM10 concentrations is of great significance. Since PM2.5 and PM10 concentration data are time series, their temporal characteristics should be considered in prediction. However, traditional neural networks are limited by their structure and are weak at processing time-dependent data. A recurrent neural network is a network designed specifically for modeling sequence data, in which the current output of the sequence is correlated with the historical outputs. In this paper, a haze prediction model is established based on a deep recurrent neural network. We obtained air pollution data for Chengdu from the China Air Quality Online Monitoring and Analysis Platform and conducted experiments on these data. The results show that the new method predicts haze more effectively and accurately, and can be used for social and economic purposes.
Air pollution is mobile: it can influence a large area for a long time and is harmful to the ecological environment and human health. Haze, one form of air pollution, has been a critical problem since the industrial revolution. Although the actual causes of haze are varied and complicated, we find that the distributions of several gases, as well as wind power and temperature, are related to PM2.5/PM10 concentrations. Thus, based on the correlation between PM2.5/PM10 and other gaseous pollutants and on the temporal continuity of PM2.5/PM10, we propose a multilayer long short-term memory (LSTM) haze prediction model. The model uses the concentrations of O3, CO, NO2, SO2, and PM2.5/PM10 over the previous 24 h as inputs to predict future PM2.5/PM10 concentrations. Besides pre-processing the data, the primary approach to boosting prediction performance is stacking layers on top of a single-layer LSTM model. We show that doing so lets the network make predictions more accurately and efficiently, and, by comparison, we obtain more accurate predictions overall.
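The stacked-LSTM forecaster described above can be sketched as a plain forward pass. The layer sizes, random weight initialization, and the ordering of the six input features (O3, CO, NO2, SO2, PM2.5, PM10 over 24 h) are illustrative assumptions, not the authors' exact configuration; a real model would train these weights on the historical data.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_layer(x_seq, Wx, Wh, b):
    """Run one LSTM layer over a (T, d_in) sequence; return (T, H) hidden states."""
    T = x_seq.shape[0]
    H = Wh.shape[0]
    h = np.zeros(H)
    c = np.zeros(H)
    hs = np.empty((T, H))
    for t in range(T):
        z = x_seq[t] @ Wx + h @ Wh + b      # gate pre-activations, shape (4H,)
        i, f, g, o = np.split(z, 4)         # input, forget, candidate, output gates
        i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
        c = f * c + i * np.tanh(g)          # update cell state
        h = o * np.tanh(c)                  # update hidden state
        hs[t] = h
    return hs

def predict_pm(x_seq, params):
    """Multilayer LSTM: feed each layer's hidden-state sequence to the next
    layer, then map the final hidden state to a scalar PM2.5 forecast."""
    out = x_seq
    for Wx, Wh, b in params["layers"]:
        out = lstm_layer(out, Wx, Wh, b)
    return float(out[-1] @ params["w_out"] + params["b_out"])

def init_params(d_in=6, hidden=(16, 16), seed=0):
    """Random illustrative weights (a real model fits these by backpropagation)."""
    rng = np.random.default_rng(seed)
    layers, prev = [], d_in
    for H in hidden:
        layers.append((rng.normal(0, 0.1, (prev, 4 * H)),
                       rng.normal(0, 0.1, (H, 4 * H)),
                       np.zeros(4 * H)))
        prev = H
    return {"layers": layers,
            "w_out": rng.normal(0, 0.1, prev),
            "b_out": 0.0}
```

Given a 24-hour window of the six pollutant/weather features as a `(24, 6)` array, `predict_pm(window, init_params())` returns one forecast value; "adding layers" in the abstract corresponds to lengthening the `hidden` tuple.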
In this article, we report a composite of MnO2 nanoparticles supported by three-dimensionally ordered macroporous carbon (MnO2/3DOM carbon nanocomposites), fabricated by means of a simple multi-component infiltration of three-dimensional templates. MnO2 nanoparticles of 2 nm–6 nm are observed to be highly dispersed on the 3DOM carbon scaffolds. Cyclic voltammetry, galvanostatic charge/discharge, and electrochemical impedance spectroscopy are employed to assess the properties of these nanocomposites for use in supercapacitors. The results demonstrate that MnO2 can be effectively utilized with the assistance of the 3DOM carbon in the electrode. The specific capacitance of the nanocomposite electrode reaches as high as 347 F g−1 at a current density of 0.5 A g−1. Moreover, the electrode exhibits excellent charge/discharge rate capability and good cycling stability, retaining over 92% of its initial capacitance after 5500 cycles at a current density of 2.5 A g−1. Such MnO2/3DOM carbon nanocomposites represent a promising direction for enhancing the performance of metal oxide-based electrochemical supercapacitors.
•Three-dimensionally ordered macroporous MnO2–carbon nanocomposites (3DOM MCNs) are prepared.
•The 3DOM MCNs possess hierarchical pore structure with highly dispersed MnO2 nanoparticles.
•The 3DOM MCNs are promising for supercapacitor applications.
•The Quaternary Tan-Lu fault zone in the Bohai Bay consists of two main faults dipping steeply to the west.
•The Tan-Lu fault zone was dominated by reverse dextral slip during the Holocene.
•The Tan-Lu fault zone in the southern and central sections of the Bohai Bay was last active during the Holocene.
•The ENE–WSW compression led to the Holocene faulting associated with many recent earthquakes in the Bohai Bay.
The middle segment of the NNE–SSW-striking Tan-Lu Fault Zone (TLFZ) crosses the Bohai Bay in East China. This fault zone is a large seismically active belt that was responsible for the 1969 M (magnitude) 7.4 Bohai earthquake. However, the geometry, kinematics, and detailed characteristics of the most recent Quaternary seismic activity along the TLFZ in the Bohai Bay area are debated, primarily because the fault zone lies below sea level in this area. Here, we present the results of a series of seismic profiles that enable the identification of Quaternary activity along the TLFZ within the Bohai Bay area. These data indicate that the Quaternary TLFZ extends over a wide area of the Bohai Bay and is dominated by two main faults, both of which dip steeply to the west. Each of these two main faults consists of several left-stepping en-echelon faults. Quaternary activity along the TLFZ was generally focused along pre-existing Paleogene–Neogene faults. The seismic profiles, combined with the distribution of earthquakes in this area, demonstrate that the TLFZ in the southern and central areas of the Bohai Bay was last active during the Holocene, whereas the last movements along the faults in the northern Bohai Bay area range from late Pleistocene in the south to early–middle Pleistocene and the end of the Neogene in the north. The seismic profiles show both normal and reverse offset of the base of the Quaternary sediments by the Tan-Lu faults, although the normal offset predominates. Anticlines developed in Quaternary sediments occur close to the Quaternary Tan-Lu faults, and the TLFZ is also associated with an angular unconformity between Holocene and Pleistocene sediments. It is likely that the TLFZ was dominated by reverse dextral slip during the Holocene, whereas all of the Quaternary faulting before this involved alternating reverse and normal dextral slip.
Oblique extension is most likely responsible for the normal dextral slip along the TLFZ, whereas focal mechanisms, GPS (global positioning system) data, and in situ stress measurements all suggest that ENE–WSW compression led to the reverse dextral faulting associated with numerous recent earthquakes in the Bohai Bay area.
This study aimed to investigate the association of serum high-mobility group box-1 (HMGB1) and toll-like receptor 4 (TLR4) expressions with the risk of epilepsy, as well as their correlations with disease severity and resistance to anti-epilepsy drugs. One hundred and five epilepsy patients and 100 healthy controls (HCs) were enrolled in this case-control study, and serum samples were collected from all participants to assess HMGB1 and TLR4 expressions by enzyme-linked immunosorbent assay (ELISA). Both serum HMGB1 (P<0.001) and TLR4 (P<0.001) expressions were higher in epilepsy patients than in HCs, and they displayed good predictive values for risk of epilepsy. Moreover, HMGB1 was positively correlated with TLR4 level (r=0.735, P<0.001). HMGB1 and TLR4 levels were both elevated in patients with an average seizure duration >5 min compared to patients with a seizure duration ≤5 min (P=0.001 and P=0.014, respectively). Also, HMGB1 and TLR4 were increased in patients with seizure frequency >3 times per month compared to patients with seizure frequency ≤3 times per month (both P=0.001). In addition, HMGB1 and TLR4 expressions were higher in intractable cases than in drug-responsive cases (P<0.001). In conclusion, both HMGB1 and TLR4 expressions were correlated with increased risk and severity of epilepsy, and their levels were higher in patients resistant to anti-epilepsy drugs.
Natural fracture data from one of the Carboniferous shale masses in the eastern Qaidam Basin were used to establish a stochastic model of a discrete fracture network and to perform discrete-element simulations of the size effect and mechanical parameters of shale. Analytical solutions for fictitious joints in transversely isotropic media were derived, which made it possible for the proposed numerical model to simulate the bedding and natural fractures in shale masses. The results indicate that there are two main factors influencing the representative elementary volume (REV) size of a shale mass. The first and most decisive factor is the presence of natural fractures in the block itself. The second is the anisotropy ratio: the greater the anisotropy, the larger the REV. The bedding angle has little influence on the REV size, whereas it has a certain influence on the mechanical parameters of the rock mass. When the bedding angle approaches the average orientation of the natural fractures, the mechanical parameters of the shale blocks decrease greatly. The REV representing the mechanical properties of the Carboniferous shale masses in the eastern Qaidam Basin was comprehensively identified by considering the influence of bedding and natural fractures. When the numerical model size is larger than the REV, the discontinuities of the fractured rock mass can be transformed into equivalent continuities, which provides a method for simulating shale with natural fractures and bedding to analyze the stability of a borehole wall in shale.
The implementation of cognitive diagnostic computerized adaptive testing often depends on a high-quality item bank. How to estimate item parameters and calibrate the Q-matrix of new items online is therefore an important problem in constructing such an item bank for personalized adaptive learning. Previous research mainly focused on calibration under a random design, in which new items are randomly assigned to examinees. Although randomly assigning new items ensures the randomness of data sampling, some examinees cannot provide enough information for item parameter estimation or Q-matrix calibration of the new items. To increase design efficiency, we investigated three adaptive designs for different practical situations: (a) because the non-parametric classification method needs calibrated item attribute vectors but not item parameters, the first study focused on an optimal design for calibrating the Q-matrix of the new items based on Shannon entropy; (b) if the Q-matrix of the new items is specified by subject experts, an optimal design was constructed for estimating item parameters based on Fisher information; and (c) if both the Q-matrix and the item parameters of the new items are unknown, we developed a hybrid optimal design for estimating them simultaneously. The simulation results showed that the adaptive designs outperform the random design when the number of examinees is limited, in terms of the correct recovery rate of attribute vectors and the precision of item parameter estimates.
We present a case of a 48-year-old woman with 27 months of exposure to aluminum dust and silica from polishing work. The patient was admitted to our hospital with intermittent cough and expectoration. Chest high-resolution computed tomography showed diffuse, ill-defined centrilobular nodules and patchy ground-glass opacities in both lungs. A video-assisted thoracoscopic surgery biopsy demonstrated multiple isolated and confluent granulomas in otherwise normal parenchyma, without malignancy or signs of infection. Elemental analysis of the grinding-wheel powder from the workplace, performed with an X-ray fluorescence spectrometric analyzer, showed 72.7% Al₂O₃ and 22.8% SiO₂ as raw materials. Based on her occupational exposure, a multidisciplinary panel diagnosed aluminum-associated sarcoid-like granulomatous lung disease rather than sarcoidosis.
Occupational aluminum dust exposure may induce pulmonary sarcoid-like granulomatosis recognized by a multidisciplinary diagnostic panel.
A cognitive diagnostic model (CDM) is designed to make inferences about unobserved latent classes from observed item responses. The heuristic for test construction based on the CDM information index (CDI) proposed by Henson and Douglas (2005) has had a far-reaching impact, but it still has many shortcomings. He and other researchers have also proposed new methods to improve on or overcome the inherent shortcomings of the CDI test assembly method. In this study, a test assembly method that maximizes the minimum inter-class distance is proposed using mixed-integer linear programming. It aims to overcome the shortcoming that the CDI method summarizes the discriminating power of each item in a single CDI index while neglecting the discriminating power for each pair of latent classes. The simulation results show that, compared with the CDI test assembly and random test assembly, the new method performs well and has the highest accuracy in terms of pattern and attribute correct classification rates. Although the accuracy of the new method is not very high under item constraints, it is still higher than that of the CDI test assembly under the same constraints.
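The maximin objective above can be illustrated concretely. Per-item discrimination between two latent classes is measured here by a symmetrized Kullback–Leibler divergence between the classes' success probabilities (a common choice in this literature, assumed for illustration), and a brute-force search over small item pools stands in for the mixed-integer linear programming solver, which scales the same objective to realistic pool sizes.

```python
import itertools
import numpy as np

def bernoulli_kl(p, q, eps=1e-9):
    """KL divergence between Bernoulli(p) and Bernoulli(q)."""
    p = np.clip(p, eps, 1 - eps)
    q = np.clip(q, eps, 1 - eps)
    return p * np.log(p / q) + (1 - p) * np.log((1 - p) / (1 - q))

def pairwise_distances(P):
    """P: (n_items, n_classes) success probabilities. Returns an
    (n_items, n_pairs) array of symmetrized per-item KL distances,
    one column per unordered pair of latent classes."""
    n_items, n_classes = P.shape
    pairs = list(itertools.combinations(range(n_classes), 2))
    D = np.empty((n_items, len(pairs)))
    for j, (u, v) in enumerate(pairs):
        D[:, j] = bernoulli_kl(P[:, u], P[:, v]) + bernoulli_kl(P[:, v], P[:, u])
    return D

def maximin_assembly(P, test_length):
    """Select `test_length` items maximizing the summed distance of the
    WORST-separated class pair (brute force over all subsets)."""
    D = pairwise_distances(P)
    best, best_score = None, -np.inf
    for items in itertools.combinations(range(P.shape[0]), test_length):
        score = D[list(items), :].sum(axis=0).min()   # worst-separated pair
        if score > best_score:
            best, best_score = items, score
    return best, best_score
```

A single-index summary like the CDI can favor items that all separate the same "easy" class pairs; the maximin objective instead forces every pair of latent classes to be covered, which is exactly the shortcoming the abstract's method addresses.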
Classification consistency and accuracy are viewed as important indicators for evaluating the reliability and validity of classification results in cognitive diagnostic assessment (CDA). Pattern-level classification consistency and accuracy indices were introduced by Cui, Gierl, and Chang. However, indices at the attribute level have not yet been constructed. This study puts forward a simple approach to estimating the indices at both the attribute and the pattern level from a single test administration. We elaborate in detail on how the upper and lower bounds of attribute-level accuracy can be derived from the variance of the error of the attribute mastery probability estimate. In addition, based on Cui's pattern-level indices, an alternative approach to estimating the attribute-level indices is proposed. Comparative analysis of the simulation results indicates that the new indices are very suitable for evaluating test–retest consistency and the correct classification rate.
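The attribute-level idea can be sketched from estimated mastery probabilities alone. This is a simplified illustration, not the authors' exact estimator: it assumes each attribute is classified by the 0.5 rule and that the estimated posterior probability is well calibrated, so accuracy is max(p, 1−p) and test–retest agreement across two independent administrations is p² + (1−p)², each averaged over examinees.

```python
import numpy as np

def attribute_accuracy(p):
    """p: (n_examinees, K) posterior attribute-mastery probabilities.
    With the 0.5 classification rule, the chance that the classification
    matches the true status is max(p, 1 - p); averaging over examinees
    gives a per-attribute classification-accuracy estimate."""
    return np.maximum(p, 1 - p).mean(axis=0)

def attribute_consistency(p):
    """Probability that two independent administrations classify an
    attribute the same way: p**2 + (1 - p)**2, averaged over examinees."""
    return (p ** 2 + (1 - p) ** 2).mean(axis=0)
```

Probabilities near 0 or 1 push both indices toward 1, while p ≈ 0.5 drags accuracy toward 0.5 and consistency toward its floor of 0.5, which is why the variance of the mastery-probability estimate bounds the attribute-level accuracy in the abstract's derivation.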