• The regularity or randomness of an error depends on the perspective from which the error is observed.
• Errors cannot be classified according to their regularity or randomness.
• The influence characteristics of an error depend on the conditions of the repeated-measurement procedure.
• Errors cannot be classified according to their influence characteristics.
• Both functional models and random models can be used to process the same error.
In several earlier papers, the authors proposed a new measurement theory system based on an error non-classification philosophy, which overturns the existing measurement concept system of precision, trueness, and accuracy. In this paper, focusing on the issues of error regularity and effect characteristics, the authors give a thematic interpretation and prove that an error's regularity actually arises from the cognitive perspective of the observer and therefore cannot be used to classify errors, and that an error's effect characteristics actually depend on the artificial conditions of repeated measurement and likewise cannot be used to classify errors. Thus, from the perspectives of error regularity and effect characteristics, the existing error classification philosophy is still incorrect, and an uncertainty concept system, which must be interpreted through the error non-classification philosophy, naturally becomes the way forward for measurement theory.
Traditional sculptures all come from an artist's careful carving, which is also their artistic presentation. With the rapid development of computer technology, digital technology has been applied to art landscape sculpture. Through rapid prototyping manufacturing (RPM) and 3D printing technology, art landscape sculptures can be shaped rapidly and mass-produced. Computer-aided technology, RPM, and 3D printing save the labor and mold-opening costs of traditional sculpture, shorten the production cycle, and improve the precision and structure of sculptures. They also make personalized sculpture feasible, which has seriously impacted the traditional sculpture industry. Computer-aided design is in line with the new technical concepts of the digital age and has become the main means of landscape sculpture creation. Through computer software, designers can realize their design intent in virtual sculpture. Through VR presentations, manufacturers can communicate better with customers, which makes up for the shortcomings of traditional sculpture during construction. This paper first analyzes the related concepts, then argues for the necessity of applying computer-aided technology to modern landscape sculpture, and finally proposes several applications.
The new concepts of measurement error theory
Ye, Xiaoming; Xiao, Xuebin; Shi, Junbo
Measurement: Journal of the International Measurement Confederation, vol. 83, April 2016. Journal article, peer reviewed.
In current measurement theory there are various logical and philosophical difficulties, and the concepts used to evaluate measurement error vary and are inconsistent across different schools. This paper sets aside the constraints of current measurement theories and concepts and re-examines basic measurement concepts. By proving a new measurement error theory, that any error is a bias and follows a random distribution, the paper points out the misunderstandings of traditional measurement theory, overturns the traditional error classification theory, gives a new interpretation of the measurement uncertainty concept, proposes abolishing the concept system of precision, trueness, and accuracy, and thereby achieves a unified system of measurement concepts across all disciplines, including geodesy, geomatics, metrology, and instrumentation.
We assessed the contamination levels of Mn, Zn, Cr, Cu, Ni, Pb, As, and Hg and the risks posed by these potentially harmful elements in topsoils around a municipal solid waste incinerator (MSWI). We collected 20 soil samples, with an average pH of 8.1, and a fly ash sample emitted from the MSWI to investigate the concentrations of these elements in soils. We determined the concentrations of these elements by inductively coupled plasma-optical emission spectrometry (ICP-OES), except for Hg, which we measured with an AF-610B atomic fluorescence spectrometer (AFS). We assessed the risks of these elements using the geoaccumulation index (Igeo), the potential ecological risk index (RI), the hazard quotient (HQ), and the cancer risk (Risk). The results showed that the concentrations of potentially harmful elements in soil were influenced by the wind direction, and the concentrations of most elements, with the exception of As, were higher in the area northwest of the MSWI than in the area southeast of the incinerator; these results were in accordance with those obtained from our contour maps. According to the Igeo values, some soil samples were clearly polluted by Hg emissions. However, the health risk assessment indicated that the concentrations of Hg and the other elements in soil did not pose non-carcinogenic risks to the local populations. This was also the case for the carcinogenic risks posed by As, Cr, and Ni. The carcinogenic risk posed by As was higher, in the range 6.49 × 10⁻ to 9.58 × 10⁻⁶, but this was still considered an acceptable level of risk.
A high-performance computing (HPC) environment provides researchers with high-level HPC application services through a unified access entrance, a unified usage method, and user technical support, shielding the heterogeneity of job management systems, access modes, management systems, and so on. As the environment develops, more and more supercomputing centers, application communities, and service platforms are being connected, and their account holders need to log in to the HPC environment in their original ways. The existing HPC environment supports only grid accounts authenticated through LDAP (Lightweight Directory Access Protocol), while application communities and service platforms have their own users and different authentication modes. In order to provide a unified authentication center for the environment, this paper studies multi-source account authentication technology and develops a multi-source user authentication system.
Log analysis plays an important role in the stable operation of computer systems. However, logs are usually unstructured, which is not conducive to automatic analysis, so automatically categorizing logs and turning them into structured data is of great practical significance. In this paper, the LDmatch algorithm is proposed, which implements a log pattern extraction algorithm based on word matching rate. Traditional log matching algorithms use one-to-one word matching in their similarity calculation, while the proposed LDmatch algorithm calculates the similarity between logs according to the longest common subsequence (LCS) of the words contained in two logs and classifies logs based on the LCS. The LDmatch algorithm can also obtain and update log templates in real time. In addition, the pattern warehouse of the algorithm uses a hash-table-based data structure for storage, which refines the classification of logs and reduces the number of comparisons during log matching, thus improving the matching efficiency of the algorithm.
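The word-level LCS similarity described above can be sketched as follows; the helper names and the normalization by the longer log are assumptions for illustration, not the paper's exact formulation:

```python
def word_lcs(a: list[str], b: list[str]) -> int:
    """Length of the longest common subsequence of two word lists."""
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m):
        for j in range(n):
            if a[i] == b[j]:
                dp[i + 1][j + 1] = dp[i][j] + 1
            else:
                dp[i + 1][j + 1] = max(dp[i][j + 1], dp[i + 1][j])
    return dp[m][n]

def similarity(log1: str, log2: str) -> float:
    """Word matching rate: LCS length normalized by the longer log."""
    w1, w2 = log1.split(), log2.split()
    return word_lcs(w1, w2) / max(len(w1), len(w2))

a = "connection from 10.0.0.1 closed"
b = "connection from 10.0.0.2 closed"
print(similarity(a, b))  # 3 of 4 words match in order → 0.75
```

Unlike one-to-one positional matching, the LCS tolerates inserted or deleted tokens, so two instances of the same template with variable-length fields still score highly.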
With the increasing amount of logs produced by nodes in CNGrid (China National Grid), traditional manual methods for user behavior analysis can no longer meet the need of daily analysis. In recent years, deep learning has shown good results in key tasks related to computer science, such as intrusion detection, image recognition, natural language processing, and malware detection. This paper demonstrates how to apply deep learning models to user behavior analysis. To this end, this paper classifies user behavior in CNGrid and extracts a large number of user operation sequences bounded to sessions. These sequences are fed into deep learning models. This paper proposes a deep learning model that combines a recurrent neural network (RNN) with a graph neural network (GNN) to predict user behavior. The graph neural network can capture the hidden state of the user's local behavior, so it is used for preprocessing, while the recurrent neural network captures the information in the time sequence. The model is built by combining the GNN and RNN.
With the increasing amount of logs produced by nodes in CNGrid, traditional manual methods for abnormal log analysis can no longer meet the need of daily analysis. In order to analyze logs automatically and efficiently, a two-stage detection method is proposed in this paper. In the first stage, the log patterns are classified during preprocessing, then principal component analysis is used for anomaly detection, and the sequence of log types is defined as a log flow pattern. The abnormal flow patterns obtained from anomaly detection are extracted according to this definition. Finally, a hierarchical clustering algorithm is used to simplify the flow pattern results, and the results are saved. In the second stage, using the detection model and flow patterns obtained in the first stage, the log flow information can be monitored and analyzed in real time and matched against the corresponding flow pattern. Finally, experiments are carried out on real logs in CNGrid, and the results are visualized in real time.
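The first-stage PCA anomaly detection can be sketched on toy event-count vectors; the matrix, the single-component subspace, and the median-based threshold below are illustrative assumptions, not the paper's configuration:

```python
import numpy as np

# Toy event-count matrix: each row is a time window, each column counts
# occurrences of one log pattern in that window. Normal windows keep a
# fixed 2:1 ratio between patterns 1 and 2 and never emit pattern 3.
X = np.array([
    [10, 5, 0],
    [20, 10, 0],
    [30, 15, 0],
    [40, 20, 0],
    [50, 25, 0],
    [25, 2, 8],   # anomalous window: ratio broken, rare pattern appears
], dtype=float)

Xc = X - X.mean(axis=0)              # center the count vectors
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
P = Vt[:1].T                         # principal subspace (k = 1 component)
residual = Xc - Xc @ P @ P.T         # part not explained by normal variation
spe = (residual ** 2).sum(axis=1)    # squared prediction error per window

threshold = 5 * np.median(spe)       # crude threshold, for the sketch only
print(np.nonzero(spe > threshold)[0].tolist())  # → [5]
```

Windows whose squared prediction error exceeds the threshold deviate from the dominant pattern of normal behavior, which is what the first stage exploits before the flow patterns are clustered.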
A high-performance computing environment, also known as a supercomputing environment, e-Science environment, or cyberinfrastructure, is a crucial system that connects users’ applications to supercomputers, and provides usability, efficiency, sharing, and collaboration capabilities. This review presents important lessons drawn from China’s nationwide efforts to build and use a high-performance computing environment over the past 20 years (1995–2015), including three observations and two open problems. We present evidence that such an environment helps to grow China’s nationwide supercomputing ecosystem by orders of magnitude, where a loosely coupled architecture accommodates diversity. An important open problem is why technology for global networked supercomputing has not yet become as widespread as the Internet or Web. In the next 20 years, high-performance computing environments will need to provide zettaflops computing capability and 10 000 times better energy efficiency, and support seamless human-cyber-physical ternary computing.