This work presents a method to detect and separate the effects of broken rotor bars (BRBs) from load torque oscillations (LTOs) in a motor's line current signature. LTOs, caused by mechanical load abnormalities and load fluctuations such as speed-reduction couplings or a defective transmission, can introduce symptoms similar to those produced by rotor cage breaks. The proposed method is based on a pair of rotating coordinate frames for the stator current vector, rotating at the same angular velocity as the current's fundamental frequency ω and at its inverse, and on the decomposition of the vector into positive- and negative-sequence components. The extracted positive-sequence components make it possible to separate the similar effects produced by rotor defects and by the oscillating load. Detection and separation are performed by demodulating the amplitude-modulating signal due to BRBs and the phase-modulating signal due to LTOs. Experiments on a test bench validate the simulation results and demonstrate the effectiveness of the proposed approach.
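The rotating-frame demodulation idea can be sketched on a synthetic signal. The snippet below is an illustrative sketch, not the authors' full positive/negative-sequence decomposition: it builds an idealized complex stator current vector carrying both an amplitude modulation (BRB-like, at an assumed 4 Hz) and a phase modulation (LTO-like, at an assumed 7 Hz), rotates it into the synchronous frame, and reads the two modulating signals off the magnitude and the angle. All frequencies and modulation depths are made-up illustration values.

```python
import numpy as np

fs = 10_000                            # sampling rate [Hz]
t = np.arange(0, 1.0, 1 / fs)
f0 = 50.0                              # supply (fundamental) frequency [Hz]
w = 2 * np.pi * f0
f_brb, a = 4.0, 0.05                   # BRB effect -> amplitude modulation
f_lto, b = 7.0, 0.08                   # load torque oscillation -> phase modulation

# idealized complex stator current space vector with both modulations
i_vec = (1 + a * np.cos(2 * np.pi * f_brb * t)) \
        * np.exp(1j * (w * t + b * np.sin(2 * np.pi * f_lto * t)))

# rotate into the frame turning at the fundamental angular velocity w
i_sync = i_vec * np.exp(-1j * w * t)

am = np.abs(i_sync)                    # amplitude-modulating signal (BRB signature)
pm = np.unwrap(np.angle(i_sync))       # phase-modulating signal (LTO signature)
```

In the synchronous frame the fundamental becomes a DC component, so the two fault signatures separate cleanly: the BRB-like effect appears only in `am` and the LTO-like effect only in `pm`.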
Non-stationary fault detection for an induction motor operating with a bearing fault is investigated in this paper. To this end, the vibration signal is analyzed with the wavelet transform (WT) and the matrix pencil (MP) method: the MP method is combined with the WT in order to reconstruct the non-stationary signal and detect the bearing fault frequency. For validation, an experimental setup is used with an induction motor under different load conditions and with a failure on its inner race. Applying the proposed technique to the vibration signal in the non-stationary state shows that the fault can be characterized by a particular signature, which is not possible with the fast Fourier transform (FFT).
We examine the effects of the spontaneous background event rate and aftershock triggering characteristics on the temporal statistics of seismicity in the epidemic-type aftershock sequence (ETAS) model. Recent work has shown that the earthquake interevent time distribution is generally bimodal: a superposition of a gamma component from triggered aftershocks at short time intervals and an exponential component at longer intervals from spontaneous events and the overlapping of independent aftershock sequences. The relative size of these two components varies between catalogs, so there is no simple, universal scaling; at the extreme of a high spontaneous rate, e.g., in large regions, the high probability of temporally overlapping aftershock sequences causes the exponential component to dominate. Here we further explore the effects of both the spontaneous rate and the aftershock triggering parameters. We show that the analytical theory of Saichev and Sornette (2007), although valid under their assumptions, gives the impression of a more "universal" behavior if used outside its stated range of applicability. We also show that within the high-overlap (high-spontaneous-rate) regime, maximum likelihood inversion of the model's temporal parameters is both less accurate and biased; specifically, the background rate is systematically overestimated. This has implications for the range of region sizes over which parameter inversion may be reliable, and it must therefore be taken into account in any inversion for temporal variations in background rate in time-dependent hazard calculations.
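The bimodality can be reproduced with a toy simulation. The sketch below is deliberately not the full ETAS model (no magnitudes, no productivity law, no secondary triggering): spontaneous events arrive as a Poisson process, and each directly triggers a handful of Omori-law-delayed aftershocks; all rate and Omori parameters are illustrative assumptions. The pooled interevent times then mix a short-interval (clustered) component with a long-interval (background) one, which shows up as a coefficient of variation well above the Poisson value of 1.

```python
import numpy as np

rng = np.random.default_rng(0)

T = 10_000.0                  # catalogue duration (arbitrary time units)
mu = 0.05                     # spontaneous (background) rate
bg = rng.uniform(0.0, T, rng.poisson(mu * T))    # spontaneous event times

# each spontaneous event directly triggers Omori-law-delayed aftershocks
c, p, n_aft = 0.01, 1.2, 5    # Omori c, Omori exponent p, mean aftershocks/parent
events = [bg]
for t0 in bg:
    u = rng.uniform(size=rng.poisson(n_aft))
    # inverse-CDF sampling of delays with density proportional to (c + dt)^(-p)
    delays = c * ((1.0 - u) ** (1.0 / (1.0 - p)) - 1.0)
    events.append(t0 + delays)
times = np.sort(np.concatenate(events))
times = times[times < T]

iet = np.diff(times)          # pooled interevent times
cv = iet.std() / iet.mean()   # a pure Poisson process would give cv ~ 1
```

Shrinking the aftershock productivity (or raising `mu` so that independent sequences overlap heavily) pushes `cv` back toward 1, mirroring the exponential-dominated regime described above.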
The design of the n2EDM experiment
Ayres, N. J.; Ban, G.; Bienstman, L.; et al.
European Physical Journal C: Particles and Fields, 06/2021, Volume 81, Issue 6
Journal Article · Peer-reviewed · Open access
We present the design of a next-generation experiment, n2EDM, currently under construction at the ultracold neutron source at the Paul Scherrer Institute (PSI), with the aim of carrying out a high-precision search for an electric dipole moment of the neutron. The project builds on experience gained with the previous apparatus, operated at PSI until 2017, and is expected to deliver an order of magnitude better sensitivity, with provision for further substantial improvements. An overview of the experimental method and setup is given, the sensitivity requirements for the apparatus are derived, and its technical design is described.
Seismic activity is routinely quantified using mean values of the event rate or interevent time. Standard estimates of the error on such mean values implicitly assume that the events used to calculate the mean are independent. However, earthquakes can be triggered by other events and are thus not necessarily independent. As a result, the errors on mean earthquake interevent times do not exhibit Gaussian convergence with increasing sample size according to the central limit theorem. In this paper we investigate how the errors decay with sample size in real earthquake catalogues and how the nature of this convergence varies with the spatial extent of the region under investigation. We demonstrate that the errors in mean interevent times, as a function of sample size, are well estimated by defining an effective sample size, using the autocorrelation function to estimate the number of pieces of independent data that exist in samples of different lengths. This allows us to accurately project error estimates from finite natural earthquake catalogues into the future, and it promotes a definition of stability wherein the autocorrelation function does not vary in time. The technique is easy to apply, and we suggest that it be routinely applied to define errors on mean interevent times as part of seismic hazard assessment studies. This is particularly important for studies that utilize small catalogue subsets (fewer than ∼1000 events) in time-dependent or high-spatial-resolution (e.g., for catastrophe modeling) hazard assessment.
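The effective-sample-size recipe can be sketched directly: estimate the autocorrelation function (ACF) of the interevent-time series, convert it to an effective sample size N_eff = N / (1 + 2·Σρ_k), and quote the standard error of the mean as s/√N_eff rather than s/√N. The code below is a generic sketch of that recipe; truncating the ACF sum at the first non-positive lag is our simplification, not necessarily the paper's estimator.

```python
import numpy as np

def effective_sample_size(x, max_lag=None):
    """N_eff = N / (1 + 2 * sum(rho_k)), truncating the ACF sum at the
    first non-positive autocorrelation (a common, conservative choice)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    xc = x - x.mean()
    # biased sample ACF, normalized so that acf[0] == 1
    acf = np.correlate(xc, xc, mode='full')[n - 1:] / (xc @ xc)
    s = 0.0
    for rho in acf[1:(max_lag or n // 2)]:
        if rho <= 0:
            break
        s += rho
    return n / (1 + 2 * s)

def mean_with_error(x):
    """Mean interevent time with a standard error based on N_eff
    instead of the naive independent-sample count N."""
    n_eff = effective_sample_size(x)
    return np.mean(x), np.std(x, ddof=1) / np.sqrt(n_eff)
```

For independent data N_eff stays close to N, so the estimate reduces to the usual standard error; for clustered catalogues N_eff drops sharply and the quoted error widens accordingly.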
This paper presents several theoretical and fundamental results on the register need in periodic schedules, also known as MAXLIVE. Our first contribution is a novel formula for computing the exact number of registers needed by a scheduled loop. This formula has two advantages: it can be computed by a polynomial algorithm with O(n lg n) complexity (where n is the number of instructions in the loop), and it allows the generalization of a previous result [13]. Second, we show that during software pipelining, the minimal number of registers needed may increase when incrementing the initiation interval (II), which is contrary to intuition. For the case of zero architectural delays in accessing registers, we provide a sufficient condition for keeping the minimal number of registers from increasing when the II is incremented. Third, we prove an interesting property that enables us to optimally compute the minimal periodic register sufficiency of a loop over all its valid periodic schedules, irrespective of II. Fourth and last, we prove that the problem of optimal stage scheduling under register constraints is polynomially solvable for a subclass of data dependence graphs, whereas this problem is known to be NP-complete for arbitrary dependence graphs [7]. This latter result generalizes a previous achievement [13], which addressed data dependence trees and forests of trees. In this study, we consider cyclic data dependence graphs without taking any resource constraints into account. The aim of our theoretical results on the periodic register need is to help current and future software pipeliners achieve significant performance improvements by making better (if not the best) use of the available resources.
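The notion of periodic register need can be made concrete with a brute-force count (this is not the paper's O(n lg n) formula, just an illustration of what MAXLIVE measures). For a software-pipelined loop with initiation interval II, a value defined at cycle d and last used at cycle u has one live instance per iteration in flight, and in steady state the live count at cycle t is periodic with period II. The sketch below counts overlapping instances at each cycle of one period; the names and the integer-cycle lifetime model are our assumptions.

```python
def periodic_register_need(lifetimes, II):
    """MAXLIVE of a periodic (software-pipelined) schedule.

    lifetimes: list of (def_cycle, last_use_cycle) pairs; a value is
    live on the half-open interval [def_cycle, last_use_cycle).
    Instances from successive iterations, launched every II cycles,
    may overlap; each overlapping instance needs its own register.
    Sketch only -- assumes integer cycles and no register access delays.
    """
    best = 0
    for t in range(II):                       # steady state is II-periodic
        # count instances i with d + i*II <= t < u + i*II, for every value
        live = sum((t - d) // II - (t - u) // II for d, u in lifetimes)
        best = max(best, live)
    return best

# a value live for 4 cycles in a loop with II = 2 needs 2 registers:
# the instances from two consecutive iterations overlap at every cycle
print(periodic_register_need([(0, 4)], II=2))  # -> 2
```

Lifetimes longer than the II are exactly what makes the periodic case harder than straight-line register allocation: a single value can require several registers at once.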