The intrinsic complexity of passive sonar data makes target recognition extremely challenging. Conventional classifier architectures based on hand-engineered feature transforms often fail to disentangle the high-dimensional non-linear structures in the observed target records. Although modern deep learning algorithms achieve acceptable success rates through hierarchical feature learning, they typically require large amounts of data when trained in a supervised manner. This work proposes an unsupervised generative framework based on a variational autoencoder (VAE) to create better disentangled representations for the downstream classification task. Disentanglement is further enforced through a hyperparameter <inline-formula><tex-math notation="LaTeX">\beta </tex-math></inline-formula>. To better segregate the spectro-temporal features, an intermediate non-linearly scaled time-frequency representation is employed in conjunction with the <inline-formula><tex-math notation="LaTeX">\beta </tex-math></inline-formula>-VAE. Experimental analysis of various classifier configurations yields encouraging results in terms of data efficiency and classification accuracy on target records collected from various locations in the Indian Ocean.
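The disentanglement mechanism the abstract relies on can be sketched numerically: the β-VAE objective is the standard VAE evidence lower bound with the KL term scaled by β > 1. The following is a minimal illustrative sketch (not the paper's architecture); the Gaussian closed-form KL and mean-squared-error reconstruction are standard assumptions.

```python
import numpy as np

def beta_vae_loss(x, x_recon, mu, log_var, beta=4.0):
    """Per-sample beta-VAE objective (illustrative sketch).

    beta > 1 up-weights the KL term, which is the mechanism that
    encourages disentangled latent factors in a beta-VAE.
    """
    # Reconstruction term: mean squared error over input dimensions.
    recon = np.mean((x - x_recon) ** 2, axis=-1)
    # Closed-form KL divergence between N(mu, sigma^2) and the standard
    # normal prior, summed over latent dimensions.
    kl = 0.5 * np.sum(np.exp(log_var) + mu ** 2 - 1.0 - log_var, axis=-1)
    return recon + beta * kl
```

With β = 1 this reduces to the ordinary VAE loss; larger β trades reconstruction fidelity for a latent code closer to the factorized prior.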
Here we propose a new class of probability distributions that extends the exponential hyper-Poisson and Weibull-Poisson distributions. We investigate several important aspects of the distribution by deriving expressions for its probability density function (pdf), cumulative distribution function, survival function, failure rate function, the pdf of its order statistics, its r-th raw moments, etc. Maximum likelihood estimation, implemented via an EM algorithm, is discussed for estimating the parameters of the distribution, and a test procedure is suggested for testing the significance of the additional parameters of the proposed model. The use of the proposed distribution is illustrated through real-life data sets. Further, a brief simulation study is carried out to evaluate the performance of the estimators of the parameters of the distribution.
Count data with excess zeros are common in several areas of scientific research. In particular, zero-inflated count data models have been used for modelling data sets with an excessive number of zeros; in this regard, the zero-inflated Poisson distribution has received much attention in the literature. In this paper, we propose a generalized class of zero-inflated Poisson distribution, the 'zero-inflated Hermite distribution (ZIHD)', which can be considered a more flexible class of zero-inflated Poisson-type distribution suitable for tackling overdispersed data sets. We investigate several important properties of the ZIHD along with certain inference aspects of the model. Test procedures for checking zero-inflation have also been developed and investigated through simulation studies. Further, two real-life data applications are given to illustrate the usefulness of the model.
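The ZIHD can be computed directly from the classical Hermite recursion, which is what the sketch below does. The Hermite(a, b) parametrization with pgf exp(a(s-1) + b(s²-1)) is a standard assumption here, not taken from this abstract.

```python
import math

def hermite_pmf(a, b, kmax):
    """Hermite(a, b) probabilities p_0..p_kmax via the standard recursion
    (k+1) p_{k+1} = a p_k + 2 b p_{k-1}; b = 0 recovers Poisson(a)."""
    p = [math.exp(-a - b)]
    for k in range(kmax):
        prev2 = p[k - 1] if k >= 1 else 0.0
        p.append((a * p[k] + 2.0 * b * prev2) / (k + 1))
    return p

def zihd_pmf(pi, a, b, kmax):
    """Zero-inflated Hermite: mix a point mass at zero (weight pi)
    with the Hermite(a, b) distribution."""
    p = [(1.0 - pi) * q for q in hermite_pmf(a, b, kmax)]
    p[0] += pi
    return p
```

The Hermite component has mean a + 2b and variance a + 4b, so b > 0 already provides overdispersion, and the inflation weight pi adds the extra mass at zero.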
Zero-inflated models have become fairly popular in the research literature. Medical and public health research often involves the analysis of count data that exhibits a substantially large proportion of zeros. The first zero-inflated model was the zero-inflated Poisson model, which concerns a random event with excess zero counts in unit time. In this paper we consider a zero-inflated version of the modified hyper-Poisson distribution, a generalization of the zero-inflated hyper-Poisson distribution of Kumar and Ramachandran (Commun. Statist. Simul. Comp., 2019), and study some of its important properties by deriving its probability generating function and expressions for its factorial moments, mean, and variance, together with recursion formulae for factorial moments, raw moments, and probabilities. The estimation of the parameters of the proposed distribution is addressed, and the distribution is fitted to certain real-life data sets to assess its goodness of fit. Further, test procedures are constructed for examining the significance of the parameters of the model, and a simulation study is carried out to assess the performance of the maximum likelihood estimators of the parameters of the distribution.
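Maximum likelihood estimation in zero-inflated models is commonly carried out with a latent-indicator EM scheme. The sketch below illustrates that scheme for the plain zero-inflated Poisson as a stand-in (the abstract's model is a zero-inflated modified hyper-Poisson; the same E- and M-steps apply with the Poisson pmf replaced accordingly).

```python
import numpy as np

def zip_em(y, n_iter=200):
    """EM estimates (pi, lam) for a zero-inflated Poisson.

    Illustrative stand-in for the abstract's model: each zero is
    attributed a posterior probability of being "structural", then the
    mixture weight and rate are updated in closed form.
    """
    y = np.asarray(y, dtype=float)
    pi, lam = 0.5, max(y.mean(), 1e-6)
    for _ in range(n_iter):
        # E-step: posterior probability that each observed zero comes
        # from the inflation component rather than the Poisson part.
        p0 = np.exp(-lam)
        w = np.where(y == 0, pi / (pi + (1 - pi) * p0), 0.0)
        # M-step: closed-form updates given the responsibilities.
        pi = w.mean()
        lam = ((1 - w) * y).sum() / (1 - w).sum()
    return pi, lam
```

For the modified hyper-Poisson case the M-step for the count-component parameters would generally need a numerical maximizer instead of the closed-form rate update.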
The exponential model is the simplest among all lifetime distribution models, and it possesses a constant failure rate. Here we propose a new class of lifetime distributions with decreasing failure rate, developed by compounding the exponential distribution with the positive hyper-Poisson distribution. We investigate some of its statistical properties and employ various methods for estimating the parameters of the distribution, along with certain test procedures. All the procedures discussed in the paper are illustrated with real-life data sets. Further, a brief simulation study is conducted to examine the performance of the estimators of the parameters of the distribution.
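A standard way such compound lifetime models arise is as the minimum of N i.i.d. exponentials, where N follows a positive count distribution; the survival function is then the pgf of N evaluated at the exponential survival probability. The sketch below shows this mechanism with a zero-truncated Poisson pgf standing in for the positive hyper-Poisson (an assumption for illustration, not the abstract's exact pgf).

```python
import math

def compound_min_survival(x, lam, pgf):
    """Survival of X = min(Y_1, ..., Y_N), Y_i ~ Exp(lam) i.i.d.

    S(x) = P(all N lifetimes exceed x) = E[(e^{-lam x})^N] = G_N(e^{-lam x}),
    where G_N is the pgf of the positive count variable N.  This
    compounding construction is what yields a decreasing failure rate.
    """
    return pgf(math.exp(-lam * x))

def ztp_pgf(s, theta=2.0):
    # pgf of a zero-truncated Poisson(theta), used here as a stand-in
    # for the positive hyper-Poisson: (e^{theta s} - 1) / (e^{theta} - 1).
    return (math.exp(theta * s) - 1.0) / (math.exp(theta) - 1.0)
```

A quick numerical check of the hazard h(x) = -d/dx log S(x) confirms it decreases in x for this stand-in choice of N.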
Here we consider a zero-inflated version of the hyper-Poisson distribution and study some of its important properties by deriving its probability generating function and expressions for its factorial moments, mean, and variance, together with recursion formulae for probabilities, raw moments, and factorial moments. The estimation of the parameters of the zero-inflated hyper-Poisson distribution is addressed. The distribution has been fitted to certain real-life data sets, showing that the proposed model fits the data better than existing models such as the zero-inflated Poisson distribution (ZIPD), zero-inflated negative binomial distribution (ZINBD), zero-inflated Conway-Maxwell Poisson distribution (ZICMPD), and zero-inflated generalized Poisson distribution (ZIGPD). Further, Rao's efficient score test procedure is applied for examining the significance of the parameters, and a simulation study is carried out to assess the performance of the maximum likelihood estimators of the parameters of the distribution.
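The fitted probabilities can be sketched directly from the classical Bardwell-Crow hyper-Poisson form, p_k ∝ λ^k / (γ)_k with normalizing constant ₁F₁(1; γ; λ), with zero-inflation added as a mixture. This is a minimal illustration under that standard parametrization, not necessarily the paper's exact notation.

```python
import math

def hp_pmf(k, lam, gamma, terms=100):
    """Hyper-Poisson pmf: p_k = lam^k / ((gamma)_k * 1F1(1; gamma; lam)),
    with (gamma)_k the rising factorial; gamma = 1 recovers Poisson(lam)."""
    def poch(g, n):
        out = 1.0
        for i in range(n):
            out *= g + i
        return out
    # Truncated series for the confluent hypergeometric normalizer.
    norm = sum(lam ** j / poch(gamma, j) for j in range(terms))
    return lam ** k / (poch(gamma, k) * norm)

def zihp_pmf(k, pi, lam, gamma):
    """Zero-inflated hyper-Poisson: extra point mass pi at zero."""
    base = hp_pmf(k, lam, gamma)
    return pi + (1.0 - pi) * base if k == 0 else (1.0 - pi) * base
```

Values of gamma above 1 stretch the tail relative to the Poisson, which is what gives the hyper-Poisson its extra flexibility before zero-inflation is even added.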
This experimental work aims to devise and establish quadratic regression equations covering various input criteria of a friction stir welding (FSW) process, in order to predict the responses during the fabrication of AZ91C Mg alloy joints. The input process parameters considered are the traversing speed of the tool, its rotational speed, its pin profile (geometry), and the axial force. A five-level, four-factor central composite design was applied, and response surface methodology (RSM) was used to formulate quadratic regression models, develop 3D response surface charts, and predict the responses for various mechanical properties. The generated quadratic models were tested and validated using analysis of variance. Validation trial results, presented as scatter diagrams, showed close agreement with the predictions of the generated models. The AZ91C Mg alloy joints produced using a tool with taper cylindrical pin geometry at 1045 rpm, 1.5 mm/s traversing speed, and an axial load of 4.87 kN were found to exhibit improved mechanical properties.
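The RSM regression step amounts to least-squares fitting of a full quadratic polynomial in the coded factors. A minimal sketch of that step follows; the factor coding and response values are assumptions for illustration, not the paper's data.

```python
import numpy as np

def fit_quadratic_rsm(X, y):
    """Least-squares fit of a full quadratic response-surface model:
    y = b0 + sum_i b_i x_i + sum_i b_ii x_i^2 + sum_{i<j} b_ij x_i x_j.

    In the abstract's setting the columns of X would be the coded FSW
    factors (rotational speed, traverse speed, pin profile, axial force).
    """
    n, k = X.shape
    cols = [np.ones(n)]                                   # intercept
    cols += [X[:, i] for i in range(k)]                   # linear terms
    cols += [X[:, i] ** 2 for i in range(k)]              # pure quadratics
    cols += [X[:, i] * X[:, j]                            # two-way interactions
             for i in range(k) for j in range(i + 1, k)]
    D = np.column_stack(cols)
    beta, *_ = np.linalg.lstsq(D, y, rcond=None)
    return beta, D
```

For four factors this gives 1 + 4 + 4 + 6 = 15 coefficients, which is why a central composite design (with its axial and center points) is the usual choice: it supplies enough distinct factor levels to estimate the pure quadratic terms.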
A bivariate generalized Yule distribution is introduced here as the distribution of the random sum of certain types of independent and identically distributed bivariate Bernoulli random variables. We study some important properties of the distribution by deriving explicit expressions for its probability mass function, factorial moments, and the probability generating function (p.g.f.) of its conditional distributions. Certain recurrence relations for its probabilities, raw moments, and factorial moments are also developed. The method of maximum likelihood is employed for estimating the parameters of the distribution, and the model is applied to real-life data sets to illustrate its relevance.
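The random-sum construction itself is easy to simulate: draw a count N, then add up N i.i.d. bivariate Bernoulli vectors. In the sketch below a geometric N is a stand-in purely to show the mechanism; the abstract's N follows a (generalized) Yule law, and the cell probabilities are hypothetical.

```python
import random

def bivariate_random_sum(n_draws, p00, p10, p01, p11, rng=None):
    """Simulate S = sum_{i=1}^{N} (U_i, V_i), with (U_i, V_i) i.i.d.
    bivariate Bernoulli with cell probabilities p00, p10, p01, p11
    (summing to 1).  N is a stand-in geometric count here."""
    rng = rng or random.Random(0)
    samples = []
    for _ in range(n_draws):
        n = 1
        while rng.random() > 0.4:     # geometric N with success prob 0.4
            n += 1
        s = t = 0
        for _ in range(n):
            u = rng.random()
            if u < p11:
                s += 1; t += 1        # both components succeed
            elif u < p11 + p10:
                s += 1                # only the first component
            elif u < p11 + p10 + p01:
                t += 1                # only the second component
        samples.append((s, t))
    return samples
```

Because both coordinates share the same N, the construction induces positive dependence between the margins even when the Bernoulli components are only weakly linked.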
To monitor electrical signals from the heart and assess its performance, the electrocardiogram (ECG) is the most common and routine diagnostic instrument employed. Cardiac arrhythmias are only one example of the many heart conditions people might have. ECG records are used to diagnose an arrhythmia, an abnormal cardiac beat that can cause a stroke in extreme circumstances. However, because of the extensive data an ECG contains, it is quite difficult to glean the necessary information through visual analysis, so it is crucial to develop an effective (automatic) method to analyze the vast amounts of available ECG data. For decades, researchers have focused on developing methods to automatically and computationally categorize and identify cardiac arrhythmias; monitoring for arrhythmias in real time, however, remains challenging. To streamline the detection and classification process, this research presents a hybrid deep learning-based technique with two major contributions. First, to automate noise reduction and feature extraction, 1D ECG data are transformed into 2D scalogram images. Second, a combined approach called the Residual attention-based 2D-CNN-LSTM-CNN (RACLC) is proposed by merging multiple learning models, specifically the 2D convolutional neural network (CNN) and the Long Short-Term Memory (LSTM) network; the model's name derives from the combination of these two deep learning architectures. Both the beats themselves, which provide morphological information, and the beats paired with neighbouring segments, which provide temporal information, are essential. The proposed model simultaneously collects and combines time-domain and morphological ECG signal data. Applying an attention block to the network helps strengthen the valuable information, capture the salient content of the ECG signal, and boost the classification efficiency of the model.
To evaluate the efficacy of the proposed RACLC method, we carried out a complete experimental investigation using the widely adopted MIT-BIH arrhythmia database. The experimental results show that the proposed automated detection method is effective.
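The residual attention idea described above can be sketched in a few lines: a soft mask in [0, 1] is computed from the features and applied multiplicatively, with an identity path so that attention amplifies informative regions without suppressing the original signal. This is a numpy illustration of the mechanism only, not the RACLC network's learned attention branch.

```python
import numpy as np

def residual_attention(features):
    """Residual attention over a feature array (sketch).

    Output = features * (1 + mask), where mask in (0, 1) is derived
    from the features themselves; the "+1" identity term is the
    residual connection that preserves the original activations.
    """
    # Sigmoid mask standing in for the learned attention branch.
    mask = 1.0 / (1.0 + np.exp(-features))
    return features * (1.0 + mask)
```

In the full model the mask would be produced by a trained sub-network over the CNN/LSTM feature maps rather than by a fixed sigmoid of the input.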