We study how cross‐country variance in institutions that aim to address core agency problems influences consequential strategic decisions of firms around the world. Scholars frequently argue that the interests of minority shareholders are threatened by mergers and acquisitions (M&As) due to principal‐agent problems. Rather than acting in shareholders’ best interests, managers may act as viceroys, using M&As to cushion themselves from risk and extract more pay. Yet equally salient is the issue of principal‐principal agency, where controlling shareholders can behave as emperors who use M&As to siphon off assets and profits and appropriate the wealth of shareholders with fewer control rights. Taking an institution‐based perspective on these ‘viceroy’ and ‘emperor’ problems, we conjecture that institutions aimed at addressing these agency problems can generate the desired outcome regarding M&A prevalence, but may also produce unintended negative consequences for shareholder value as a side effect. Empirical evidence covering M&As from 73 countries supports our hypotheses.
Many industrial processes contain both linear and nonlinear parts, and kernel principal component analysis (KPCA), widely used in nonlinear process monitoring, may not offer the most effective means for dealing with these nonlinear processes. This paper proposes a new hybrid linear-nonlinear statistical modeling approach for nonlinear process monitoring by closely integrating linear principal component analysis (PCA) and nonlinear KPCA using a serial model structure, which we refer to as serial PCA (SPCA). Specifically, PCA is first applied to extract PCs as linear features, and to decompose the data into the PC subspace and residual subspace (RS). Then, KPCA is performed in the RS to extract the nonlinear PCs as nonlinear features. Two monitoring statistics are constructed for fault detection, based on both the linear and nonlinear features extracted by the proposed SPCA. To effectively perform fault identification after a fault is detected, an SPCA similarity factor method is built for fault recognition, which fuses both the linear and nonlinear features. Unlike PCA and KPCA, the proposed method takes into account both linear and nonlinear PCs simultaneously, and therefore, it can better exploit the underlying process's structure to enhance fault diagnosis performance. Two case studies involving a simulated nonlinear process and the benchmark Tennessee Eastman process demonstrate that the proposed SPCA approach is more effective than the existing state-of-the-art approach based on KPCA alone, in terms of nonlinear process fault detection and identification.
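The serial structure described above (linear PCA first, then kernel PCA on the residual subspace) can be sketched in a few lines of numpy. This is only an illustration of the idea: the function and parameter names (`spca_features`, `n_lin`, `n_nonlin`, `gamma`) are chosen for the example rather than taken from the paper, and the monitoring statistics and similarity factor are omitted.

```python
import numpy as np

def spca_features(X, n_lin=2, n_nonlin=2, gamma=1.0):
    """Sketch of serial PCA (SPCA): linear PCA, then RBF kernel PCA on the
    residual subspace. Names and defaults are illustrative, not from the paper."""
    X = X - X.mean(axis=0)                      # mean-center the data
    # --- Step 1: linear PCA for the leading linear features ---
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    P = Vt[:n_lin].T                            # loadings of the PC subspace
    T_lin = X @ P                               # linear scores (linear features)
    E = X - T_lin @ P.T                         # residual subspace (RS) part
    # --- Step 2: kernel PCA (RBF) performed in the residual subspace ---
    sq = np.sum(E**2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * E @ E.T))
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n         # centering matrix
    Kc = H @ K @ H                              # centered kernel matrix
    w, V = np.linalg.eigh(Kc)
    idx = np.argsort(w)[::-1][:n_nonlin]        # leading kernel eigenpairs
    T_nonlin = V[:, idx] * np.sqrt(np.maximum(w[idx], 0))  # nonlinear scores
    return T_lin, T_nonlin

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
T_lin, T_nonlin = spca_features(X)
```

In a monitoring setting, a T² statistic would typically be computed from `T_lin` and an analogous statistic from `T_nonlin`, with control limits estimated from normal operating data.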
We propose that in a context where corporate ownership is concentrated, the controlling shareholder of a firm tends to symbolically comply with regulatory requirements that aim to protect minority shareholders; yet the presence of multiple large shareholders can serve as an internal monitoring mechanism that reduces symbolic compliance. We test this argument by examining firm responses to a regulatory requirement regarding independent accounting director appointments in China. Using data on China's listed non‐state‐owned enterprises, we find that the presence of multiple large shareholders decreases the likelihood of symbolic compliance, and this negative effect is stronger when noncontrolling large shareholders have low incentives to collude with the controlling shareholder. We also find that a firm engaging in symbolic compliance tends to have a greater level of tunnelling (by the largest shareholder) and earnings management. Our study contributes to the literature on symbolic management in an institutional setting where ownership is concentrated.
In recent years, radar has been employed as a fall detector because of its effective sensing capabilities and penetration through walls. In this paper, we introduce a multilinear subspace human activity recognition scheme that exploits the three radar signal variables: slow-time, fast-time, and Doppler frequency. The proposed approach attempts to find the optimum subspaces that minimize the reconstruction error for different modes of the radar data cube. A comprehensive analysis of the optimization considerations is performed, such as initialization, number of projections, and convergence of the algorithms. Finally, a boosting scheme is proposed combining the unsupervised multilinear principal component analysis (PCA) with the supervised methods of linear discriminant analysis and shallow neural networks. Experimental results based on real radar data obtained from multiple subjects, different locations, and aspect angles (0°, 30°, 45°, 60°, and 90°) demonstrate that the proposed algorithm yields the highest overall classification accuracy among spectrogram-based methods including predefined physical features, one- and two-dimensional PCA and convolutional neural networks.
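The multilinear subspace idea, one projection matrix per mode of the radar data cube, fitted to minimize reconstruction error, can be sketched with an alternating eigendecomposition of mode unfoldings. This is a toy numpy version under illustrative assumptions: the variable names, ranks, and iteration count are not from the paper, and the boosting/classification stages are omitted.

```python
import numpy as np

def mpca(tensors, ranks=(2, 3, 2), n_iter=5):
    """Toy multilinear PCA: one projection matrix per mode, fit by
    alternating eigendecompositions of mode unfoldings (illustrative only)."""
    X = np.stack(tensors).astype(float)     # samples x mode0 x mode1 x mode2
    X = X - X.mean(axis=0)                  # center across samples
    dims = X.shape[1:]
    U = [np.eye(d)[:, :r] for d, r in zip(dims, ranks)]   # initial projections
    for _ in range(n_iter):
        for m in range(3):                  # update one mode, fixing the others
            Y = X
            for k in range(3):
                if k != m:                  # project the other two modes
                    Y = np.moveaxis(np.tensordot(Y, U[k], axes=([k + 1], [0])), -1, k + 1)
            Ym = np.moveaxis(Y, m + 1, -1).reshape(-1, dims[m])  # mode-m fibers
            w, V = np.linalg.eigh(Ym.T @ Ym)
            U[m] = V[:, np.argsort(w)[::-1][:ranks[m]]]  # top eigenvectors
    Z = X
    for k in range(3):                      # project the centered cube
        Z = np.moveaxis(np.tensordot(Z, U[k], axes=([k + 1], [0])), -1, k + 1)
    return Z.reshape(Z.shape[0], -1), U     # flattened multilinear features
```

The flattened features would then feed a supervised classifier such as LDA or a shallow network, as in the boosting scheme the abstract describes.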
As an unsupervised dimensionality reduction method, the principal component analysis (PCA) has been widely considered as an efficient and effective preprocessing step for hyperspectral image (HSI) processing and analysis tasks. It takes each band as a whole and globally extracts the most representative bands. However, different homogeneous regions correspond to different objects, whose spectral features are diverse. Therefore, it is inappropriate to carry out dimensionality reduction through a unified projection for an entire HSI. In this paper, a simple but very effective superpixelwise PCA (SuperPCA) approach is proposed to learn the intrinsic low-dimensional features of HSIs. In contrast to classical PCA models, the SuperPCA has four main properties: 1) unlike the traditional PCA method based on a whole image, the SuperPCA takes into account the diversity in different homogeneous regions, that is, different regions should have different projections; 2) most of the conventional feature extraction models cannot directly use the spatial information of HSIs, while the SuperPCA is able to incorporate the spatial context information into the unsupervised dimensionality reduction by superpixel segmentation; 3) since the regions obtained by superpixel segmentation have homogeneity, the SuperPCA can extract potential low-dimensional features even under noise; and 4) although the SuperPCA is an unsupervised method, it can achieve a competitive performance when compared with supervised approaches. The resulting features are discriminative, compact, and noise-resistant, leading to an improved HSI classification performance. Experiments on three public data sets demonstrate that the SuperPCA model significantly outperforms the conventional PCA-based dimensionality reduction baselines for HSI classification, and some state-of-the-art feature extraction approaches. The MATLAB source code is available at https://github.com/junjun-jiang/SuperPCA.
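The core of the approach, a separate PCA projection per homogeneous region rather than one global projection, fits in a short numpy sketch. The segmentation labels are assumed given (any superpixel method would do), and the function and parameter names are illustrative rather than the authors' API (their released code is in MATLAB).

```python
import numpy as np

def superpca(pixels, labels, n_components=3):
    """Sketch of superpixelwise PCA: a separate PCA per homogeneous region.
    `pixels` is (n_pixels, n_bands); `labels` assigns each pixel a region id.
    Names are illustrative, not taken from the authors' released code."""
    out = np.zeros((pixels.shape[0], n_components))
    for lab in np.unique(labels):
        idx = np.where(labels == lab)[0]
        Xr = pixels[idx] - pixels[idx].mean(axis=0)   # center this region only
        U, s, Vt = np.linalg.svd(Xr, full_matrices=False)
        out[idx] = Xr @ Vt[:n_components].T           # region-specific projection
    return out
```

Because each region gets its own loading matrix, spectrally distinct regions are no longer forced through a single global projection, which is precisely property 1) in the abstract.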
Edge-preserving features (EPFs) obtained by the application of edge-preserving filters to hyperspectral images (HSIs) have been found very effective in characterizing significant spectral and spatial structures of objects in a scene. However, a direct use of the EPFs can be insufficient to provide a complete characterization of spatial information when objects of different scales are present in the considered images. Furthermore, the edge-preserving smoothing operation unavoidably decreases the spectral differences among objects of different classes, which may affect the following classification. To overcome these problems, in this paper, a novel principal component analysis (PCA)-based EPFs (PCA-EPFs) method for HSI classification is proposed, which consists of the following steps. First, the standard EPFs are constructed by applying edge-preserving filters with different parameter settings to the considered image, and the resulting EPFs are stacked together. Next, the spectral dimension of the stacked EPFs is reduced with the PCA, which not only can represent the EPFs in the mean square sense but also highlight the separability of pixels in the EPFs. Finally, the resulting PCA-EPFs are classified by a support vector machine (SVM) classifier. Experiments performed on several real hyperspectral data sets show the effectiveness of the proposed PCA-EPFs, which sharply improves the accuracy of the SVM classifier with respect to the standard edge-preserving filtering-based feature extraction method, and other widely used spectral-spatial classifiers.
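The filter-stack-reduce pipeline above can be illustrated compactly. Note the hedges: a plain box filter stands in for a true edge-preserving filter (e.g. a guided filter), the kernel sizes and component count are arbitrary, and the final SVM stage is left out.

```python
import numpy as np

def smooth(img, k):
    """Box filter over a (2k+1)x(2k+1) window; a simple stand-in for a true
    edge-preserving filter such as a guided filter."""
    pad = np.pad(img, ((k, k), (k, k), (0, 0)), mode='edge')
    out = np.zeros(img.shape)
    for dy in range(-k, k + 1):
        for dx in range(-k, k + 1):
            out += pad[k + dy: k + dy + img.shape[0],
                       k + dx: k + dx + img.shape[1]]
    return out / (2 * k + 1) ** 2

def pca_epfs(hsi, kernel_sizes=(1, 2), n_components=3):
    """Sketch of the PCA-EPFs pipeline: filter the cube with several parameter
    settings, stack the filtered cubes along the band axis, then reduce the
    stacked features with PCA. All parameter values are illustrative."""
    feats = np.concatenate([smooth(hsi, k) for k in kernel_sizes], axis=2)
    X = feats.reshape(-1, feats.shape[2])
    X = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    reduced = X @ Vt[:n_components].T       # leading components of the stack
    return reduced.reshape(hsi.shape[0], hsi.shape[1], n_components)
```

The reduced pixel vectors would then be fed to an SVM, as in the abstract's final step.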
•We initiate a rigorous and comprehensive review of RPCA-PCP based methods.•We investigate how these methods are solved.•We investigate whether incremental algorithms can be achieved.•We investigate whether real-time implementations can be achieved.•A comparative evaluation on the BMC dataset shows the performance of 13 recent RPCA methods.
Foreground detection is the first step in a video surveillance system to detect moving objects. Recent research on subspace estimation by sparse representation and rank minimization offers a natural framework for separating moving objects from the background. Robust Principal Component Analysis (RPCA) solved via Principal Component Pursuit decomposes a data matrix A into two components such that A=L+S, where L is a low-rank matrix and S is a sparse noise matrix. The background sequence is then modeled by a low-rank subspace that can gradually change over time, while the moving foreground objects constitute the correlated sparse outliers. To date, many efforts have been made to develop Principal Component Pursuit (PCP) methods with reduced computational cost that perform visually well in foreground detection. However, no current algorithm appears able to simultaneously address all the key challenges that accompany real-world videos. This is due, in part, to the absence of a rigorous quantitative evaluation on synthetic and realistic large-scale datasets with accurate ground truth providing balanced coverage of the range of challenges present in the real world. In this context, this work aims to initiate a rigorous and comprehensive review of RPCA-PCP based methods for testing and ranking existing algorithms for foreground detection. To this end, we first review recent developments in the field of RPCA solved via Principal Component Pursuit. We then investigate how these methods are solved and whether incremental algorithms and real-time implementations can be achieved for foreground detection. Finally, experimental results on the Background Models Challenge (BMC) dataset, which contains different synthetic and real datasets, show the comparative performance of these recent methods.
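The A=L+S decomposition at the heart of this survey can be sketched with a minimal Principal Component Pursuit solver in the inexact-ALM style. This is a sketch under common default choices (the standard λ = 1/√max(m,n) weight and a simple penalty schedule), not a tuned implementation of any specific method the review covers.

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: proximal operator of the nuclear norm."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def rpca_pcp(A, n_iter=100):
    """Minimal RPCA via Principal Component Pursuit (inexact-ALM flavour):
    split A = L + S with L low-rank (background) and S sparse (foreground).
    The weight lam and penalty schedule are common defaults, not tuned values."""
    m, n = A.shape
    lam = 1.0 / np.sqrt(max(m, n))          # standard PCP sparsity weight
    mu = 0.25 * m * n / np.abs(A).sum()     # common initial penalty heuristic
    S = np.zeros_like(A)
    Y = np.zeros_like(A)                    # Lagrange multiplier for A = L + S
    for _ in range(n_iter):
        L = svt(A - S + Y / mu, 1.0 / mu)   # low-rank update (nuclear-norm prox)
        R = A - L + Y / mu
        S = np.sign(R) * np.maximum(np.abs(R) - lam / mu, 0.0)  # soft-threshold
        Y = Y + mu * (A - L - S)            # dual ascent on the constraint
        mu = min(mu * 1.2, 1e7)             # gradually tighten the penalty
    return L, S
```

For video, each column of A would be one vectorized frame, so L recovers the slowly changing background and the support of S marks foreground pixels.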
•A data mining procedure to forecast daily stock market return is proposed.•The raw data includes 60 financial and economic features over a 10-year period.•Combining ANNs with PCA gives slightly higher classification accuracy.•Combining ANNs with PCA provides significantly higher risk-adjusted profits.
In financial markets, it is both important and challenging to forecast the daily direction of the stock market return. Among the few studies that focus on predicting daily stock market returns, the data mining procedures utilized are either incomplete or inefficient, especially when a large number of features are involved. This paper presents a complete and efficient data mining process to forecast the daily direction of the S&P 500 Index ETF (SPY) return based on 60 financial and economic features. Three mature dimensionality reduction techniques, including principal component analysis (PCA), fuzzy robust principal component analysis (FRPCA), and kernel-based principal component analysis (KPCA), are applied to the whole data set to simplify and rearrange the original data structure. Corresponding to different levels of dimensionality reduction, twelve new data sets are generated from the entire cleaned data using each of the three dimensionality reduction methods. Artificial neural networks (ANNs) are then applied to the thirty-six transformed data sets for classification to forecast the daily direction of future market returns. Moreover, the three dimensionality reduction methods are compared with respect to the original data set. A group of hypothesis tests is then performed over the classification and simulation results to show that combining the ANNs with the PCA gives slightly higher classification accuracy than the other two combinations, and that the trading strategies guided by the comprehensive classification mining procedure based on PCA and ANNs gain significantly higher risk-adjusted profits than the comparison benchmarks, as well as slightly higher profits than the strategies guided by the forecasts based on the FRPCA and KPCA models.
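The reduce-then-classify pipeline can be sketched end to end in numpy. This is a toy version under stated assumptions: a tiny one-hidden-layer network trained by full-batch gradient descent stands in for the paper's ANNs, synthetic data stands in for the 60 financial features, and every hyperparameter here is illustrative.

```python
import numpy as np

def pca_reduce(X, n_components):
    """PCA step: project the feature matrix onto its leading components."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

def train_ann(X, y, hidden=8, lr=0.1, epochs=500, seed=0):
    """Tiny one-hidden-layer network (tanh hidden, sigmoid output) trained
    with gradient descent on cross-entropy; a stand-in for the paper's ANNs.
    All hyperparameters are illustrative. Returns a predict function."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(scale=0.5, size=(X.shape[1], hidden))
    b1 = np.zeros(hidden)
    W2 = rng.normal(scale=0.5, size=hidden)
    b2 = 0.0
    for _ in range(epochs):
        H = np.tanh(X @ W1 + b1)                  # hidden activations
        p = 1.0 / (1.0 + np.exp(-(H @ W2 + b2)))  # predicted up-probability
        g = (p - y) / len(y)                      # cross-entropy output gradient
        W2 -= lr * H.T @ g
        b2 -= lr * g.sum()
        gH = np.outer(g, W2) * (1 - H**2)         # backprop through tanh
        W1 -= lr * X.T @ gH
        b1 -= lr * gH.sum(axis=0)
    return lambda Z: 1.0 / (1.0 + np.exp(-(np.tanh(Z @ W1 + b1) @ W2 + b2))) > 0.5
```

In the paper's setup this would be repeated for each of the twelve reduced data sets per reduction method, with the predicted direction driving the trading rule.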
The principal-principal perspective is tested and extended in the context of corporate takeovers of Chinese publicly listed firms from 1998 to 2007. The resistance of a target firm's controlling shareholder toward potential takeovers reflects the conflict between the principal and minority shareholders. It was found that this resistance weakens when target firms are located in regions with more institutional development, where the minority shareholders' interests are better protected. The resistance also decreases for target firms with CEOs who are politically connected, as these CEOs may be more interested in their own political careers than in representing the interests of the controlling shareholders.
This study sought to explore teachers' perceptions of new principals (NPs) and how these perceptions influenced different aspects of their work environment. The research was conducted using case study methodology of three schools in Melbourne, Australia. Data collection tools included semi-structured interviews of teachers and principals, supported by non-participant observations and the study of school documents. The results showed that teachers' perceptions of their NP were a function of the incomer's personal and leadership qualities and practices, which, in turn, were informed by three contextual factors: school leadership history, the origin and background of the NP, and teacher expectations. These perceptions appeared to influence several domains within teachers' work environment, mainly teacher morale and, to a lesser extent, teacher professional development. A new conceptual model for understanding teachers' perceptions of an NP has been distilled from the data.