Objective. Most current electroencephalography (EEG)-based brain-computer interfaces (BCIs) are based on machine learning algorithms. There is a large diversity of classifier types used in this field, as described in our 2007 review paper. Now, approximately ten years after that review was published, many new algorithms have been developed and tested to classify EEG signals in BCIs. The time is therefore ripe for an updated review of EEG classification algorithms for BCIs. Approach. We surveyed the BCI and machine learning literature from 2007 to 2017 to identify the new classification approaches that have been investigated to design BCIs. We synthesize these studies in order to present such algorithms, to report how they were used for BCIs and what their outcomes were, and to identify their pros and cons. Main results. We found that the recently designed classification algorithms for EEG-based BCIs can be divided into four main categories: adaptive classifiers, matrix and tensor classifiers, transfer learning, and deep learning, plus a few other miscellaneous classifiers. Among these, adaptive classifiers were demonstrated to be generally superior to static ones, even with unsupervised adaptation. Transfer learning can also prove useful, although its benefits remain unpredictable. Riemannian geometry-based methods have reached state-of-the-art performance on multiple BCI problems and deserve to be explored more thoroughly, along with tensor-based methods. Shrinkage linear discriminant analysis and random forests also appear particularly useful for small training sample settings. On the other hand, deep learning methods have not yet shown convincing improvement over state-of-the-art BCI methods. Significance. This paper provides a comprehensive overview of the modern classification algorithms used in EEG-based BCIs, presents their principles, and gives guidelines on when and how to use them. It also identifies a number of challenges to further advance EEG classification in BCI.
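To make the Riemannian geometry-based approach mentioned above concrete, here is a minimal sketch of a minimum-distance-to-mean (MDM) classifier over trial covariance matrices, assuming numpy. The affine-invariant distance formula is standard; the class mean covariances are assumed precomputed (practical implementations estimate them with an iterative Riemannian mean, which is omitted here).

```python
import numpy as np

def airm_distance(A, B):
    """Affine-invariant Riemannian distance between SPD matrices A and B,
    computed from the generalized eigenvalues of the pencil (B, A)."""
    w = np.linalg.eigvals(np.linalg.solve(A, B)).real
    return np.sqrt(np.sum(np.log(w) ** 2))

def mdm_predict(cov_trial, class_means):
    """Assign a trial covariance matrix to the class with the nearest
    mean covariance under the Riemannian distance."""
    dists = [airm_distance(cov_trial, M) for M in class_means]
    return int(np.argmin(dists))
```

A useful property visible in this sketch is congruence invariance: applying any invertible spatial filter W to the data transforms every covariance as W C Wᵀ and leaves all pairwise distances, and hence the classification, unchanged.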
This paper reviews recent progress in the diagnosis of Alzheimer's disease (AD) from electroencephalograms (EEG). Three major effects of AD on EEG have been observed: slowing of the EEG, reduced complexity of the EEG signals, and perturbations in EEG synchrony. In recent years, a variety of sophisticated computational approaches has been proposed to detect those subtle perturbations in the EEG of AD patients. The paper first describes methods that try to detect slowing of the EEG. Next, the paper deals with several measures of EEG complexity, and explains how those measures have been used to study fluctuations in EEG complexity in AD patients. Then, various measures of EEG synchrony are considered in the context of AD diagnosis. The issue of EEG pre-processing is also briefly addressed: before one can analyze EEG, it is necessary to remove artifacts due to, for example, head and eye movements or interference from electronic equipment. Pre-processing of EEG has in recent years received much attention. In this paper, several state-of-the-art pre-processing techniques are outlined, for example, based on blind source separation and other non-linear filtering paradigms. In addition, the paper outlines opportunities and limitations of computational approaches for diagnosing AD based on EEG. Finally, future challenges and open problems are discussed.
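The "slowing of the EEG" that this review discusses is commonly quantified as an increased fraction of spectral power in the low-frequency bands. A minimal sketch, assuming numpy; real studies typically use Welch-type averaged spectrograms rather than a single raw periodogram, and the band limits below are just the conventional theta range:

```python
import numpy as np

def relative_band_power(x, fs, band):
    """Fraction of total (non-DC) spectral power of signal x, sampled at
    fs Hz, that falls inside the frequency band [band[0], band[1]) Hz."""
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2          # raw periodogram
    mask = (freqs >= band[0]) & (freqs < band[1])
    return psd[mask].sum() / psd[1:].sum()     # skip the DC bin in the total
```

EEG slowing would then show up as relative delta/theta power rising (and alpha/beta power falling) in AD patients relative to controls.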
It is well known that EEG signals of Alzheimer's disease (AD) patients are generally less synchronous than those of age-matched control subjects. However, this effect is not always easily detectable. This is especially the case for patients in the pre-symptomatic phase, commonly referred to as mild cognitive impairment (MCI), during which neuronal degeneration occurs before clinical symptoms appear. In this paper, various synchrony measures are studied in the context of AD diagnosis, including the correlation coefficient, mean-square and phase coherence, Granger causality, phase synchrony indices, information-theoretic divergence measures, state-space-based measures, and the recently proposed stochastic event synchrony measures. Experiments with EEG data show that many of those measures are strongly correlated (or anti-correlated) with the correlation coefficient, and hence provide little complementary information about EEG synchrony. Measures that are only weakly correlated with the correlation coefficient include the phase synchrony indices, Granger causality measures, and stochastic event synchrony measures. In addition, those three families of synchrony measures are mutually uncorrelated, and therefore each seems to capture a specific kind of interdependence. For the data set at hand, only two synchrony measures are able to convincingly distinguish MCI patients from age-matched control subjects, i.e., Granger causality (in particular, the full-frequency directed transfer function) and stochastic event synchrony. Those two measures are used as features to distinguish MCI patients from age-matched control subjects, yielding a leave-one-out classification rate of 83%. The classification performance may be further improved by adding complementary features from EEG; this approach may eventually lead to a reliable EEG-based diagnostic tool for MCI and AD.
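Two of the measure families compared in this abstract, the correlation coefficient and a phase synchrony index, can be sketched in a few lines. The sketch below (assuming numpy, even-length signals, and an FFT-based analytic signal in place of scipy's Hilbert transform) computes the phase locking value (PLV), one common phase synchrony index; it is not the paper's exact pipeline:

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via the FFT (assumes even-length input)."""
    X = np.fft.fft(x)
    h = np.zeros(len(x))
    h[0] = h[len(x) // 2] = 1.0
    h[1:len(x) // 2] = 2.0          # double positive frequencies
    return np.fft.ifft(X * h)

def phase_locking_value(x, y):
    """PLV = |mean of exp(i * instantaneous phase difference)|.
    1 means perfectly locked phases, ~0 means no phase relationship."""
    dphi = np.angle(analytic_signal(x)) - np.angle(analytic_signal(y))
    return np.abs(np.exp(1j * dphi).mean())
```

The two measures can disagree in an instructive way: two sinusoids offset by 60 degrees have a correlation of only 0.5 but a PLV of 1, which is one reason phase synchrony indices carry information complementary to the correlation coefficient.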
Trichloroethylene (TCE) and perchloroethylene or tetrachloroethylene (PCE) are high-production-volume chemicals with numerous industrial applications. As a consequence of their widespread use, these chemicals are ubiquitous environmental contaminants to which the general population is commonly exposed. It is widely assumed that TCE and PCE are toxicologically similar; both are simple olefins with three (TCE) or four (PCE) chlorines. Nonetheless, despite decades of research on the adverse health effects of TCE or PCE, few studies have directly compared these two toxicants. Although the metabolic pathways are qualitatively similar, quantitative differences in the flux and yield of metabolites exist. Recent human health assessments have uncovered some overlap in target organs that are affected by exposure to TCE or PCE, and divergent species- and sex-specificity with regard to cancer and noncancer hazards. The objective of this minireview is to highlight key similarities, differences, and data gaps in target organ metabolism and mechanism of toxicity. The main anticipated outcome of this review is to encourage research to 1) directly compare the responses to TCE and PCE using more sensitive biochemical techniques and robust statistical comparisons; 2) more closely examine interindividual variability in the relationship between toxicokinetics and toxicodynamics for TCE and PCE; 3) elucidate the effect of coexposure to these two toxicants; and 4) explore new mechanisms for target organ toxicity associated with TCE and/or PCE exposure.
A multilayer approach to nonnegative matrix factorisation algorithms is proposed. It considerably improves their performance, especially when a problem is ill-conditioned or the data are badly scaled, and when projected gradient algorithms are used. This is fully confirmed by extensive simulations with diverse types of data in application to blind source separation.
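The multilayer idea is to cascade factorisations: factor X ≈ A₁S₁, then factor S₁ ≈ A₂S₂, and so on, so that X ≈ A₁A₂⋯S. A minimal sketch assuming numpy, with a basic multiplicative-update NMF as the per-layer solver (the paper above advocates projected gradient solvers; multiplicative updates are used here only to keep the sketch short):

```python
import numpy as np

def nmf(X, r, iters=200, rng=None):
    """Basic multiplicative-update NMF: X ~= A @ S, all factors nonnegative."""
    rng = rng or np.random.default_rng(0)
    m, n = X.shape
    A = rng.random((m, r)) + 1e-3
    S = rng.random((r, n)) + 1e-3
    for _ in range(iters):
        S *= (A.T @ X) / (A.T @ A @ S + 1e-12)
        A *= (X @ S.T) / (A @ S @ S.T + 1e-12)
    return A, S

def multilayer_nmf(X, r, layers=3):
    """Cascade NMF: X ~= A1 @ A2 @ ... @ S, re-factoring S at each layer."""
    A_total = np.eye(X.shape[0])
    S = X
    for _ in range(layers):
        A, S = nmf(S, r)
        A_total = A_total @ A
    return A_total, S
```

Each extra layer re-conditions the problem seen by the next solver, which is the mechanism behind the robustness to bad scaling reported in the abstract.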
A new generalized multilinear regression model, termed higher order partial least squares (HOPLS), is introduced with the aim of predicting a tensor (multiway array) Y from a tensor X by projecting the data onto a latent space and performing regression on the corresponding latent variables. HOPLS differs substantially from other regression models in that it explains the data by a sum of orthogonal Tucker tensors, while the number of orthogonal loadings serves as a parameter to control model complexity and prevent overfitting. The low-dimensional latent space is optimized sequentially via a deflation operation, yielding the best joint subspace approximation for both X and Y. Instead of decomposing X and Y individually, higher order singular value decomposition on a newly defined generalized cross-covariance tensor is employed to optimize the orthogonal loadings. A systematic comparison on both synthetic data and real-world decoding of 3D movement trajectories from electrocorticogram signals demonstrates the advantages of HOPLS over existing methods in terms of better predictive ability, suitability for small sample sizes, and robustness to noise.
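The sequential deflation that HOPLS generalizes is easiest to see in the two-way special case, ordinary PLS1 regression, where each step extracts the direction maximizing covariance between X scores and y and then deflates X. A minimal numpy sketch of that special case (not the tensor algorithm itself, which replaces these rank-one steps with orthogonal Tucker blocks obtained from a cross-covariance HOSVD):

```python
import numpy as np

def pls_regression(X, y, n_components):
    """PLS1 via NIPALS-style deflation: returns regression coefficients B
    such that X @ B approximates y."""
    Xk, yk = X.astype(float).copy(), y.astype(float).copy()
    W, P, q = [], [], []
    for _ in range(n_components):
        w = Xk.T @ yk                       # covariance-maximizing direction
        w /= np.linalg.norm(w)
        t = Xk @ w                          # latent scores
        p = Xk.T @ t / (t @ t)              # X loadings
        qk = yk @ t / (t @ t)               # y loading
        Xk -= np.outer(t, p)                # deflation: remove explained part
        yk -= qk * t
        W.append(w); P.append(p); q.append(qk)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    return W @ np.linalg.inv(P.T @ W) @ q   # closed-form coefficients
```

With as many components as the rank of X, this reduces to the least-squares fit; using fewer components is what provides the complexity control that the abstract highlights.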
Addition of menthol to cigarettes may be associated with increased initiation of smoking. The potential mechanisms underlying this association are not known. Menthol, likely due to its effects on cold-sensing peripheral sensory neurons, is known to inhibit the sensation of irritation elicited by respiratory irritants. However, it remains unclear whether menthol modulates cigarette smoke irritancy and nicotine absorption during initial exposures to cigarettes, thereby facilitating smoking initiation. Using plethysmography in a C57Bl/6J mouse model, we examined the effects of L-menthol, the menthol isomer added to cigarettes, on the respiratory sensory irritation response to primary smoke irritants (acrolein and cyclohexanone) and smoke of Kentucky reference 2R4 cigarettes. We also studied L-menthol's effect on blood levels of the nicotine metabolite cotinine immediately after exposure to cigarette smoke. L-menthol suppressed the irritation response to acrolein with an apparent IC₅₀ of 4 ppm. Suppression was observed even at acrolein levels well above those necessary to produce a maximal response. Cigarette smoke, at exposure levels of 10 mg/m³ or higher, caused an immediate and marked sensory irritation response in mice. This response was significantly suppressed by L-menthol even at smoke concentrations as high as 300 mg/m³. Counterirritation by L-menthol was abolished by treatment with a selective inhibitor of Transient Receptor Potential Melastatin 8 (TRPM8), the neuronal cold/menthol receptor. Inclusion of menthol in the cigarette smoke resulted in roughly a 1.5-fold increase in plasma cotinine levels over those observed in mice exposed to smoke without added menthol. These findings document that L-menthol, acting through TRPM8, is a strong suppressor of respiratory irritation responses, even during highly noxious exposures to cigarette smoke or smoke irritants, and increases blood cotinine.
Therefore, L-menthol, as a cigarette additive, may promote smoking initiation and nicotine addiction.
Information geometry of divergence functions. Amari, S.; Cichocki, A. Bulletin of the Polish Academy of Sciences: Technical Sciences, 03/2010, Vol. 58, No. 1. Journal article, peer-reviewed, open access.
Measures of divergence between two points play a key role in many engineering problems. One such measure is a distance function, but there are many important measures which do not satisfy the properties of a distance. The Bregman divergence, Kullback-Leibler divergence and f-divergence are such measures. In the present article, we study the differential-geometrical structure of a manifold induced by a divergence function. It consists of a Riemannian metric and a pair of dually coupled affine connections, which are studied in information geometry. The class of Bregman divergences is characterized by a dually flat structure, which originates from the Legendre duality. A dually flat space admits a generalized Pythagorean theorem. The class of f-divergences, defined on a manifold of probability distributions, is characterized by information monotonicity, and the Kullback-Leibler divergence belongs to the intersection of both classes. The f-divergence always gives the α-geometry, which consists of the Fisher information metric and a dual pair of ±α-connections. The α-divergence is a special class of f-divergences. It is unique, sitting at the intersection of the f-divergence and Bregman divergence classes in a manifold of positive measures. The geometry derived from the Tsallis q-entropy and related divergences is also addressed.
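The claim that the Kullback-Leibler divergence lies in the intersection of the Bregman and f-divergence classes can be checked numerically: it is the Bregman divergence generated by negative entropy φ(p) = Σ p log p, and the f-divergence generated by f(t) = t log t. A minimal sketch assuming numpy and strictly positive discrete distributions:

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence between discrete distributions p, q."""
    return np.sum(p * np.log(p / q))

def bregman_neg_entropy(p, q):
    """Bregman divergence of phi(p) = sum p log p:
    D(p||q) = phi(p) - phi(q) - <grad phi(q), p - q>."""
    phi = lambda x: np.sum(x * np.log(x))
    grad = np.log(q) + 1.0
    return phi(p) - phi(q) - grad @ (p - q)

def f_divergence(p, q, f):
    """General f-divergence: D_f(p||q) = sum q * f(p / q)."""
    return np.sum(q * f(p / q))
```

For probability vectors the two constructions coincide with KL exactly, because the linear term of the Bregman form reduces to Σ (p - q) log q when Σ p = Σ q = 1.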
In addition to helping better understand how the human brain works, the brain-computer interface neuroscience paradigm allows researchers to develop a new class of bioengineering control devices and ...robots, offering promise for rehabilitation and other medical applications as well as exploring possibilities for advanced human-computer interfaces.
The development of reinforcement learning methods has extended their application to many areas, including algorithmic trading. In this paper, trading on the stock exchange is framed as a game with the Markov property, consisting of states, actions, and rewards. A system for trading a fixed volume of a financial instrument is proposed and experimentally tested; it is based on the asynchronous advantage actor-critic (A3C) method with the use of several neural network architectures. The application of recurrent layers in this approach is also investigated. The experiments were performed on real anonymized data. The best architecture demonstrated a trading strategy for the RTS Index futures (MOEX:RTSI) with a profitability of 66% per annum, accounting for commission. The project source code is available via the following link: http://github.com/evgps/a3c_trading
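The Markov framing described in this abstract (states, actions, rewards) can be illustrated far more simply than with A3C. The toy sketch below, assuming numpy, uses tabular Q-learning on a synthetic alternating price series: the state is the sign of the previous return, the action is a fixed-volume position in {short, flat, long}, and the reward is the position times the next price change. It is an illustration of the MDP formulation only, not of the paper's A3C architecture or data:

```python
import numpy as np

def train_q_trader(prices, episodes=200, alpha=0.2, gamma=0.9, eps=0.1):
    """Tabular Q-learning on a toy fixed-volume trading MDP.
    State: sign of previous return (0 = down, 1 = up).
    Action index: 0 = short (-1), 1 = flat (0), 2 = long (+1).
    Reward: position * next price change."""
    actions = (-1, 0, 1)
    Q = np.zeros((2, 3))
    rng = np.random.default_rng(0)
    returns = np.diff(prices)
    for _ in range(episodes):
        for t in range(1, len(returns)):
            s = int(returns[t - 1] > 0)
            a = rng.integers(3) if rng.random() < eps else int(np.argmax(Q[s]))
            r = actions[a] * returns[t]                 # P&L of the position
            s2 = int(returns[t] > 0)
            Q[s, a] += alpha * (r + gamma * np.max(Q[s2]) - Q[s, a])
    return Q
```

On a deterministic mean-reverting series the learned policy is contrarian: go short after an up-tick and long after a down-tick, which is exactly what the reward structure of this MDP encodes.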