Our ability to acknowledge and recognize our own identity — our “self” — is a characteristic doubtless unique to humans. Where does this feeling come from? How does the combination of neurophysiological processes and our interaction with the outside world construct this coherent identity? We know that our social interactions contribute via the eyes, ears, etc. However, our self is not only influenced by our senses. It is also influenced by the actions we perform and those we see others perform. Our brain anticipates the effects of our own actions and simulates the actions of others. In this way, we become able to understand ourselves and to understand the actions and emotions of others. This book describes the new field of “Motor Cognition”. Though motor actions have long been studied by neuroscientists and physiologists, it is only recently that scientists have considered the role of actions in building the self. How consciousness of action is part of self-consciousness, how one's own actions determine the sense of being an agent, how actions performed by others affect us as we understand others, differentiate ourselves from them, and learn from them: these questions are raised and discussed throughout the book, drawing on experimental, clinical, and theoretical bases. The advent of new neuroscience techniques, such as neuroimaging and direct electrical brain stimulation, together with a renewal of behavioral methods in cognitive psychology, provides new insights into this area. Mental imagery of action, self-recognition, consciousness of action, and imitation can be objectively studied using these new tools. The results of these investigations shed light on clinical disorders in neurology, psychiatry, and neurodevelopment.
The modeling of stochastic dependence is fundamental for understanding random systems evolving in time. When measured through linear correlation, many of these systems exhibit a slow correlation decay, a phenomenon often referred to as long memory or long-range dependence. An example of this is the absolute returns of equity data in finance. Selfsimilar stochastic processes (particularly fractional Brownian motion) have long been postulated as a means to model this behavior, and the concept of selfsimilarity for a stochastic process is now proving to be extraordinarily useful. Selfsimilarity translates into the equality in distribution between the process under a linear time change and the same process properly scaled in space, a simple scaling property that yields a remarkably rich theory with far-flung applications. After a short historical overview, this book describes the current state of knowledge about selfsimilar processes and their applications. Concepts, definitions, and basic properties are emphasized, giving the reader a road map of the realm of selfsimilarity that allows for further exploration. Such topics as noncentral limit theory, long-range dependence, and operator selfsimilarity are covered alongside statistical estimation, simulation, sample path properties, and stochastic differential equations driven by selfsimilar processes. Numerous references point the reader to current applications.
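The scaling property described in the abstract can be stated compactly. Writing H > 0 for the selfsimilarity (Hurst) exponent, a process {X(t)} is H-selfsimilar when

```latex
\{X(at)\}_{t \ge 0} \;\overset{d}{=}\; \{a^{H} X(t)\}_{t \ge 0}, \qquad \text{for every } a > 0,
```

where the equality is of finite-dimensional distributions. Fractional Brownian motion with Hurst parameter H ∈ (0, 1) is the standard Gaussian example; for H > 1/2 its increments exhibit the slow correlation decay (long-range dependence) mentioned above.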
Tempering stable processes
Rosiński, Jan
Stochastic Processes and their Applications, 06/2007, Volume 117, Issue 6
Journal article, peer-reviewed, open access
A tempered stable Lévy process combines both the α-stable and Gaussian trends. In a short time frame it is close to an α-stable process, while in a long time frame it approximates a Brownian motion. In this paper we consider a general and robust class of multivariate tempered stable distributions and establish their identifiable parametrization. We prove short- and long-time behavior of tempered stable Lévy processes and investigate their absolute continuity with respect to the underlying α-stable processes. We find probabilistic representations of tempered stable processes which specifically show how such processes are obtained by cutting (tempering) jumps of stable processes. These representations exhibit α-stable and Gaussian tendencies in tempered stable processes and thus give probabilistic intuition for their study. Such representations can also be used for simulation. We also develop the corresponding representations for Ornstein–Uhlenbeck-type processes.
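In one dimension, the tempering of jumps described in the abstract can be illustrated by multiplying the α-stable Lévy measure by an exponentially decaying factor. The following exponentially tempered form (the constants c±, λ± are illustrative parameters, not taken from the paper, which treats a more general multivariate class) is a common special case:

```latex
\nu(dx) \;=\; \left( \frac{c_{+}\, e^{-\lambda_{+} x}}{x^{1+\alpha}}\, \mathbf{1}_{\{x > 0\}}
\;+\; \frac{c_{-}\, e^{-\lambda_{-} |x|}}{|x|^{1+\alpha}}\, \mathbf{1}_{\{x < 0\}} \right) dx,
\qquad 0 < \alpha < 2,\ \lambda_{\pm} > 0 .
```

Near the origin the exponential factors are close to 1, so small jumps behave like those of an α-stable process (the short-time regime); the factors suppress large jumps, giving finite moments of all orders and Brownian-like behavior over long time frames.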
Gaussian processes (GPs) provide a principled, practical, probabilistic approach to learning in kernel machines. GPs have received increased attention in the machine-learning community over the past decade, and this book provides a long-needed systematic and unified treatment of theoretical and practical aspects of GPs in machine learning. The treatment is comprehensive and self-contained, targeted at researchers and students in machine learning and applied statistics. The book deals with the supervised-learning problem for both regression and classification, and includes detailed algorithms. A wide variety of covariance (kernel) functions are presented and their properties discussed. Model selection is discussed both from a Bayesian and a classical perspective. Many connections to other well-known techniques from machine learning and statistics are discussed, including support-vector machines, neural networks, splines, regularization networks, relevance vector machines, and others. Theoretical issues including learning curves and the PAC-Bayesian framework are treated, and several approximation methods for learning with large datasets are discussed. The book contains illustrative examples and exercises, and code and datasets are available on the Web. Appendixes provide mathematical background and a discussion of Gaussian Markov processes.
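The regression problem the book treats has a very short core algorithm: condition a GP prior with a chosen covariance function on noisy observations. The sketch below is a minimal NumPy illustration of GP regression with a squared-exponential (RBF) kernel, using the standard Cholesky-based solve; the function names and hyperparameter values are illustrative, not taken from the book.

```python
import numpy as np

def rbf_kernel(a, b, length_scale=1.0, variance=1.0):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    d = a[:, None] - b[None, :]
    return variance * np.exp(-0.5 * (d / length_scale) ** 2)

def gp_posterior(x_train, y_train, x_test, noise=1e-2):
    """Posterior mean and variance of GP regression at x_test."""
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    K_s = rbf_kernel(x_train, x_test)          # train-test covariances
    K_ss = rbf_kernel(x_test, x_test)          # test-test covariances
    L = np.linalg.cholesky(K)                  # numerically stable solve
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_s.T @ alpha                       # posterior mean
    v = np.linalg.solve(L, K_s)
    var = np.diag(K_ss - v.T @ v)              # posterior marginal variance
    return mean, var

# Fit three noisy observations of sin(x) and predict at x = 0.
x = np.array([-2.0, 0.0, 1.5])
y = np.sin(x)
mean, var = gp_posterior(x, y, np.array([0.0]))
```

At a test point coinciding with a training input and with small observation noise, the posterior mean stays close to the observed value and the posterior variance shrinks toward the noise level, which is the behavior the book's Chapter 2-style algorithms formalize.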