Brian 2 allows scientists to simply and efficiently simulate spiking neural network models. These models can feature novel dynamical equations, their interactions with the environment, and experimental protocols. To preserve high performance when defining new models, most simulators offer two options: low-level programming or description languages. The first option requires expertise, is prone to errors, and is problematic for reproducibility. The second option cannot describe all aspects of a computational experiment, such as the potentially complex logic of a stimulation protocol. Brian addresses these issues using runtime code generation. Scientists write code with simple and concise high-level descriptions, and Brian transforms them into efficient low-level code that can run interleaved with their code. We illustrate this with several challenging examples: a plastic model of the pyloric network, a closed-loop sensorimotor model, a programmatic exploration of a neuron model, and an auditory model with real-time input.
Neural heterogeneity promotes robust learning Perez-Nieves, Nicolas; Leung, Vincent C H; Dragotti, Pier Luigi ...
Nature communications,
10/2021, Volume 12, Issue 1
Journal Article
Peer reviewed
Open access
The brain is a hugely diverse, heterogeneous structure. Whether or not heterogeneity at the neural level plays a functional role remains unclear, and has been relatively little explored in models which are often highly homogeneous. We compared the performance of spiking neural networks trained to carry out tasks of real-world difficulty, with varying degrees of heterogeneity, and found that heterogeneity substantially improved task performance. Learning with heterogeneity was more stable and robust, particularly for tasks with a rich temporal structure. In addition, the distribution of neuronal parameters in the trained networks is similar to those observed experimentally. We suggest that the heterogeneity observed in the brain may be more than just the byproduct of noisy processes, but rather may serve an active and important role in allowing animals to learn in changing environments.
The brian simulator Goodman, Dan F M; Brette, Romain
Frontiers in neuroscience,
09/2009, Volume 3, Issue 2
Journal Article
Peer reviewed
Open access
"Brian" is a simulator for spiking neural networks (http://www.briansimulator.org). The focus is on making the writing of simulation code as quick and easy as possible for the user, and on flexibility: new and non-standard models are no more difficult to define than standard ones. This allows scientists to spend more time on the details of their models, and less on their implementation. Neuron models are defined by writing differential equations in standard mathematical notation, facilitating scientific communication. Brian is written in the Python programming language, and uses vector-based computation to allow for efficient simulations. It is particularly useful for neuroscientific modelling at the systems level, and for teaching computational neuroscience.
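To give a flavour of the kind of model Brian lets users state as differential equations, here is a plain-Python sketch (deliberately not Brian's actual API) of a leaky integrate-and-fire neuron integrated with a simple Euler scheme; all parameter values are illustrative, not taken from the abstract:

```python
# Illustrative sketch, NOT Brian's API: Euler integration of the
# leaky integrate-and-fire equation dv/dt = (v_rest - v)/tau + I/tau,
# the kind of model a Brian user would write as an equation string.

def simulate_lif(I=1.5, tau=0.01, v_rest=0.0, v_thresh=1.0,
                 v_reset=0.0, dt=0.0001, t_max=0.1):
    """Return spike times of one LIF neuron under constant drive I."""
    v = v_rest
    spikes = []
    for step in range(int(t_max / dt)):
        # Euler update of the membrane equation
        v += dt * ((v_rest - v) / tau + I / tau)
        if v >= v_thresh:          # threshold crossing: spike and reset
            spikes.append(step * dt)
            v = v_reset
    return spikes

spikes = simulate_lif()
print(f"{len(spikes)} spikes in 100 ms")
```

In Brian itself, the same membrane equation would be written once as a string in mathematical notation and the simulator would generate the integration loop; this sketch only shows the dynamics that such an equation describes.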
Cluster analysis faces two problems in high dimensions: the “curse of dimensionality” that can lead to overfitting and poor generalization performance, and the sheer time taken for conventional algorithms to process large amounts of high-dimensional data. We describe a solution to these problems, designed for the application of spike sorting for next-generation, high-channel-count neural probes. In this problem, only a small subset of features provides information about the cluster membership of any one data vector, but this informative feature subset is not the same for all data points, rendering classical feature selection ineffective. We introduce a “masked EM” algorithm that allows accurate and time-efficient clustering of up to millions of points in thousands of dimensions. We demonstrate its applicability to synthetic data and to real-world high-channel-count spike sorting data.
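The core "masking" idea — each point has its own informative feature subset, and masked-out dimensions are replaced by a global noise estimate rather than dropped globally — can be sketched in a few lines. This is a toy nearest-mean assignment step, not the full masked EM algorithm, and all data values below are invented for illustration:

```python
# Toy sketch of the masking idea (not the full masked EM algorithm):
# cluster assignment uses each point's own informative features, and
# imputes a global noise mean for the masked-out dimensions.

def masked_assign(points, masks, means, noise_mean):
    """Assign each point to the nearest mean under a per-point mask."""
    labels = []
    for x, mask in zip(points, masks):
        best, best_d = None, float("inf")
        for k, mu in enumerate(means):
            d = 0.0
            for j in range(len(x)):
                # observed value where informative, noise level elsewhere
                v = x[j] if j in mask else noise_mean[j]
                d += (v - mu[j]) ** 2
            if d < best_d:
                best, best_d = k, d
        labels.append(best)
    return labels

points = [(5.0, 0.1), (4.8, 0.0), (0.2, 3.9), (0.1, 4.2)]
masks = [{0}, {0}, {1}, {1}]        # informative feature per point
means = [(5.0, 0.0), (0.0, 4.0)]    # two cluster centres
noise = (0.0, 0.0)                  # background level per feature
labels = masked_assign(points, masks, means, noise)
print(labels)
```

The point of the mask is visible here: each point is compared to the cluster centres only through its own informative dimension, so the uninformative dimension cannot pull it toward the wrong cluster.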
Graph Drawing by Stochastic Gradient Descent Zheng, Jonathan X.; Pawar, Samraat; Goodman, Dan F. M.
IEEE transactions on visualization and computer graphics,
2019-09-01, Volume 25, Issue 9
Journal Article
Peer reviewed
Open access
A popular method of force-directed graph drawing is multidimensional scaling using graph-theoretic distances as input. We present an algorithm to minimize its energy function, known as stress, by using stochastic gradient descent (SGD) to move a single pair of vertices at a time. Our results show that SGD can reach lower stress levels faster and more consistently than majorization, without needing help from a good initialization. We then show how the unique properties of SGD make it easier to produce constrained layouts than previous approaches. We also show how SGD can be directly applied within the sparse stress approximation of Ortmann et al. [1], making the algorithm scalable up to large graphs.
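The pairwise update at the heart of this approach is compact enough to sketch: pick a vertex pair, move both endpoints along their connecting line so the layout distance approaches the graph-theoretic distance, and cap the step size so no single move overshoots. This simplified sketch uses a fixed step size rather than the paper's annealing schedule, and the tiny path graph is an invented example:

```python
import math
import random

# Simplified sketch of per-pair SGD stress minimization: a fixed step
# size (the paper anneals it), standard 1/d^2 stress weights, and the
# step capped at 1 so neither endpoint overshoots the target distance.

def sgd_layout(d, pos, eta=0.1, epochs=60, seed=0):
    """d: dict {(i, j): target distance}; pos: list of [x, y] coords."""
    rng = random.Random(seed)
    pairs = list(d)
    for _ in range(epochs):
        rng.shuffle(pairs)
        for i, j in pairs:
            w = d[i, j] ** -2                  # standard stress weight
            mu = min(w * eta, 1.0)             # capped step size
            dx = pos[i][0] - pos[j][0]
            dy = pos[i][1] - pos[j][1]
            dist = math.hypot(dx, dy) or 1e-9
            r = (dist - d[i, j]) / (2 * dist)  # half the residual each
            pos[i][0] -= mu * r * dx; pos[i][1] -= mu * r * dy
            pos[j][0] += mu * r * dx; pos[j][1] += mu * r * dy
    return pos

# Lay out a path graph 0-1-2 with shortest-path distances as targets
d = {(0, 1): 1.0, (1, 2): 1.0, (0, 2): 2.0}
pos = [[0.0, 0.0], [0.3, 0.1], [0.2, 0.4]]
pos = sgd_layout(d, pos)
```

After enough epochs the layout distances match the targets, so the three vertices end up approximately collinear, as a path graph should.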
Recent research resolves the challenging problem of building biophysically plausible spiking neural models that are also capable of complex information processing. This advance creates new opportunities in neuroscience and neuromorphic engineering, which we discussed at an online focus meeting.
Developments in microfabrication technology have enabled the production of neural electrode arrays with hundreds of closely spaced recording sites, and electrodes with thousands of sites are under development. These probes in principle allow the simultaneous recording of very large numbers of neurons. However, use of this technology requires the development of techniques for decoding the spike times of the recorded neurons from the raw data captured from the probes. Here we present a set of tools to solve this problem, implemented in a suite of practical, user-friendly, open-source software. We validate these methods on data from the cortex, hippocampus and thalamus of rat, mouse, macaque and marmoset, demonstrating error rates as low as 5%.
"Brian" is a popular Python-based simulator for spiking neural networks, commonly used in computational neuroscience. GeNN is a C++-based meta-compiler for accelerating spiking neural network simulations using consumer or high performance grade graphics processing units (GPUs). Here we introduce a new software package, Brian2GeNN, that connects the two systems so that users can make use of GeNN GPU acceleration when developing their models in Brian, without requiring any technical knowledge about GPUs, C++ or GeNN. The new Brian2GeNN software uses a pipeline of code generation to translate Brian scripts into C++ code that can be used as input to GeNN, and subsequently can be run on suitable NVIDIA GPU accelerators. From the user's perspective, the entire pipeline is invoked by adding two simple lines to their Brian scripts. We have shown that using Brian2GeNN, two non-trivial models from the literature can run tens to hundreds of times faster than on CPU.
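For concreteness, the "two simple lines" are, per the Brian2GeNN documentation, an import and a device switch; this fragment is shown as a sketch only, since actually running it requires the brian2genn package and a CUDA-capable NVIDIA GPU:

```python
# Added at the top of an ordinary Brian script to route the simulation
# through GeNN (requires brian2genn and an NVIDIA GPU; shown as a
# configuration sketch, not runnable here).
import brian2genn
from brian2 import set_device  # already imported in a typical Brian script

set_device('genn')
```

The rest of the Brian script stays unchanged; the device switch is what triggers the code-generation pipeline described in the abstract.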
Scientific conferences and meetings have an important role in research, but they also suffer from a number of disadvantages: in particular, they can have a massive carbon footprint, they are time-consuming, and the high costs involved in attending can exclude many potential participants. The COVID-19 pandemic has led to the cancellation of many conferences, forcing the scientific community to explore online alternatives. Here, we report on our experiences of organizing an online neuroscience conference, neuromatch, that attracted some 3000 participants and featured two days of talks, debates, panel discussions, and one-on-one meetings facilitated by a matching algorithm. By offering most of the benefits of traditional conferences, several clear advantages, and with fewer of the downsides, we feel that online conferences have the potential to replace many legacy conferences.
Animals continuously detect information via multiple sensory channels, like vision and hearing, and integrate these signals to realise faster and more accurate decisions: a fundamental neural computation known as multisensory integration. A widespread view of this process is that multimodal neurons linearly fuse information across sensory channels. However, does linear fusion generalise beyond the classical tasks used to explore multisensory integration? Here, we develop novel multisensory tasks, which focus on the underlying statistical relationships between channels, and deploy models at three levels of abstraction: from probabilistic ideal observers to artificial and spiking neural networks. Using these models, we demonstrate that when the information provided by different channels is not independent, linear fusion performs sub-optimally and even fails in extreme cases. This leads us to propose a simple nonlinear algorithm for multisensory integration which is compatible with our current knowledge of multimodal circuits, excels in naturalistic settings and is optimal for a wide class of multisensory tasks. Thus, our work emphasises the role of nonlinear fusion in multisensory integration, and provides testable hypotheses for the field to explore at multiple levels: from single neurons to behaviour.
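The linear-fusion baseline the abstract argues against has a simple probabilistic form: when two channels give conditionally independent evidence about a binary stimulus, the optimal log-posterior ratio is just the sum of per-channel log-likelihood ratios. This toy demo uses invented likelihood values purely to illustrate that baseline (it is precisely the independence assumption that the paper shows can fail):

```python
import math

# Linear-fusion baseline: with conditionally independent channels, the
# optimal evidence for s=1 vs s=0 is the SUM of per-channel
# log-likelihood ratios. All numbers below are made up for illustration.

def log_lr(p_given_s1, p_given_s0):
    """Log-likelihood ratio of one channel's observation."""
    return math.log(p_given_s1 / p_given_s0)

# per-channel likelihoods of the observed cues under s=1 vs s=0
visual = log_lr(0.8, 0.3)   # vision favours s=1
audio = log_lr(0.6, 0.4)    # hearing weakly favours s=1

# independence => evidence adds; decide s=1 if the sum is positive
combined = visual + audio
decision = 1 if combined > 0 else 0
```

When the channels are statistically dependent, this additive rule is no longer the ideal observer, which is the regime in which the abstract's nonlinear fusion algorithm is needed.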