Machine learning advances chemistry and materials science by enabling large-scale exploration of chemical space based on quantum chemical calculations. While these models supply fast and accurate predictions of atomistic chemical properties, they do not explicitly capture the electronic degrees of freedom of a molecule, which limits their applicability for reactive chemistry and chemical analysis. Here we present a deep learning framework for the prediction of the quantum mechanical wavefunction in a local basis of atomic orbitals, from which all other ground-state properties can be derived. This approach retains full access to the electronic structure via the wavefunction at force-field-like efficiency and captures quantum mechanics in an analytically differentiable representation. On several examples, we demonstrate that this opens promising avenues for the inverse design of molecular structures targeting electronic property optimisation, and a clear path towards increased synergy between machine learning and quantum chemistry.
SchNetPack is a toolbox for the development and application of deep neural networks that predict potential energy surfaces and other quantum-chemical properties of molecules and materials. It contains basic building blocks of atomistic neural networks, manages their training, and provides simple access to common benchmark datasets. This allows for an easy implementation and evaluation of new models. For now, SchNetPack includes implementations of (weighted) atom-centered symmetry functions and the deep tensor neural network SchNet, as well as ready-to-use scripts for training these models on molecule and material datasets. Based on the PyTorch deep learning framework, SchNetPack allows the neural networks to be applied efficiently to large datasets with millions of reference calculations, and to be parallelized across multiple GPUs. Finally, SchNetPack provides an interface to the Atomic Simulation Environment in order to make trained models easily accessible to researchers who are not yet familiar with neural networks.
This book reproduces some 50 papers published by K A Müller as author or co-author over several decades, amplified by more recent work mainly by T W Kool and collaborators. The main subject is Electron Paramagnetic Resonance (EPR) applied to the study of perovskites and other oxides, along with related subjects. This wealth of papers is organized into eleven chapters, each with an introductory text written in the light of current understanding. The contributions of the first editor on structural phase transitions have been immense, and because K A Müller and J C Fayet have published a review paper on the subject, the latter is reproduced in chapter VII. Part of chapter VIII, on dipolar and quantum paraelectric behavior probed by dielectric studies, is not related to EPR. In chapter X, two papers proving the existence of Fermi glasses are reproduced.
We demonstrate electrical control of the A-exciton interband transition in mono- and few-layer MoS2 crystals embedded in photocapacitor devices via the DC Stark effect. Electric-field-dependent low-temperature photoluminescence spectroscopy reveals a significant tunability of the A-exciton transition energy of up to ∼16 meV, from which we extract the mean DC exciton polarizability ⟨β̄_N⟩ = (0.58 ± 0.25) × 10⁻⁸ D m V⁻¹. The exciton polarizability is shown to be layer-independent, indicating a strong localization of both electron and hole wave functions within each individual layer.
Quantum optical circuits can be used to generate, manipulate, and exploit nonclassical states of light to push semiconductor-based photonic information technologies to the quantum limit. Here, we report the on-chip generation of quantum light from individual, resonantly excited self-assembled InGaAs quantum dots, efficient routing over length scales ≥1 mm via GaAs ridge waveguides, and in situ detection using evanescently coupled integrated NbN superconducting single-photon detectors fabricated on the same chip. By temporally filtering the time-resolved luminescence signal stemming from single quantum dots, we use the quantum optical circuit to perform time-resolved excitation spectroscopy on single dots and demonstrate resonance fluorescence with a linewidth of 10 ± 1 μeV: key elements needed for the use of single photons in prototypical quantum photonic circuits.
High-throughput density functional calculations of solids are highly time-consuming. As an alternative, we propose a machine learning approach for the fast prediction of solid-state properties. To achieve this, local spin-density approximation calculations are used as a training set. We focus on predicting the value of the density of electronic states at the Fermi energy. We find that conventional representations of the input data, such as the Coulomb matrix, are not suitable for training learning machines in the case of periodic solids. We propose a novel crystal structure representation for which learning and competitive prediction accuracies become possible within an unrestricted class of spd systems of arbitrary unit-cell size.
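The surrogate-modelling idea in this abstract can be illustrated with a minimal kernel ridge regression sketch. Everything below is a hypothetical stand-in: the descriptors and target are synthetic (the paper's novel crystal representation and its LSDA training data are not reproduced), and only the generic learn-a-scalar-property workflow is shown.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical descriptors: each row plays the role of a fixed-length
# representation of a crystal; the synthetic target mimics a scalar
# property such as the density of states at the Fermi energy.
X_train = rng.normal(size=(200, 2))
y_train = np.sin(X_train[:, 0]) + 0.1 * X_train[:, 1]

def rbf_kernel(A, B, gamma=0.5):
    """Gaussian (RBF) kernel matrix between two sets of descriptors."""
    sq_dists = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dists)

# Kernel ridge regression: solve (K + lambda*I) c = y in closed form.
lam = 1e-6
K = rbf_kernel(X_train, X_train)
coef = np.linalg.solve(K + lam * np.eye(len(K)), y_train)

def predict(X_new):
    """Predict the property for new descriptors from the fitted weights."""
    return rbf_kernel(X_new, X_train) @ coef

X_test = rng.normal(size=(50, 2))
y_test = np.sin(X_test[:, 0]) + 0.1 * X_test[:, 1]
mae = np.mean(np.abs(predict(X_test) - y_test))
```

The closed-form solve is what makes such surrogates orders of magnitude cheaper than a density functional calculation per structure; the representation fed into the kernel is the part the abstract identifies as the hard, non-generic ingredient.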
•Three frequency estimation measures called instantaneous frequency, local frequency and peak frequency are explained and compared.
•We also present three novel multivariate methods for the extraction of frequency shifts, based on frequency changes detected by instantaneous frequency, local frequency or peak frequency.
•The proposed decomposition methods extract brain sources whose frequency estimate of interest is maximally correlated with the external/internal variable of interest.
•All methods were thoroughly validated in realistic simulations and with real EEG data of 24 participants who performed a steady-state visual evoked paradigm in a BCI experiment.
Instantaneous and peak frequency changes in neural oscillations have been linked to many perceptual, motor, and cognitive processes. Yet, the majority of such studies have been performed in sensor space and only occasionally in source space. Furthermore, both terms have been used interchangeably in the literature, although they do not reflect the same aspect of neural oscillations. In this paper, we discuss the relation between instantaneous frequency, peak frequency, and local frequency, the latter also known as spectral centroid. Furthermore, we propose and validate three different methods to extract source signals from multichannel data whose (instantaneous, local, or peak) frequency estimate is maximally correlated to an experimental variable of interest. Results show that the local frequency might be a better estimate of frequency variability than instantaneous frequency under conditions with low signal-to-noise ratio. Additionally, the source separation methods based on local and peak frequency estimates, called LFD and PFD respectively, provide more stable estimates than the decomposition based on instantaneous frequency. In particular, LFD and PFD are able to recover the sources of interest in simulations performed with a realistic head model, providing higher correlations with an experimental variable than multiple linear regression. Finally, we also tested all decomposition methods on real EEG data from a steady-state visual evoked potential paradigm and show that the recovered sources are located in areas similar to those previously reported in other studies, thus providing further validation of the proposed methods.
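The three frequency measures discussed above have standard signal-processing definitions that can be sketched in a few lines of NumPy: the instantaneous frequency follows from the phase of the analytic signal, the local frequency is the spectral centroid of the power spectrum, and the peak frequency is the location of its maximum. This is a minimal illustration on a synthetic oscillation; the EEG-specific source-separation methods (LFD, PFD) themselves are not reproduced here.

```python
import numpy as np

def analytic_signal(x):
    """FFT-based analytic signal (zeroing out negative frequencies)."""
    n = len(x)
    spectrum = np.fft.fft(x)
    weights = np.zeros(n)
    weights[0] = 1.0
    weights[1:(n + 1) // 2] = 2.0   # double the positive frequencies
    if n % 2 == 0:
        weights[n // 2] = 1.0       # keep the Nyquist bin for even n
    return np.fft.ifft(spectrum * weights)

def instantaneous_frequency(x, fs):
    """Mean instantaneous frequency from the unwrapped analytic phase."""
    phase = np.unwrap(np.angle(analytic_signal(x)))
    return np.mean(np.diff(phase)) * fs / (2.0 * np.pi)

def local_frequency(x, fs):
    """Local frequency, i.e. the spectral centroid (power-weighted mean)."""
    power = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    return np.sum(freqs * power) / np.sum(power)

def peak_frequency(x, fs):
    """Frequency at the maximum of the power spectrum."""
    power = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    return freqs[np.argmax(power)]

fs = 250.0                             # sampling rate in Hz
t = np.arange(0, 4.0, 1.0 / fs)        # 4 s of data
osc = np.sin(2 * np.pi * 10.0 * t)     # pure 10 Hz oscillation
```

For this noise-free sinusoid all three estimates agree at 10 Hz; the abstract's point is that they diverge for noisy, broadband signals, where the centroid-based local frequency is more robust than the instantaneous phase derivative.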
This paper considers nonstandard hypothesis testing problems that involve a nuisance parameter. We establish an upper bound on the weighted average power of all valid tests and develop a numerical algorithm that determines a feasible test with power close to the bound. The approach is illustrated in six applications: inference about a linear regression coefficient when the sign of a control coefficient is known; small-sample inference about the difference in means from two independent Gaussian samples from populations with potentially different variances; inference about the break date in structural break models with moderate break magnitude; predictability tests when the regressor is highly persistent; inference about an interval-identified parameter; and inference about a linear regression coefficient when the necessity of a control is in doubt.