Quantum Metropolis sampling. TEMME, K.; OSBORNE, T. J.; VOLLBRECHT, K. G.; ...
Nature (London), 03/2011, Volume 471, Issue 7336
Journal Article
Peer reviewed
Open access
The original motivation to build a quantum computer came from Feynman, who imagined a machine capable of simulating generic quantum mechanical systems, a task that is believed to be intractable for classical computers. Such a machine could have far-reaching applications in the simulation of many-body quantum physics in condensed-matter, chemical and high-energy systems. Part of Feynman's challenge was met by Lloyd, who showed how to approximately decompose the time evolution operator of interacting quantum particles into a short sequence of elementary gates, suitable for operation on a quantum computer. However, this left open the problem of how to simulate the equilibrium and static properties of quantum systems. This requires the preparation of ground and Gibbs states on a quantum computer. For classical systems, this problem is solved by the ubiquitous Metropolis algorithm, a method that has basically acquired a monopoly on the simulation of interacting particles. Here we demonstrate how to implement a quantum version of the Metropolis algorithm. This algorithm permits sampling directly from the eigenstates of the Hamiltonian, and thus evades the sign problem present in classical simulations. A small-scale implementation of this algorithm should be achievable with today's technology.
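For contrast with the quantum version described in this abstract, the classical Metropolis algorithm it generalizes can be sketched in a few lines. The following is a minimal, illustrative implementation for a 1-D Ising chain (the model, parameter names, and values are hypothetical examples, not taken from the paper):

```python
import math
import random

def metropolis_ising(n_spins=20, beta=0.5, n_steps=10000, seed=0):
    """Classical Metropolis sampling of a 1-D Ising chain with periodic
    boundaries: propose a single-spin flip, accept with probability
    min(1, exp(-beta * dE)), which targets the Boltzmann distribution."""
    rng = random.Random(seed)
    spins = [rng.choice((-1, 1)) for _ in range(n_spins)]
    for _ in range(n_steps):
        i = rng.randrange(n_spins)
        # Energy change of flipping spin i (nearest-neighbour coupling J = 1;
        # spins[i - 1] wraps around via Python's negative indexing).
        dE = 2 * spins[i] * (spins[i - 1] + spins[(i + 1) % n_spins])
        if dE <= 0 or rng.random() < math.exp(-beta * dE):
            spins[i] = -spins[i]
    return spins
```

The quantum algorithm of the paper replaces this classical spin configuration with eigenstates of a Hamiltonian, which is what lets it sidestep the sign problem.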
We analyze the validity of the chaining syllogism in fuzzy systems, i.e., whether two fuzzy rules IF F, THEN G, and IF G, THEN H imply the rule IF F, THEN H. Conditions are given under which this basic deduction scheme holds. "If A is predicated of all B, and B of all C, A must necessarily be predicated of all C." (The chaining syllogism according to Aristotle's Prior Analytics.)
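One common way to formalize rule chaining (not necessarily the formulation used in this paper) represents each rule as a fuzzy relation and chains them by sup-min composition. A minimal sketch with toy membership values:

```python
def sup_min_compose(R, S):
    """Sup-min composition of fuzzy relations:
    T[i][k] = max_j min(R[i][j], S[j][k])."""
    return [[max(min(R[i][j], S[j][k]) for j in range(len(S)))
             for k in range(len(S[0]))]
            for i in range(len(R))]

# Toy relations for IF F THEN G and IF G THEN H (illustrative degrees).
R = [[1.0, 0.3],
     [0.4, 1.0]]
S = [[1.0, 0.2],
     [0.5, 1.0]]
T = sup_min_compose(R, S)  # candidate relation for IF F THEN H
```

Whether the composed relation T actually coincides with (or is contained in) the direct rule IF F, THEN H is exactly the kind of question the conditions in the paper address.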
Bacterial pathogenesis requires the precise spatial and temporal control of gene expression, the dynamics of which are controlled by regulatory networks. A network encoded within Salmonella Pathogenicity Island 1 controls the expression of a type III protein secretion system involved in the invasion of host cells. The dynamics of this network are measured in single cells using promoter-green fluorescent protein (gfp) reporters and flow cytometry. During induction, there is a temporal order of gene expression, with transcriptional inputs turning on first, followed by structural and effector genes. The promoters show varying stochastic properties, where graded inputs are converted into all-or-none and hybrid responses. The relaxation dynamics are measured by shifting cells from inducing to noninducing conditions and by measuring fluorescence decay. The gfp expressed from promoters controlling the transcriptional inputs (hilC and hilD) and structural genes (prgH) decays exponentially, with a characteristic time of 50–55 min. In contrast, the gfp expressed from a promoter controlling the expression of effectors (sicA) persists for 110 ± 9 min. This promoter is controlled by a genetic circuit, formed by a transcription factor (InvF), a chaperone (SicA), and a secreted protein (SipC), that regulates effector expression in response to the secretion capacity of the cell. A mathematical model of this circuit demonstrates that the delay is due to a split positive feedback loop. This model is tested in a ΔsicA knockout strain, where sicA is complemented with and without the feedback loop. The delay is eliminated when the feedback loop is deleted. Furthermore, a robustness analysis of the model predicts that the delay time can be tuned by changing the affinity of SicA:InvF multimers for an operator in the sicA promoter. This prediction is used to construct a targeted library, which contains mutants with both longer and shorter delays. This combination of theory and experiments provides a platform for predicting how genetic perturbations lead to changes in the global dynamics of a regulatory network.
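The exponential relaxation described in this abstract is easy to make concrete. A minimal sketch, assuming simple first-order decay F(t) = F0 · exp(-t/τ) with the characteristic times quoted above (the function name and the specific τ values plugged in are illustrative):

```python
import math

def gfp_fraction_remaining(t_min, tau_min):
    """Fraction of initial GFP fluorescence remaining after t_min minutes,
    assuming simple exponential decay F(t) = F0 * exp(-t / tau)."""
    return math.exp(-t_min / tau_min)

# Characteristic times from the abstract: ~50-55 min for the hilC/hilD/prgH
# reporters, ~110 min persistence for the sicA effector promoter.
fast = gfp_fraction_remaining(55, 52.5)   # transcriptional/structural genes
slow = gfp_fraction_remaining(55, 110.0)  # effector promoter with feedback
```

After one characteristic time the fast reporters have dropped to roughly 1/e of their initial level, while the feedback-stabilized sicA reporter retains a substantially larger fraction, which is the delay the split positive feedback loop produces.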
When no expert knowledge is available, fuzzy if-then rules may be extracted from examples of performance of a system. For this, an a priori decision on the number of linguistic terms of the linguistic variables may be required. This may induce a "rigid granularity", usually finer than that actually required by the system. Fuzzy Decision Diagrams are introduced as an efficient data structure to represent fuzzy rule bases and to systematically check their completeness and consistency. Moreover, if the hypothesis of rigid granularity holds, reordering the variables of a Fuzzy Decision Diagram may lead to a more compact and more precise rule base. The concept of reconvergent subgraphs is introduced to support the search for effective reorderings.
One part of today's environmental pollution problem is referred to by the term contaminated soils. In Germany, specialists expect to find approximately 260,000 sites suspected of being contaminated after completion of the registration phase.
This large number, the urgency of carrying out measures to protect the environment, and limited personnel and financial resources make computer support a necessity. Detailed consideration of the deficiencies of existing computer systems and formal estimation methods leads to a bipartite knowledge-based approach supporting the relative estimation of hazard.
Multidimensional classification, the core of this approach, is mathematically well founded and serves as an extension of methods already used successfully in practical operation.
The unsuitability of formal risk analysis methods and various sources of incompleteness, uncertainty and vagueness in the whole research field motivate the use of fuzzy methods, in particular fuzzy classification, which provides a rough ranking method. Feature generation, the other main part of the approach, allows the properties of the sites to be selected, valuated and tuned in such a way as to ensure an optimal classification. To maximize the expressive power of the system's results, the user is enabled to compromise between a detailed survey of a site and an easy-to-survey representation of it, with the resulting loss of information caused by a certain a priori aggregation of properties.
The transmitter and receiver positions of a bistatic radar strongly influence its performance in radar target identification, since the radar cross-section of a target varies with these positions. In this study, radar target identification performance using calculated bistatic scattering data for three full-scale models and measured data for four scale-model targets is analyzed and compared. FFT-based CLEAN is used for shift-invariant feature extraction from the bistatic scattering data of each target, and a multilayered perceptron neural network is used as a classifier. The optimum receiver position is found by comparing the calculated identification probabilities while changing the position of the bistatic radar receiver. The identification results using calculated and measured data show that an optimally positioned bistatic radar yields better identification results, demonstrating the importance of the positions of the transmitter and receiver for bistatic radar.
The density matrix in quantum mechanics parameterizes the statistical properties of the system under observation, just like a classical probability distribution does for classical systems. The expectation value of observables cannot be measured directly; it can only be approximated by applying classical statistical methods to the frequencies with which certain measurement outcomes (clicks) are obtained. In this paper, we make a detailed study of the statistical fluctuations obtained during an experiment in which a hypothesis is tested, i.e. the hypothesis that a certain setup produces a given quantum state. Although the classical and quantum problems are very much related to each other, the quantum problem is much richer owing to the additional optimization over the measurement basis. Just as in the case of classical hypothesis testing, the confidence in quantum hypothesis testing scales exponentially in the number of copies. In this paper, we 1) argue that the physically relevant data of quantum experiments are contained only in the frequencies of the measurement outcomes, and that the statistical fluctuations of the experiment are essential, so that the correct formulation of the conclusions of a quantum experiment should be given in terms of hypothesis tests; 2) show that the (classical) \(\chi^2\) test for distinguishing two quantum states gives rise to the quantum \(\chi^2\) divergence when optimized over the measurement basis; and 3) present a max-min characterization of the optimal measurement basis for quantum goodness-of-fit testing, find the quantum measurement which leads to both the maximal Pitman and Bahadur efficiencies, and determine the associated divergence rates.
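The classical starting point of this abstract, the Pearson \(\chi^2\) test applied to click frequencies, can be sketched directly; the quantum refinement then optimizes the choice of measurement basis on top of it. The state and counts below are illustrative examples, not data from the paper:

```python
def chi2_statistic(observed_counts, expected_probs):
    """Pearson chi-squared statistic comparing observed click counts
    against outcome probabilities predicted by the hypothesis state:
    chi2 = sum_k (n_k - N p_k)^2 / (N p_k)."""
    n = sum(observed_counts)
    return sum((o - n * p) ** 2 / (n * p)
               for o, p in zip(observed_counts, expected_probs))

# Example: 1000 measurements of a qubit hypothesized to be |+>,
# measured in the computational basis (expected 50/50 outcomes).
stat = chi2_statistic([520, 480], [0.5, 0.5])
```

A large statistic relative to the \(\chi^2\) distribution with (number of outcomes − 1) degrees of freedom would lead to rejecting the hypothesized state; measuring the same qubit in a different basis changes the expected probabilities, which is where the optimization over measurement bases enters.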
Engineers in many disciplines use the electromagnetic properties of materials to evaluate and predict the performance of systems. Microwave engineers use these properties in the design of high-frequency devices. Civil, mechanical, and biomedical engineers use the properties, either directly or indirectly, as indicators of the performance of structures and mechanical systems, e.g. in the evaluation of water-to-cement ratio (Mubarak, K. et al., IEEE Trans. Instrumentation and Measurement, vol. 50, no. 5, pp. 1255-1263, Oct. 2001).