Quantum coherence is an essential ingredient in quantum information processing and plays a central role in emergent fields such as nanoscale thermodynamics and quantum biology. However, our understanding and quantitative characterization of coherence as an operational resource are still very limited. Here we show that any degree of coherence with respect to some reference basis can be converted to entanglement via incoherent operations. This finding allows us to define a novel general class of measures of coherence for a quantum system of arbitrary dimension, in terms of the maximum bipartite entanglement that can be generated via incoherent operations applied to the system and an incoherent ancilla. The resulting measures are proven to be valid coherence monotones satisfying all the requirements dictated by the resource theory of quantum coherence. We demonstrate the usefulness of our approach by proving that the fidelity-based geometric measure of coherence is a full convex coherence monotone, and deriving a closed formula for it on arbitrary single-qubit states. Our work provides a clear quantitative and operational connection between coherence and entanglement, two landmark manifestations of quantum theory and both key enablers for quantum technologies.
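The coherence-to-entanglement conversion described above can be illustrated numerically: a CNOT gate permutes the incoherent basis states and is therefore an incoherent operation, and applying it to a coherent system qubit together with an incoherent |0⟩ ancilla generates bipartite entanglement. A minimal sketch, where the choice of input state and of the entanglement entropy as quantifier are illustrative, not the paper's specific construction:

```python
import numpy as np

# System qubit in the maximally coherent state |+> = (|0> + |1>)/sqrt(2),
# ancilla in the incoherent state |0>.
plus = np.array([1.0, 1.0]) / np.sqrt(2)
zero = np.array([1.0, 0.0])
psi_in = np.kron(plus, zero)

# CNOT permutes incoherent basis states, so it is an incoherent operation.
cnot = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)
psi_out = cnot @ psi_in          # = (|00> + |11>)/sqrt(2), a Bell state

# Entanglement entropy of the system qubit's reduced state (in ebits).
rho = np.outer(psi_out, psi_out.conj())
rho_sys = rho.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)  # trace out ancilla
evals = np.linalg.eigvalsh(rho_sys)
entropy = -sum(p * np.log2(p) for p in evals if p > 1e-12)
print(entropy)   # 1.0: maximal input coherence became maximal entanglement
```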
Quantum coherence is the key resource for quantum technology, with applications in quantum optics, information processing, metrology, and cryptography. Yet, there is no universally efficient method for quantifying coherence either in theoretical or in experimental practice. I introduce a framework for measuring quantum coherence in finite dimensional systems. I define a theoretical measure which satisfies the reliability criteria established in the context of quantum resource theories. Then, I present an experimental scheme implementable with current technology which evaluates the quantum coherence of an unknown state of a d-dimensional system by performing two programmable measurements on an ancillary qubit, in place of the O(d²) direct measurements required by full state reconstruction. The result yields a benchmark for monitoring quantum effects in complex systems, e.g., certifying nonclassicality in quantum protocols and probing the quantum behavior of biological complexes.
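The quantity being estimated can be illustrated with a standard coherence quantifier. The sketch below computes the l1-norm of coherence, a commonly used monotone chosen here purely as an example (the abstract does not specify its measure), for a d = 3 system:

```python
import numpy as np

def l1_coherence(rho):
    """Sum of |rho_ij| over the off-diagonal elements in the reference basis."""
    return np.abs(rho).sum() - np.abs(np.diag(rho)).sum()

# Maximally coherent qutrit |psi> = (|0> + |1> + |2>)/sqrt(3):
d = 3
psi = np.ones(d) / np.sqrt(d)
rho = np.outer(psi, psi.conj())
print(l1_coherence(rho))   # d - 1 = 2.0 for the maximally coherent state

# An incoherent (diagonal) state has zero coherence:
print(l1_coherence(np.diag([0.5, 0.3, 0.2])))   # 0.0
```

Full state reconstruction would instead require estimating all d² − 1 real parameters of ρ, which is the overhead the two-measurement ancilla scheme is meant to avoid.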
We calculate the gauge terms of the one-loop anomalous dimension matrix for the dimension-six operators of the Standard Model effective field theory (SM EFT). Combining these results with our previous results for the λ and Yukawa coupling terms completes the calculation of the one-loop anomalous dimension matrix for the dimension-six operators. There are 1350 CP-even and 1149 CP-odd parameters in the dimension-six Lagrangian for 3 generations, and our results give the entire 2499 × 2499 anomalous dimension matrix. We discuss how the renormalization of the dimension-six operators, and the additional renormalization of the dimension d ≤ 4 terms of the SM Lagrangian due to dimension-six operators, lay the groundwork for future precision studies of the SM EFT aimed at constraining the effects of new physics through precision measurements at the electroweak scale. As sample applications, we discuss aspects of the full RGE-improved result for essential processes such as gg → h, h → γγ and h → Zγ; for Higgs couplings to fermions; for the precision electroweak parameters S and T; and for the operators that modify important processes in precision electroweak phenomenology, such as the three-body Higgs boson decay h → Zℓ⁺ℓ⁻ and triple gauge boson couplings. We discuss how the renormalization group improved results can be used to study the flavor problem in the SM EFT, and to test the minimal flavor violation (MFV) hypothesis. We briefly discuss the renormalization effects on the dipole coefficient C_eγ, which contributes to μ → eγ and to the muon and electron magnetic and electric dipole moments.
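The RGE improvement referred to above amounts to running the Wilson coefficients from the new-physics scale Λ down to the electroweak scale with the anomalous dimension matrix. The structure, including operator mixing, can be sketched with a toy 2 × 2 matrix; the numerical entries below are invented for illustration and are NOT actual SM EFT anomalous dimensions:

```python
import numpy as np
from scipy.linalg import expm

# Toy anomalous-dimension matrix (illustrative numbers only, NOT the real
# SM EFT entries).  Running convention: dC/dln(mu) = (gamma^T / 16 pi^2) C.
gamma = np.array([[4.0, 0.5],
                  [0.8, -2.0]])
Lam, mu = 1000.0, 100.0              # GeV: matching scale and EW scale
C_Lam = np.array([1.0, 0.0])         # only C1 generated at the high scale

t = np.log(mu / Lam)                 # negative when running down
K = (gamma.T / (16 * np.pi ** 2)) * t

C_exact = expm(K) @ C_Lam            # resummed one-loop running
C_llog = C_Lam + K @ C_Lam           # leading-log approximation

# Operator mixing: C2 is generated radiatively even though C2(Lam) = 0.
print(C_llog)
```

For a one-decade scale ratio the leading-log and resummed results agree to better than a percent here, but the mixing term is the qualitative point: a coefficient absent at Λ is induced at the low scale.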
With the introduction of computed tomography (CT) for the purpose of dimensional quality control, the problem has arisen of identifying the measurement uncertainty of CT-based length measurements. This paper presents a conceptual framework for CT measurement uncertainty based on the ISO-GUM, starting from the product of the voxel size and the number of voxels as the description of every CT measurement. These two main uncertainty contributors are further subdivided and illustrated based on various measurements. Finally, this paper shows how a measurement procedure based on voxel size and edge correction can eliminate some terms of the total uncertainty budget.
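The product model described above can be sketched directly: the measured length is L = N · s (number of voxels times voxel size), and the ISO-GUM law of propagation combines the two main contributors. All numerical values below are hypothetical, chosen only to make the arithmetic concrete:

```python
import math

# Hypothetical CT measurement: L = N * s
s = 0.05        # voxel size, mm
u_s = 0.0005    # standard uncertainty of the voxel size (scaling error), mm
N = 400.0       # measured length in voxels (from edge detection)
u_N = 0.3       # standard uncertainty of the voxel count, voxels

L = N * s       # 20 mm nominal length

# GUM law of propagation for a product: sensitivity coefficients are
# dL/ds = N and dL/dN = s (contributors assumed uncorrelated).
u_L = math.sqrt((N * u_s) ** 2 + (s * u_N) ** 2)
U = 2 * u_L     # expanded uncertainty, coverage factor k = 2
print(f"L = {L:.3f} mm, u = {u_L:.4f} mm, U(k=2) = {U:.4f} mm")
```

With these numbers the voxel-size (scaling) term dominates, which is why a voxel-size correction against a calibrated artefact can remove a large share of the budget.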
This paper presents a novel fringe projection technique for fast three-dimensional (3-D) shape measurements of moving highly reflective objects. By combining the standard three-step phase-shifting fringe patterns with a digital speckle image, dynamic 3-D reconstructions of shiny surfaces can be efficiently achieved with only four projected patterns. The phase measurement is performed by the three-step phase-shifting algorithm, as it uses the theoretical minimum number of fringe patterns for phase-shifting profilometry. To avoid camera saturation, a dual-camera fringe projection system is built to measure shiny objects from two different directions. The erroneous phase obtained from a saturated pixel is corrected by the phase of its corresponding pixel in the other view, which is free from the saturation problem. To achieve high measurement accuracy, the corresponding high light intensity areas in the cameras are found by sub-pixel matching of the speckle pattern in either view. Benefiting from the trifocal tensor constraint, the corresponding points in the two wrapped phase maps can be directly established, and thus the difficulties in determining the correct fringe order for discontinuous or isolated surfaces can be effectively bypassed. Experimental results indicate that the proposed method is able to successfully measure highly reflective surfaces in both stationary and dynamic scenes.
• A novel method for fast 3-D measurements of moving shiny objects is presented.
• Only four patterns are used for efficient 3-D reconstructions.
• To avoid camera saturation, a dual-camera fringe projection system is built.
• By the trifocal tensor, the phase unwrapping is robust to discontinuities.
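The three-step phase retrieval used above follows the standard textbook formula: with phase shifts of −2π/3, 0, +2π/3, the wrapped phase is recovered from an arctangent of the three intensities. A minimal single-pixel sketch with synthetic values:

```python
import math

# Synthetic fringe intensities at one pixel: I_k = A + B*cos(phi + delta_k)
A, B, phi = 128.0, 100.0, 0.7          # background, modulation, true phase
d = 2 * math.pi / 3
I1 = A + B * math.cos(phi - d)
I2 = A + B * math.cos(phi)
I3 = A + B * math.cos(phi + d)

# Standard three-step phase-shifting formula (wrapped phase in (-pi, pi]):
# I1 - I3 = sqrt(3)*B*sin(phi),  2*I2 - I1 - I3 = 3*B*cos(phi)
phi_w = math.atan2(math.sqrt(3) * (I1 - I3), 2 * I2 - I1 - I3)
print(phi_w)   # recovers 0.7
```

In the paper's setting the same arctangent is evaluated per pixel over full camera images, and the remaining 2π ambiguity is resolved across the two views via the trifocal tensor rather than with extra unwrapping patterns.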
Precise dimensional measurements, predominantly through coordinate metrology, largely influence quality control in manufacturing industries. Besides the standard coordinate measuring machines (CMMs), coordinate metrology also functions in conjunction with measurement systems utilizing structured light, imaging, laser triangulation, photogrammetry and computed tomography. An articulated arm coordinate measuring machine (AACMM), or portable CMM, provides enhanced flexibility and diminished weight as compared to the conventional CMM. Periodic reverification of an articulated arm CMM is essential to ensure accurate and precise dimensional measurements. In the present experimental investigation, an articulated arm coordinate measuring machine has been verified as per ISO 10360-12:2016 using a one-dimensional standard artefact (KOBA Step Gauge, 1220 mm). The measurement uncertainty estimation has been carried out using Monte Carlo Simulation (MCS) and compared with ISO GUM (law of propagation of uncertainties: LPU) outcomes. The expanded uncertainties and measured mean values obtained from the two approaches were observed to be in reasonable concordance.
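The MCS-versus-LPU comparison can be sketched in miniature: propagate a simple measurement model both with the first-order GUM law of propagation and with a Monte Carlo simulation, and compare the resulting standard uncertainties. The model and all input values below are hypothetical stand-ins, not the step-gauge data of the study:

```python
import numpy as np

# Hypothetical model: indicated length with a thermal-expansion correction,
# L = L_ind * (1 + alpha * dT), with uncorrelated Gaussian inputs.
L_ind, u_L_ind = 1220.0, 0.0008        # mm
alpha, u_alpha = 11.5e-6, 1.0e-6       # 1/K (assumed steel-like value)
dT, u_dT = 0.5, 0.2                    # K deviation from 20 degC

# --- LPU (first-order GUM): sensitivity coefficients ---
c_L = 1 + alpha * dT
c_a = L_ind * dT
c_T = L_ind * alpha
u_lpu = np.sqrt((c_L * u_L_ind) ** 2
                + (c_a * u_alpha) ** 2
                + (c_T * u_dT) ** 2)

# --- Monte Carlo simulation: sample the inputs, evaluate the model ---
rng = np.random.default_rng(0)
n = 200_000
L_mc = (rng.normal(L_ind, u_L_ind, n)
        * (1 + rng.normal(alpha, u_alpha, n) * rng.normal(dT, u_dT, n)))
u_mcs = L_mc.std(ddof=1)

print(f"u(LPU) = {u_lpu:.5f} mm, u(MCS) = {u_mcs:.5f} mm")
```

For a nearly linear model like this one the two approaches agree closely, mirroring the "reasonable concordance" reported; MCS becomes the safer choice when the model is strongly nonlinear or the inputs are non-Gaussian.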
There are two key motivations for this paper: (1) the need to respond to the often observed rejection of efficiency studies' results by management, who claim that a single-perspective evaluation cannot fully reflect the operating units' multi-function nature; and (2) the continuing need for a detailed bank branch performance assessment that is acceptable to both line managers and senior executives. In this context, a two-stage Data Envelopment Analysis approach is developed for simultaneously benchmarking the performance of operating units along different dimensions (for line managers), and a modified Slacks-Based Measure model is applied for the first time to aggregate the efficiency scores obtained from stage one and generate a composite performance index for each unit. This approach is illustrated using data from a major Canadian bank with 816 branches operating across the nation. Three important branch performance dimensions are evaluated: Production, Profitability, and Intermediation. This approach improves the realism of the performance assessment and enables branch managers to clearly identify the strengths and weaknesses in their operations. Branch scale efficiency and the impacts of geographic location and market size on branch performance are also investigated. This multi-dimensional performance evaluation approach may improve management acceptance of the practical applications of DEA in real businesses.
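The first stage of such an approach rests on standard DEA linear programs. A minimal input-oriented CCR model (constant returns to scale, one input, one output, toy data rather than the bank's branch data) solved with scipy's LP solver:

```python
import numpy as np
from scipy.optimize import linprog

# Toy data: 3 branches, a single input x and a single output y.
x = np.array([2.0, 4.0, 8.0])
y = np.array([2.0, 3.0, 4.0])
n = len(x)

def ccr_efficiency(o):
    """Input-oriented CCR score of unit o: minimize theta subject to a
    nonnegative combination of peers using <= theta*x_o input while
    producing >= y_o output."""
    c = np.r_[1.0, np.zeros(n)]                 # variables: [theta, lambdas]
    A_ub = np.vstack([
        np.r_[-x[o], x],                        # sum(l_j x_j) - theta*x_o <= 0
        np.r_[0.0, -y],                         # -sum(l_j y_j) <= -y_o
    ])
    b_ub = np.array([0.0, -y[o]])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.fun

scores = [ccr_efficiency(o) for o in range(n)]
print(scores)   # approx [1.0, 0.75, 0.5]: unit 0 defines the frontier
```

In the paper's setting one such program is solved per branch and per dimension (Production, Profitability, Intermediation), and the resulting scores are then aggregated in stage two by the modified Slacks-Based Measure model.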
This article introduces the Multidimensional Research Assessment Matrix of scientific output. Its base notion holds that the choice of metrics to be applied in a research assessment process depends on the unit of assessment, the research dimension to be assessed, and the purposes and policy context of the assessment. An indicator may be highly useful within one assessment process, but less so in another. For instance, publication counts are useful tools to help discriminate between those staff members who are research active and those who are not, but are of little value if active scientists are to be compared with one another according to their research performance. This paper gives a systematic account of the potential usefulness and limitations of a set of 10 important metrics, including altmetrics, applied at the level of individual articles, individual researchers, research groups, and institutions. It presents a typology of research impact dimensions and indicates which metrics are the most appropriate to measure each dimension. It introduces the concept of a "meta-analysis" of the units under assessment, in which metrics are not used as tools to evaluate individual units, but to reach policy inferences regarding the objectives and general setup of an assessment process.
This article evaluates a theoretical model based on hypothesized relationships among four constructs, namely, destination image, place attachment, personal involvement, and visitors' satisfaction as antecedents of loyalty. These relationships are explored for a sample of 705 international visitors staying in hotels on the island of Mauritius. Confirmatory factor analysis is used initially to ascertain the dimensions of the various constructs, but also to assess the convergent and discriminant validity of the measurement items. The structural model indicates that destination image, personal involvement and place attachment are antecedents of visitors' loyalty, but this relationship is mediated by satisfaction levels. The findings offer important implications for tourism theory and practice.
The existence of quantum correlations that allow one party to steer the quantum state of another party is a counterintuitive quantum effect that was described at the beginning of the past century. Steering occurs if entanglement can be proven even though the description of the measurements on one party is not known, while the other side is characterized. We introduce the concept of steering maps, which allow us to unlock sophisticated techniques that were developed in regular entanglement detection and to use them for certifying steerability. As an application, we show that this allows us to go beyond even the canonical steering scenario; it enables a generalized dimension-bounded steering where one only assumes the Hilbert space dimension on the characterized side, with no description of the measurements. Surprisingly, this does not weaken the detection strength of very symmetric scenarios that have recently been carried out in experiments.