Unknown unknowns, often called "unk-unks," lurk in every project, waiting to emerge, surprise, and derail plans. Project knowledge comes from learning about the project: its overall context, its goals and objectives, the process for achieving them, the people, tools, and other resources to be deployed, and how all of these affect one another. Many so-called unk-unks are not really unk-unks at all; rather, they are things no one has bothered to find out. Indeed, there are two kinds of unknowns: unknown unknowns (things we don't know we don't know) and known unknowns (things we know we don't know). (See Converting Knowable Unk-Unks to Known Unknowns.) Every project has some of both. The techniques of conventional risk management apply only to known unknowns, yet some unk-unks are knowable and can be converted to known unknowns through a process of directed recognition. This article provides an overview of the targets, methods, and tools of directed recognition: the where, why, and how. Several characteristics of a project's subsystems and context make surprises more likely.
JIT delivery with stochastic lead time
Hayya, J C; Ramasesh, R V; Tyworth, J G
The Journal of the Operational Research Society, 01/2013, Volume: 64, Issue: 1
Journal Article
Peer-reviewed
We examine the effect of stochastic lead times on just-in-time (JIT) delivery. We find that with stochastic lead times there is a possibility of order crossover, which transforms the original lead times into effective lead times, an autoregressive process of order 1 (AR(1)). The mean of this process equals the mean of the original lead time, but its variance can be much smaller. The implication is that when order crossover is considered in the analysis, the cost can be lower than otherwise (though never lower than with deterministic lead times). The literature on JIT with stochastic lead times has never considered order crossover, which produces the effective delivery times (EDT). Here, we demonstrate some important properties of the EDT: it is a Cauchy sequence, and hence it converges; it is an AR(1) process; and it stochastically dominates the parent lead-time distribution.
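The order-crossover effect described above is easy to see in simulation. The sketch below (a minimal illustration, not the paper's AR(1) derivation) places orders at fixed intervals with exponential lead times, then matches the i-th order placed with the i-th shipment to arrive; since sorting arrival times preserves their sum, the effective lead times have exactly the same mean as the originals, while their variance shrinks.

```python
import random
import statistics

def simulate(n_orders=10_000, interval=1.0, mean_lt=5.0, seed=1):
    """Simulate order crossover under stochastic lead times.

    Orders are placed every `interval` periods; each shipment has an
    exponential lead time. Deliveries are consumed in arrival order, so
    the i-th order placed is served by the i-th arrival, not necessarily
    by its own shipment. The effective lead time is arrival minus
    placement under that matching.
    """
    rng = random.Random(seed)
    place = [i * interval for i in range(n_orders)]
    lead = [rng.expovariate(1.0 / mean_lt) for _ in range(n_orders)]
    arrivals = sorted(p + l for p, l in zip(place, lead))
    effective = [a - p for p, a in zip(place, arrivals)]
    return lead, effective

lead, eff = simulate()
# Means agree exactly (sorting preserves the sum of arrival times),
# while the effective variance is smaller whenever crossover occurs.
print(round(statistics.mean(lead), 3), round(statistics.mean(eff), 3))
print(round(statistics.pvariance(lead), 2), round(statistics.pvariance(eff), 2))
```

The parameters (interval 1, mean lead time 5) are chosen so crossover is frequent; with lead times short relative to the ordering interval, crossover is rare and the two variances converge.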
In quantum mechanics, measurements cause wavefunction collapse that yields precise outcomes, whereas for non-commuting observables such as position and momentum Heisenberg's uncertainty principle limits the intrinsic precision of a state. Although theoretical work1 has demonstrated that it should be possible to perform simultaneous non-commuting measurements and has revealed the limits on measurement outcomes, only recently2,3 has the dynamics of the quantum state been discussed. To realize this unexplored regime, we simultaneously apply two continuous quantum non-demolition probes of non-commuting observables to a superconducting qubit. We implement multiple readout channels by coupling the qubit to multiple modes of a cavity. To control the measurement observables, we implement a 'single quadrature' measurement by driving the qubit and applying cavity sidebands with a relative phase that sets the observable. Here, we use this approach to show that the uncertainty principle governs the dynamics of the wavefunction by enforcing a lower bound on the measurement-induced disturbance. Consequently, as we transition from measuring identical to measuring non-commuting observables, the dynamics make a smooth transition from standard wavefunction collapse to localized persistent diffusion and then to isotropic persistent diffusion. Although the evolution of the state differs markedly from that of a conventional measurement, information about both non-commuting observables is extracted by keeping track of the time ordering of the measurement record, enabling quantum state tomography without alternating measurements. Our work creates novel capabilities for quantum control, including rapid state purification4, adaptive measurement5,6, measurement-based state steering and continuous quantum error correction7.
As physical systems often interact continuously with their environment via non-commuting degrees of freedom, our work offers a way8,9 to study how notions of contemporary quantum foundations10-14 arise in such settings.
Determination of the optimal lot sizing strategy when the vendor offers limited time price incentives, such as pre-announcement of a price increase that will take effect after a finite time or a price discount that is valid for a limited time, is a common problem that has been extensively researched. A review of the literature indicates that the mathematical analysis and solution of this problem are quite complex. This complexity may deter managers from using the optimal strategy although an optimal lot sizing strategy results in the lowest cost. Managers generally prefer simple heuristic or rule-of-thumb strategies that are easy to understand and to implement, provided the total relevant cost associated with such strategies compares well with that of the optimal strategy. Therefore, it would be of significant value to managers if the cost associated with the optimal strategy can be deduced easily without recourse to complex mathematical analysis so that the simpler strategies can be quickly and easily evaluated. In this paper, we present an intuitively appealing and easy-to-compute method to determine a tight lower bound, whose value is very close to the total cost of the optimal strategy. We demonstrate, through extensive computational analysis, the adequacy of our lower bound by comparing it with the total cost associated with an optimal strategy over a wide range of operating parameters. Thus, managers can use it as a surrogate for the cost of the optimal strategy while evaluating heuristic strategies. We illustrate the application of our lower bound with numerical examples.
The analysis of the full stochastic model, in which both the demand per unit time and the lead time are stochastic, is complex. Analysis of the reduced stochastic inventory models, in which only one of the parameters (either the demand per unit time or the lead time) is stochastic and the other is constant, is relatively less complex.
In this paper we exploit insights from vector analysis and postulate an approximation that expresses the optimal cost of the full stochastic model in terms of the optimal costs of the two reduced models. We demonstrate the adequacy of the cost relationship in the context of one specific type of inventory model, a periodic review (S, S−1, t=1) model, by performing an extensive set of simulations, using the Poisson, the exponential, and the gamma distributions to characterize demand and lead time. We also use the simulation data to develop regression relationships between the cost and an appropriate measure of variability, such as the standard deviation, the variance, or the coefficient of variation. For the cost of the full model, we find that our approximations achieve 98.4% accuracy for the Poisson, 96% for the exponential, and 97% for the gamma distribution.
The performance of local Møller-Plesset second-order perturbation theory (LMP2) and the impact of domain choice upon accuracy for a series of correlation consistent basis sets have been examined. MP2 ...correlation energies were calculated for 31 molecules ranging from 4 to 26 atoms, and containing up to 10 non-hydrogen atoms. The correlation energies were extrapolated to the complete basis set (CBS) limit using various schemes for comparison. The percent CPU savings for the local MP2 calculations as compared with conventional MP2 calculations are provided.
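The abstract mentions extrapolating correlation energies to the CBS limit using "various schemes" without naming them. One widely used choice is the two-point inverse-cubic form for correlation-consistent basis sets; the sketch below shows that scheme with hypothetical cc-pVTZ/cc-pVQZ energies (the numbers are illustrative, not from the paper).

```python
def cbs_two_point(e_x, x, e_y, y):
    """Two-point CBS extrapolation of correlation energies, assuming
    E(X) = E_CBS + A * X**-3 for basis-set cardinal numbers X
    (e.g. X=3 for cc-pVTZ, X=4 for cc-pVQZ)."""
    return (x**3 * e_x - y**3 * e_y) / (x**3 - y**3)

# Hypothetical correlation energies in hartree:
e_tz, e_qz = -0.350, -0.365
e_cbs = cbs_two_point(e_tz, 3, e_qz, 4)
print(round(e_cbs, 5))  # -0.37595
```

Because the correction decays as X**-3, the extrapolated value lies below the quadruple-zeta result by roughly half of the TZ-to-QZ step, which is the behavior such schemes exploit.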
Despite containing an extensive body of normative or prescriptive studies, the quality management literature offers little by way of generally applicable guidance concerning how to measure or monitor the critical factors underlying strategic quality management initiatives, such as total quality management and continuous quality improvement. Although several studies have relied on survey data from multiple sources to unearth models of such factors, they do not offer general guidelines for selecting the factors appropriate in a specific setting. In this paper, in contradistinction to the multiple-source survey methodology, we take an action-research approach and present the findings of a contextually specific, single-site empirical study that we carried out at Lockheed Martin Tactical Aircraft Systems, in Fort Worth, Texas. We discuss the implications of our findings for extending our empirical understanding of the factors underlying strategic quality management programs and for the development of reliable and valid instruments to monitor them.
When supply lead times are uncertain, simultaneous procurement from two sources offers savings in inventory holding and shortage costs. Economies are achieved if these savings outweigh the increase in ordering costs. In this paper we analyze dual sourcing in the context of the "reorder point, order quantity" inventory model with constant demand and stochastic lead times and compare it with single sourcing. Two cases are studied, using the uniform and the exponential distributions, which may be thought of as two extreme ways of representing stochastic lead times. In our two-vendor model, the order quantity is split equally between the two vendors and the split orders are placed simultaneously when the inventory position reaches the reorder level. A comparison of the total expected costs suggests that when the uncertainty in the lead times is high and the ordering costs are low, dual sourcing could be cost effective.
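The main driver of the savings described above is that the first sub-lot of a split order arrives at min(L1, L2), which pulls the first-delivery time forward when lead times are highly variable. The sketch below (a minimal illustration of that effect, not the paper's full reorder-point cost model) compares single and dual sourcing under the two distributions the abstract studies, exponential and uniform.

```python
import random
import statistics

def first_arrival_stats(sampler, n=100_000, seed=3):
    """Mean lead time to the first delivery: single sourcing draws one
    lead time; dual sourcing splits the order between two vendors, so
    the first sub-lot arrives at min(L1, L2)."""
    rng = random.Random(seed)
    single = [sampler(rng) for _ in range(n)]
    dual_first = [min(sampler(rng), sampler(rng)) for _ in range(n)]
    return statistics.mean(single), statistics.mean(dual_first)

# Exponential lead times, mean 10: min of two is exponential with mean 5.
exp_single, exp_dual = first_arrival_stats(lambda r: r.expovariate(1 / 10))
# Uniform lead times on [5, 15], mean 10: min of two has mean 5 + 10/3.
uni_single, uni_dual = first_arrival_stats(lambda r: r.uniform(5, 15))
print(round(exp_single, 2), round(exp_dual, 2))
print(round(uni_single, 2), round(uni_dual, 2))
```

The gap is larger for the exponential case, consistent with the abstract's finding that dual sourcing pays off when lead-time uncertainty is high; the savings must still be weighed against the doubled ordering cost.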
Lot streaming in multistage production systems
Ramasesh, Ranga V.; Fu, Haizhen; Fong, Duncan K.H.
International journal of production economics, 07/2000, Volume: 66, Issue: 3
Journal Article
Peer-reviewed
Lot streaming is a procedure in which a production lot is split into smaller sub-lots and moved to the next processing stage so that operations at successive stages of a multistage manufacturing system can be overlapped in time. Lot streaming reduces the manufacturing lead time and thereby provides an opportunity to lower the costs of holding work-in-process inventories. In this paper, we present an economic production lot size model that minimizes the total relevant cost when lot streaming is used. Using illustrative numerical examples, we show that our model can yield significant cost economies compared to the traditional approaches.
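The lead-time reduction from lot streaming can be quantified with the standard equal-sublot flow-shop makespan expression (a textbook result for unlimited intermediate buffers, used here as an illustration rather than the paper's cost model, whose data are hypothetical): the first sub-lot traverses all stages, and the remaining sub-lots follow at the pace of the bottleneck stage.

```python
def makespan_equal_sublots(lot_size, unit_times, n_sublots):
    """Makespan of one lot streamed through a flow shop in n equal
    sub-lots, with unlimited intermediate buffers:
        (Q/n) * sum(p_i) + (n - 1) * (Q/n) * max(p_i)
    where Q is the lot size and p_i the per-unit time at stage i."""
    s = lot_size / n_sublots
    return s * sum(unit_times) + (n_sublots - 1) * s * max(unit_times)

# Hypothetical lot of 120 units over three stages (minutes per unit):
p = [1.0, 2.0, 1.5]
no_streaming = makespan_equal_sublots(120, p, 1)    # 540.0
four_sublots = makespan_equal_sublots(120, p, 4)    # 315.0
print(no_streaming, four_sublots)
```

Here four sub-lots cut the manufacturing lead time from 540 to 315 minutes, which is the mechanism by which work-in-process holding costs fall; the paper's model balances this against the transfer and setup costs that sub-lotting adds.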