There is considerable recent interest in the monitoring of individual surgeon or hospital surgical outcomes. If one aggregates data over time and assesses performance with a funnel plot, then detection of any process deterioration or improvement could be delayed. The variable life adjusted display (VLAD) is widely used for monitoring on a case-by-case basis, but we show that use of the risk-adjusted Bernoulli cumulative sum (RA-CUSUM) chart leads to much better performance.
We use simulation to illustrate that the RA-CUSUM chart has better performance than the VLAD in detecting changes in the rates of adverse events. We recommend the RA-CUSUM approach over the VLAD approach for monitoring surgical performance. If the VLAD is used, we recommend running the RA-CUSUM chart in the background to generate signals that the process performance has changed.
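The RA-CUSUM recursion can be sketched in a few lines. This is a generic implementation of the risk-adjusted Bernoulli CUSUM score (a log-likelihood-ratio score against a specified odds-ratio shift), not the article's exact chart; the odds ratio of 2 and the threshold `h = 4.5` are illustrative values only.

```python
import math

def ra_cusum(outcomes, risks, odds_ratio=2.0, h=4.5):
    """Upward risk-adjusted Bernoulli CUSUM for detecting an increase
    in the odds of an adverse event by the factor `odds_ratio`.
    outcomes: 0/1 adverse-event indicators, one per case
    risks:    model-predicted event probabilities, one per case
    Returns the CUSUM path and the index of the first signal (or None)."""
    s, path, signal = 0.0, [], None
    for t, (y, p) in enumerate(zip(outcomes, risks)):
        # log-likelihood-ratio score for case t
        w = y * math.log(odds_ratio) - math.log(1 - p + odds_ratio * p)
        s = max(0.0, s + w)  # CUSUM resets at zero
        path.append(s)
        if signal is None and s > h:
            signal = t
    return path, signal
```

Because each case contributes its own risk-adjusted score, the chart accumulates evidence case by case rather than waiting for aggregated totals, which is the source of its speed advantage over aggregate displays.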
More than a decade after the introduction of Statistical Engineering by Roger Hoerl and Ronald Snee, a group of leading applied statisticians from academia, industry, and government were invited to discuss their perspectives on progress made, the current status of this important movement, and what the future holds for Statistical Engineering, in a series of two panel discussion papers. In this first article, the invited panelists focus their discussion on the past and present of Statistical Engineering. They discuss notable advances and current obstacles to progress. They also consider the unique value added by Statistical Engineering, and the possible addition of decision making to its body of knowledge. The format of the article consists of the questions posed by the moderators, a summary of key ideas from all the panelists, and then the individual detailed answers. The goal of this series of articles is to inspire statisticians to consider their possible role in advancing the adoption of Statistical Engineering to solve important problems.
Statistical engineering - Part 2: Future
Anderson-Cook, Christine M.; Lu, Lu; Brenneman, William ...
Quality Engineering, 11/2022, Volume 34, Issue 4
Journal Article, Peer reviewed
In the second of two panel discussion articles focused on the evolution of statistical engineering (SE) as introduced by Roger Hoerl and Ronald Snee, a group of leading applied statisticians from academia, industry, and government present their perspectives on what the future might hold for this important movement. The invited panelists discuss the challenges and opportunities presented by the emergence of data science and the abundance of large amounts of data. They also consider the possible paths forward for SE, and the roles for statisticians in academia, industry, and government. The final question addresses what additional skills would be helpful to increase the effectiveness of the practice and advance SE. As with the first article, the format follows the order of a posed question, a summary of key ideas, and then the detailed individual panelist answers. The article seeks to inspire statisticians to consider their possible role in leveraging the potential of SE to solve important problems.
There are various methods for measuring flow rates in rivers, but all of them have practical issues and challenges. A period of exceptionally high water levels revealed substantial discrepancies between two measurement setups in the same waterway. Finding a causal explanation of the discrepancies was important, as the problem might have ramifications for other flow-rate measurement setups as well. Finding the causes of problems is called diagnostic problem-solving. We applied a branch-and-prune strategy, in which we worked with a hierarchy of hypotheses, and used statistical analysis as well as domain knowledge to rule out options. We were able to narrow down the potential explanations to one main suspect and an alternative explanation. Based on the analysis, we discuss the role of statistical techniques in diagnostic problem-solving and reasoning patterns that make the application of statistics powerful. The contribution to theory in statistics is not in the individual techniques but in their application and integration in a coherent sequence of studies - a reasoning strategy.
We consider a measurement system replacement case study. We show that with the collected data we can determine a decision rule for the corresponding 100 percent inspection scheme with the new measurement system that gives the same properties as the current system. However, we are unable to assess the misclassification rates for the inspection system, nor can we determine the probabilities of discordance (defined later in this article). In addition, had the statistical properties of the two measurement systems been substantially different, we could not have found an appropriate decision rule for the new scheme. We show that augmenting the collected data with available baseline information solves all of these problems.
Influenza viruses cause seasonal outbreaks in temperate climates, usually during winter and early spring, and are endemic in tropical climates. The severity and length of influenza outbreaks vary ...from year to year. Quick and reliable detection of the start of an outbreak is needed to promote public health measures.
We propose the use of an exponentially weighted moving average (EWMA) control chart of laboratory confirmed influenza counts to detect the start and end of influenza outbreaks.
The chart is shown to provide timely signals in an example application with seven years of data from Victoria, Australia.
The EWMA control chart could be applied in other applications to quickly detect influenza outbreaks.
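A minimal sketch of such an EWMA chart on weekly counts, assuming a normal approximation for the in-control distribution. The smoothing constant `lam`, limit width `L`, and in-control mean and standard deviation `mu0`, `sigma0` are illustrative inputs, not values fitted to the Victorian data.

```python
import math

def ewma_outbreak_signals(counts, mu0, sigma0, lam=0.2, L=3.0):
    """EWMA chart on laboratory-confirmed influenza counts.
    mu0, sigma0: in-control (non-outbreak) mean and sd of the counts.
    Returns the EWMA path, the first index where the statistic crosses
    above its upper control limit (outbreak-start signal), and the first
    later index where it falls back below (outbreak-end signal)."""
    z, path, start, end = mu0, [], None, None
    for t, x in enumerate(counts, start=1):
        z = lam * x + (1 - lam) * z
        # time-varying upper limit; converges to the steady-state limit
        ucl = mu0 + L * sigma0 * math.sqrt(
            lam / (2 - lam) * (1 - (1 - lam) ** (2 * t)))
        path.append(z)
        if start is None and z > ucl:
            start = t - 1
        elif start is not None and end is None and z <= ucl:
            end = t - 1
    return path, start, end
```

The exponential smoothing makes the chart sensitive to the small sustained increases that mark the beginning of a season, while the control limit keeps the in-control false-alarm rate predictable.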
We recommend an approach to estimate a process performance measure (or parameter) at the present time from a stream of data where the performance may drift slowly over time. It is common practice to estimate current process performance using either present-time data only or all historical data. When sample sizes by time period are small, an estimate based only on present-time data is imprecise. When the performance changes over time, including historical data in estimation trades more bias for less variability. We propose to regulate the bias/variance trade-off using estimating equations that down-weight past data. We derive approximations for the variance of the estimator and the distribution of a test statistic involving the estimator. The work is motivated by estimation of a customer loyalty measure, and we demonstrate the proposed approach with realistic data.
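One simple instance of such down-weighting, sketched here for a binomial performance measure: a geometrically weighted estimating equation solved for the current success rate. The weight `lam` is a hypothetical tuning choice; the article's estimating equations are more general than this illustration.

```python
def down_weighted_estimate(successes, trials, lam=0.9):
    """Estimate the current success rate from per-period (successes,
    trials) counts, geometrically down-weighting older periods.
    Solves the weighted estimating equation
        sum_t lam**(T-1-t) * (x_t - n_t * p) = 0
    for p. lam near 1 uses more history (less variance, more bias
    under drift); lam near 0 uses essentially present-time data only."""
    T = len(trials)
    num = sum(lam ** (T - 1 - t) * successes[t] for t in range(T))
    den = sum(lam ** (T - 1 - t) * trials[t] for t in range(T))
    return num / den
```

At `lam = 1` this reduces to the pooled estimate over all history, and at `lam = 0` to the present-period estimate, making the bias/variance trade-off an explicit dial.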
Suppose we plan to assess a binary measurement system when the misclassification probabilities vary from part to part. We consider the estimation of the average error probabilities of such a system when a gold standard (error-free) system is available to verify the status of any part. We examine plans where we first measure a sample of n parts r times each with the binary measurement system. Then we study the impact on the precision and robustness of the estimates if we use the gold-standard system to verify the true status of none, some, or all of the sampled parts. We show that a partial verification plan has comparable performance to full verification in terms of the precision and robustness of the estimates while requiring as few as 10% of parts to be verified. When the gold-standard system is expensive or time consuming, eliminating the need to verify all parts dramatically reduces the cost of the assessment study.
Modern data collecting methods and computation tools have made it possible to monitor high-dimensional processes. In this article, we investigate phase II monitoring of high-dimensional processes when the available number of samples collected in phase I is limited in comparison to the number of variables. A new charting statistic for high-dimensional multivariate processes based on the diagonal elements of the underlying covariance matrix is introduced, and we propose a unified procedure for phases I and II by employing a self-starting control chart. To remedy the effect of outliers, we adopt a robust procedure for parameter estimation in phase I and introduce appropriate consistent estimators. The statistical performance of the proposed method is evaluated in phase II using the average run length (ARL) criterion in the absence and presence of outliers. Results show that the proposed control chart scheme effectively detects various kinds of shifts in the process mean vector. Finally, we illustrate the applicability of our proposed method via a manufacturing application.
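For intuition, a diagonal-covariance charting statistic can be sketched as a Hotelling-type distance that uses only the per-variable variances, which remains computable when the number of variables exceeds the phase I sample size (where the full sample covariance matrix is singular). This is a generic illustration of the idea, not the article's exact statistic, which also involves self-starting updates and robust estimation.

```python
def diag_t2(x, mu, s):
    """Diagonal-covariance charting statistic for a new observation x:
    sum of squared per-variable standardized deviations, replacing the
    inverse covariance matrix of Hotelling's T^2 with its diagonal.
    mu, s: per-variable in-control means and standard deviations."""
    return sum(((xi - mi) / si) ** 2
               for xi, mi, si in zip(x, mu, s))
```

Large values of the statistic indicate that the observation is far from the in-control mean in at least some coordinates; a control limit on this quantity then plays the role of the usual T-squared limit.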