Design Principles for Industrie 4.0 Scenarios Hermann, Mario; Pentek, Tobias; Otto, Boris
2016 49th Hawaii International Conference on System Sciences (HICSS),
01/2016
Conference Proceeding, Journal Article
Open Access
The increasing integration of the Internet of Everything into the industrial value chain has built the foundation for the next industrial revolution, called Industrie 4.0. Although Industrie 4.0 is currently a top priority for many companies, research centers, and universities, a generally accepted understanding of the term does not exist. As a result, discussing the topic on an academic level is difficult, and so is implementing Industrie 4.0 scenarios. Based on a quantitative text analysis and a qualitative literature review, the paper identifies design principles of Industrie 4.0. Taking these principles into account, academics may be enabled to further investigate the topic, while practitioners may find assistance in identifying appropriate scenarios. A case study illustrates how the identified design principles support practitioners in identifying Industrie 4.0 scenarios.
Designed to break with state-centric approaches to understanding economic development, global commodity chain, global value chain (GVC), and global production network (GPN) analyses have deepened our understanding of the corporate governance of global lead firms and associated development outcomes in an era of globalisation. Although this research field is recognised to have provided considerable insight into private governance, a rapidly emerging body of research has given greater attention to the role of the state in GVCs and GPNs. Although emphasis has often been placed on the state's role as a facilitator of firms' participation in GPNs, this article argues that a variety of other roles are of increasing prominence, including as regulator, producer (state-owned enterprises), and buyer (public procurement). A major challenge for both policymakers and researchers is to understand how a range of state initiatives not only shape but are also shaped by their positioning in GVCs and GPNs.
Probabilistic estimates of ice impact loading on the propulsion systems of vessels designed to operate in polar waters are necessary to assess the adequacy of current design specifications. Over the course of 13 days of operations in sea ice, high frequency inboard shaft line deformation measurements were recorded aboard the SA Agulhas II, a polar supply and research vessel, and inversion of the dataset performed to determine ice loading on the propeller. The inversion method filters out resonant vibration of the propulsion shaft around its natural frequency, and is implemented as a rapid algorithm developed for application to long time series full voyage data as well as real-time monitoring. Extreme value analysis of inferred ice-induced impact loading in active ice navigation was conducted to obtain Weibull and Gumbel distribution parameters for 1-second interval ice loading maxima. The resulting annual exceedance probability curves indicate that the loading specification used in the design of the propulsion systems for the SA Agulhas II implies a 1×10−5 to 5×10−5 probability of exceedance during a year of regular operations. These quantitative probabilistic reference values for the ice-induced impact loading exposure of a polar vessel operating in first-year and multi-year sea ice conditions represent information of key significance to propulsion design of future polar vessels.
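The workflow this abstract describes, fitting Weibull and Gumbel distributions to interval maxima and deriving annual exceedance probabilities, can be sketched as follows. This is a minimal illustrative analogue, not the paper's inversion algorithm: the load maxima here are synthetic stand-ins for the inverted shaft-line measurements, and the number of maxima per year is an assumed figure for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic 1-second ice-load maxima (kN); real values would come from
# inversion of the inboard shaft-line deformation measurements.
maxima = rng.weibull(1.5, size=5000) * 120.0

# Fit Gumbel and Weibull distributions to the interval maxima,
# as in the paper's extreme value analysis.
gum_loc, gum_scale = stats.gumbel_r.fit(maxima)
wb_shape, wb_loc, wb_scale = stats.weibull_min.fit(maxima, floc=0)

def annual_exceedance(load, n_per_year=3600 * 24 * 13):
    """Probability that `load` is exceeded at least once in a year,
    assuming n_per_year independent 1-second maxima (assumed count)."""
    p_single = stats.gumbel_r.sf(load, gum_loc, gum_scale)
    return 1.0 - (1.0 - p_single) ** n_per_year
```

Evaluating `annual_exceedance` over a range of load levels yields the kind of annual exceedance probability curve used to check a design load against a target probability such as 1×10−5.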
A thorough understanding of the long-term trends and extreme characteristics in the significant wave height (SWH) contributes greatly to coastal and offshore engineering activities and the mitigation of marine disasters. In this study, the annual spatiotemporal variability of the SWH in the South China Sea (SCS) and the return periods for six locations are evaluated based on ERA5 wave reanalysis over a long time period (1950–2020). Long-term trends are estimated using a popular non-parametric method, the Theil-Sen estimator, and then mapped to show the spatial variability of mean and extreme SWH. A basin-averaged analysis is also performed to investigate the general tendency of the mean SWH in the SCS, which increases at a rate of 0.11 cm/year. The well-known Mann-Kendall test is used to assess the significance of the trends. Significant positive trends in extreme SWH are mainly distributed in the eastern part of the central SCS around the Luzon Strait and in the southwestern part of the SCS. In the extreme value analysis, a comparison of two classical extreme value distribution models combined with five sampling methods shows that the Generalized Pareto Distribution with the Peak Over Threshold method (GPD-POT) is suitable for evaluating return periods in the SCS based on the 71-year SWH dataset. Extreme value analysis for different time lengths shows a correlation between the return level and the time span: small and medium samples may lead to unstable parameter estimation and increased errors. The 100-year return values obtained by GPD-POT using the 71-year wave reanalysis are more credible at P1–P6, with 8.86, 11.79, 11.35, 10.52, 6.50 and 7.71 m, respectively. Both the estimated return periods and the number of extreme events quantified by the Method of Independent Storms (MIS) indicate that extreme events are closely related to the number of tropical cyclones.
Seasonally, most extreme events in the SCS occur from June to December, with the summertime maximum SWH located north of 15°N and shifting rapidly southward in the fall.
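The GPD-POT method at the core of this study can be sketched in a few lines: declare a high threshold, fit a Generalized Pareto Distribution to the excesses, and convert the fit into N-year return levels. The sketch below uses synthetic daily SWH values and a simple quantile threshold; the paper's declustering/sampling choices and the ERA5 data are not reproduced here.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Synthetic daily significant wave height (m); a stand-in for the
# 71-year ERA5 series analysed in the study.
swh = rng.gamma(2.0, 1.0, size=71 * 365)

# Peaks-over-threshold: keep exceedances above a high quantile.
threshold = np.quantile(swh, 0.98)
excess = swh[swh > threshold] - threshold

# Fit the Generalized Pareto Distribution to the excesses.
shape, _, scale = stats.genpareto.fit(excess, floc=0)

def return_level(n_years, obs_per_year=365):
    """Level exceeded on average once in n_years, given the
    empirical exceedance rate per year."""
    rate = len(excess) / (len(swh) / obs_per_year)  # exceedances / year
    return threshold + stats.genpareto.ppf(
        1 - 1 / (n_years * rate), shape, loc=0, scale=scale)

rl100 = return_level(100)  # 100-year return value, as reported at P1-P6
```

As the abstract notes, estimates like `rl100` become unstable when the sample is short, which is why the 71-year series is preferred over shorter subsets.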
Gridded data products, for example interpolated daily measurements of precipitation from weather stations, are commonly used as a convenient substitute for direct observations because these products provide a spatially and temporally continuous and complete source of data. However, when the goal is to characterize climatological features of extreme precipitation over a spatial domain (e.g., a map of return values) at the native spatial scales of these phenomena, then gridded products may lead to incorrect conclusions because daily precipitation is a fractal field and hence any smoothing technique will dampen local extremes. To address this issue, we create a new “probabilistic” gridded product specifically designed to characterize the climatological properties of extreme precipitation by applying spatial statistical analysis to daily measurements of precipitation from the Global Historical Climatology Network over the contiguous United States. The essence of our method is to first estimate the climatology of extreme precipitation based on station data and then use a data-driven statistical approach to interpolate these estimates to a fine grid. We argue that our method yields an improved characterization of the climatology within a grid cell because the probabilistic behavior of extreme precipitation is much better behaved (i.e., smoother) than daily weather. Furthermore, the spatial smoothing innate to our approach significantly increases the signal-to-noise ratio in the estimated extreme statistics relative to an analysis without smoothing. Finally, by deriving a data-driven approach for translating extreme statistics to a spatially complete grid, the methodology outlined in this paper resolves the issue of how to properly compare station data with output from earth system models. We conclude the paper by comparing our probabilistic gridded product with a standard extreme value analysis of the Livneh gridded daily precipitation product.
Our new data product is freely available on the Harvard Dataverse (https://bit.ly/2CXdnuj).
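The key idea in this abstract, fit the extreme-value climatology at the stations first and only then interpolate the smooth statistics to a grid, can be illustrated with a toy example. The coordinates, sample sizes, GEV fit, and linear interpolation below are all illustrative assumptions standing in for the paper's GHCN data and its spatial statistical model.

```python
import numpy as np
from scipy import stats
from scipy.interpolate import griddata

rng = np.random.default_rng(2)
# Synthetic station network over the contiguous US: coordinates and
# 50 annual precipitation maxima per station (mm/day).
n_stations = 80
lon = rng.uniform(-125, -67, n_stations)
lat = rng.uniform(25, 49, n_stations)
ann_max = rng.gumbel(40, 10, size=(n_stations, 50))

# Step 1: estimate the extreme-value climatology at each station,
# here via a GEV fit and its 20-year return value.
rv20 = np.empty(n_stations)
for i in range(n_stations):
    c, loc, scale = stats.genextreme.fit(ann_max[i])
    rv20[i] = stats.genextreme.ppf(1 - 1 / 20, c, loc=loc, scale=scale)

# Step 2: interpolate the smooth return-value field to a fine grid,
# rather than gridding the rough daily precipitation field itself.
glon, glat = np.meshgrid(np.linspace(-125, -67, 60),
                         np.linspace(25, 49, 40))
grid_rv20 = griddata((lon, lat), rv20, (glon, glat), method="linear")
```

Interpolating the return values instead of the daily fields is exactly what avoids the dampening of local extremes that the abstract attributes to standard gridded products.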
Spatially weighted averages of Palmer Drought Severity Index (PDSI) over central and southern California show that the 1 year 2014 drought was not as severe as previously reported, but it still is the most severe in the 1895–2014 instrumental record. Using the typical adjustment procedure that matches the mean and standard deviation of tree ring PDSI values to those of instrumental data shows over 10 droughts from 800 to 2006 that were more severe than the 1 year 2014 drought, with the 2014 drought having a return period of 140–180 years. Quantile mapping allows for a closer correspondence between instrumental and tree ring PDSI probability distributions and produces return periods of 700–900 years for the 1 year 2014 drought. Associated cumulative 3 and 4 year droughts, however, are estimated to be much more severe. The 2012–2014 drought is nearly a 10,000 year event, while the 2012–2015 drought has an almost incalculable return period and is completely without precedent.
Key Points
The 1 year 2014 California drought was a multicentury‐scale event
The cumulative 2012–2014 California drought was a multimillennial‐scale event
The relative severity of the cumulative 2012–2015 drought is unprecedented
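The quantile mapping that distinguishes this study's return-period estimates from the usual mean/standard-deviation matching can be sketched as follows. The PDSI series here are synthetic placeholders; only the mapping mechanics are illustrated, not the paper's tree-ring reconstruction.

```python
import numpy as np

rng = np.random.default_rng(3)
# Illustrative overlap-period PDSI from the instrumental record and
# from tree rings (values are synthetic, not the study's data).
instrumental = rng.normal(0.0, 2.5, 120)
tree_ring = rng.normal(0.3, 1.8, 120)

def quantile_map(x, source, target):
    """Map values of `x` from the source distribution onto the target
    distribution by matching empirical quantiles."""
    # Empirical CDF position of each value within the source sample...
    q = np.clip(np.searchsorted(np.sort(source), x) / len(source),
                0.0, 1.0)
    # ...mapped to the corresponding quantile of the target sample.
    return np.quantile(target, q)

# Adjust the tree-ring PDSI series onto the instrumental distribution.
adjusted = quantile_map(tree_ring, tree_ring, instrumental)
```

Because the full probability distributions are matched rather than just the first two moments, extreme tree-ring values land in the correct tail of the instrumental distribution, which is what pushes the 2014 return period from 140–180 years to 700–900 years.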
This work is motivated by the challenge organized for the 10th International Conference on Extreme-Value Analysis (EVA2017) to predict daily precipitation quantiles at the 99.8% level for each month at observed and unobserved locations. Our approach is based on a Bayesian generalized additive modeling framework that is designed to estimate complex trends in marginal extremes over space and time. First, we estimate a high non-stationary threshold using a gamma distribution for precipitation intensities that incorporates spatial and temporal random effects. Then, we use the Bernoulli and generalized Pareto (GP) distributions to model the rate and size of threshold exceedances, respectively, which we also assume to vary in space and time. The latent random effects are modeled additively using Gaussian process priors, which provide high flexibility and interpretability. We develop a penalized complexity (PC) prior specification for the tail index that shrinks the GP model towards the exponential distribution, thus preventing unrealistically heavy tails. Fast and accurate estimation of the posterior distributions is performed thanks to the integrated nested Laplace approximation (INLA). We illustrate this methodology by modeling the daily precipitation data provided by the EVA2017 challenge, which consist of observations from 40 stations in the Netherlands recorded during the period 1972–2016. Capitalizing on INLA’s fast computational capacity and powerful distributed computing resources, we conduct an extensive cross-validation study to select the model parameters that govern the smoothness of trends. Our results clearly outperform simple benchmarks and are comparable to the best-scoring approaches of the other teams.
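The marginal building blocks of this model, a high threshold, a Bernoulli exceedance rate, and a generalized Pareto distribution for excess sizes, combine into an extreme quantile as sketched below. This is a deliberately simplified stationary, single-station analogue: the paper's spatio-temporal random effects, PC prior, and INLA machinery are not reproduced, and the data are synthetic.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
# Synthetic daily precipitation (mm) for one station, 1972-2016;
# a stand-in for the EVA2017 challenge data.
precip = rng.gamma(0.4, 8.0, size=45 * 365)

# High threshold; the paper estimates it from a gamma model with
# random effects, here we simply take a high wet-day quantile.
u = np.quantile(precip[precip > 0], 0.95)

# Bernoulli component: probability that a day exceeds the threshold.
p_exceed = np.mean(precip > u)

# GP component: distribution of the exceedance sizes above u.
excess = precip[precip > u] - u
shape, _, scale = stats.genpareto.fit(excess, floc=0)

def quantile(tau):
    """Unconditional quantile at level tau > 1 - p_exceed, using
    P(X > x) = p_exceed * P(excess > x - u)."""
    return u + stats.genpareto.ppf(1 - (1 - tau) / p_exceed,
                                   shape, loc=0, scale=scale)

q998 = quantile(0.998)  # the challenge's target quantile level
```

In the paper itself all three components (threshold, exceedance rate, and GP parameters) carry additive spatio-temporal effects, so `quantile` would vary by station and month rather than being a single number.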