Spare part provisioning for asset-intensive companies is a complicated problem due to the large number of items, low demand rates, and the multi-echelon environment. A primary strategy for reducing the size of the problem and managing a large number of spare parts is the use of grouping techniques and data aggregation. In this paper, we address the question of how to reduce the size and complexity of large-scale, two-echelon, service part provisioning systems to benefit both inventory service levels and managerial processes while considering performance trade-offs. This paper contributes a performance-based inventory classification approach for a two-echelon inventory model by developing a novel ranking method. First, it defines the concept of the artificial stocking policy as a new classification criterion in the literature. Then, it adopts a non-subjective weighted linear scoring method for ranking items in the entire network. Finally, it presents a heuristic partitioning method, which is evaluated and compared with complete enumeration and eight alternative clustering and classification methods. The proposed model is implemented and tested in the context of the classic repairable spare part inventory model, VARI-METRIC. The results indicate that the proposed method is easy to apply and significantly outperforms the alternatives.
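As a rough illustration of a non-subjective weighted linear scoring step, the sketch below min-max normalizes each classification criterion and ranks items by a weighted sum. The criteria values, weights, and normalization choice are illustrative assumptions, not the paper's actual criteria (which include the artificial stocking policy).

```python
def normalize(values):
    """Min-max normalize criterion values to [0, 1]."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0] * len(values)
    return [(v - lo) / (hi - lo) for v in values]

def rank_items(items, weights):
    """items: name -> list of criterion values (higher = more critical).
    weights: one non-negative weight per criterion.
    Returns (names sorted by descending score, score per name)."""
    names = list(items)
    criteria = list(zip(*(items[n] for n in names)))   # one tuple per criterion
    norm = [normalize(list(c)) for c in criteria]
    scores = {n: sum(w * norm[j][i] for j, w in enumerate(weights))
              for i, n in enumerate(names)}
    return sorted(names, key=scores.get, reverse=True), scores
```

With demand rate and criticality as two hypothetical criteria, `rank_items({"A": [10, 5], "B": [2, 9], "C": [6, 1]}, [0.7, 0.3])` ranks item A first.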
This paper presents a massively parallel, cloud-computing framework for the ad hoc evaluation of discrete-event simulation (DES) models to enable broad exploration of the design space for model parameters. Parallel evaluation is enabled through the use of a serverless computing environment allowing thousands of simultaneous experiments, on demand, without the need to explicitly provision or manage hardware. A standard Simulation Evaluation application programming interface (API) was designed for evaluating simulation functions; it enables language independence between the client application and the simulation model, encouraging reuse of simulation models for multiple purposes (what-if analysis, ranking and selection, sensitivity analysis, or optimization). Extensions to the Java Simulation Library (JSL) enable rapid deployment of models built with the JSL as parameterized serverless functions implementing the Simulation Evaluation API. New Java packages facilitate the calling of any serverless functions that implement the Simulation Evaluation API.
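The shape of a parameterized simulation function behind a language-independent API can be sketched as follows, here in Python rather than Java and with a toy M/M/1 queueing model; the parameter names and handler signature are illustrative assumptions, not the actual Simulation Evaluation API.

```python
import json
import random
import statistics

def simulate_mm1(arrival_rate, service_rate, num_customers, seed):
    """Toy DES: customer waiting times in an M/M/1 queue via the
    Lindley recursion W_n = max(0, W_{n-1} + S_{n-1} - A_n)."""
    rng = random.Random(seed)
    w, s, waits = 0.0, 0.0, []
    for n in range(num_customers):
        if n > 0:
            a = rng.expovariate(arrival_rate)   # inter-arrival time A_n
            w = max(0.0, w + s - a)             # wait of customer n
        waits.append(w)
        s = rng.expovariate(service_rate)       # service time S_n
    return waits

def handler(event):
    """Serverless-style entry point: JSON request in, JSON response out,
    so a client in any language can call the model over HTTP."""
    p = json.loads(event)
    waits = simulate_mm1(p["arrivalRate"], p["serviceRate"],
                         p["numCustomers"], p.get("seed", 1))
    return json.dumps({"avgWait": statistics.mean(waits)})
```

A client then only needs to serialize a parameter set and parse the response, which is what makes the same model reusable for what-if analysis, ranking and selection, or optimization drivers.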
Emphasizes a hands-on approach to learning statistical analysis and model building through the use of comprehensive examples, problem sets, and software applications. With a unique blend of theory and applications, Simulation Modeling and Arena®, Second Edition integrates coverage of statistical analysis and model building to emphasize the importance of both topics in simulation. Featuring introductory coverage on how simulation works and why it matters, the Second Edition expands coverage on static simulation and the applications of spreadsheets to perform simulation. The new edition also introduces the use of the open source statistical package, R, for both performing statistical testing and fitting distributions. In addition, the models are presented in a clear and precise pseudo-code form, which aids in understanding and model communication. Simulation Modeling and Arena, Second Edition also features:
•Updated coverage of necessary statistical modeling concepts such as confidence interval construction, hypothesis testing, and parameter estimation
•Additional examples of the simulation clock within discrete event simulation modeling involving the mechanics of time advancement by hand simulation
•A guide to the Arena Run Controller, which features a debugging scenario
•New homework problems that cover a wider range of engineering applications in transportation, logistics, healthcare, and computer science
•A related website with an Instructor's Solutions Manual, PowerPoint® slides, test bank questions, and data sets for each chapter
Simulation Modeling and Arena, Second Edition is an ideal textbook for upper-undergraduate and graduate courses in modeling and simulation within statistics, mathematics, industrial and civil engineering, construction management, business, computer science, and other departments where simulation is practiced. The book is also an excellent reference for professionals interested in mathematical modeling, simulation, and Arena.
This paper develops a data-driven model that quantifies the benefits of supply chain collaboration initiatives such as a continuous replenishment program (CRP). CRP is a well-established supply chain collaboration program that is widely used in business. The model computes the cost savings of CRP for both partners, covering the inventory holding, transportation, and ordering/handling cost components. The savings drivers associated with each cost component are identified and used to quantify the impact of CRP. The model is applied in a healthcare supply chain case study where a manufacturer seeks to estimate the cost savings of a business relationship with a distributor. The results indicate that CRP reduces the total cost of the supply chain by 19.1%. In this instance, the distributor gains disproportionately more savings in the shared cost components. The variability and sensitivity of cost savings across the network reveal the supply chain parameters that affect the partner savings.
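The structure of the savings computation can be sketched as follows; the cost figures are made up for illustration and are not the case-study data behind the 19.1% result.

```python
# Hypothetical annual costs (thousands of dollars) before and after CRP.
before = {
    "manufacturer": {"holding": 120.0, "transport": 80.0, "ordering": 40.0},
    "distributor":  {"holding": 200.0, "transport": 60.0, "ordering": 100.0},
}
after = {
    "manufacturer": {"holding": 110.0, "transport": 70.0, "ordering": 35.0},
    "distributor":  {"holding": 140.0, "transport": 55.0, "ordering": 60.0},
}

def pct_savings(b, a):
    """Percent cost reduction from b to a."""
    return 100.0 * (b - a) / b

# Savings per partner across the holding/transport/ordering components.
partner_savings = {p: pct_savings(sum(before[p].values()), sum(after[p].values()))
                   for p in before}
# Savings for the supply chain as a whole.
chain_savings = pct_savings(sum(sum(c.values()) for c in before.values()),
                            sum(sum(c.values()) for c in after.values()))
```

In this made-up instance the distributor saves a larger percentage than the manufacturer, mirroring the asymmetric split of savings reported in the case study.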
Purpose - The purpose of this paper is to address the inefficiency in resource allocation for disaster relief procurement operations. It presents a holistic and reconfigurable procurement auctions-based framework which includes the announcement construction, bid construction and bid evaluation phases.

Design/methodology/approach - The holistic framework is developed in a way that auctioneers and bidders compete amongst each other in multiple rounds of the procurement auction. Humanitarian organizations in disaster locations are considered as auctioneers (buyers) and suppliers are considered as bidders.

Findings - Unique system parameters (e.g. announcement options, priority of items, bidder strategies, etc.) are introduced to represent the disaster relief environment in a practical way. The framework is verified by simulation and optimization techniques using the system characteristics of the disaster relief environment as an input. Based on the parameters and their values, behavioural changes of auctioneers and suppliers are observed.

Originality/value - Combining the three phases of procurement auctions is unique both in the auction literature and in the disaster relief research, and it helps the humanitarian organizations supply the immediate and long-term requirements in the disaster location more efficiently.
•We present a new model for integrating facility location and hardening decisions.
•We convert a natural three-level formulation to a single-level formulation.
•We present algorithms for the single- and bi-objective versions of the problem.
•We report the runtime of the single- and bi-objective algorithms.
•Risk was sizably improved without a comparable degradation to normal performance.
Two methods of reducing the risk of disruptions to distribution systems are (1) strategically locating facilities to mitigate against disruptions and (2) hardening facilities. These two activities have been treated separately in most of the academic literature. This article integrates facility location and facility hardening decisions by studying the minimax facility location and hardening problem (MFLHP), which seeks to minimize the maximum distance from a demand point to its closest located facility after facility disruptions. The formulation assumes that the decision maker is risk averse and thus interested in mitigating against the facility disruption scenario with the largest consequence, an objective that is appropriate for modeling facility interdiction. By taking advantage of the MFLHP’s structure, a natural three-stage formulation is reformulated as a single-stage mixed-integer program (MIP). Rather than solving the MIP directly, the MFLHP can be decomposed into sub-problems and solved using a binary search algorithm. This binary search algorithm is the basis for a multi-objective algorithm, which computes the Pareto-efficient set for the pre- and post-disruption maximum distance. The multi-objective algorithm is illustrated in a numerical example, and experimental results are presented that analyze the tradeoff between objectives.
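The binary-search idea can be illustrated on a plain p-center problem, i.e., without the disruption and hardening layers the MFLHP adds: since the optimal maximum distance must equal one of the point-to-site distances, search that sorted list for the smallest radius that still covers every demand point. The brute-force feasibility check below is an illustrative stand-in for the paper's sub-problems and only suits small instances.

```python
from itertools import combinations

def feasible(dist, p, d):
    """Can some p candidate sites cover every demand point within distance d?
    dist[i][j] = distance from demand point i to candidate site j."""
    sites = range(len(dist[0]))
    return any(all(min(row[j] for j in chosen) <= d for row in dist)
               for chosen in combinations(sites, p))

def minimax_distance(dist, p):
    """Binary search over the sorted distinct distances for the smallest
    feasible maximum demand-to-facility distance."""
    cands = sorted({d for row in dist for d in row})
    lo, hi = 0, len(cands) - 1
    while lo < hi:
        mid = (lo + hi) // 2
        if feasible(dist, p, cands[mid]):
            hi = mid          # feasible radius: try smaller
        else:
            lo = mid + 1      # infeasible: need a larger radius
    return cands[lo]
```

For demand points at positions 0, 4, 10 and candidate sites at 0, 5, 10 on a line, opening the two end sites gives a minimax distance of 4 when p = 2.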
Physician preference items (PPIs) are medical items recommended by physicians for use in medical procedures and other treatments. Because individual physicians recommend PPIs, the variety of item types that must be managed within a healthcare supply chain can increase over time. To manage the PPI selection process, healthcare organizations often select items through value analysis and discussion teams, a process that is highly subjective. To bring more structure to this process, this work uses multiple-objective decision analysis (MODA) to develop a quantitative framework for PPI selection. The established decision-making framework is based on the theory of multi-objective value analysis. It offers decision-makers a structured, informed guide for improving value analysis outcomes and advocating sustainable healthcare management strategies. The model was tested and validated through case studies of two different items in two hospitals in Jordan.
This paper presents a two-echelon non-repairable spare parts inventory system that consists of one warehouse and m identical retailers and implements the reorder point, order quantity (R, Q) inventory policy. We formulate the policy decision problem in order to minimize the total annual inventory investment subject to average annual ordering frequency and expected number of backorder constraints. In order to solve the problem, we decompose the system by echelon and location, derive expressions for the inventory policy parameters, and develop an iterative heuristic optimization algorithm. Experimentation showed that our optimization algorithm is an efficient and effective method for setting the policy parameters in large-scale inventory systems.
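A standard single-location building block behind such decompositions is the expected backorder level of an (r, Q) policy under Poisson lead time demand, using the fact that the inventory position is uniform on {r+1, ..., r+Q}. The truncated-sum implementation below is a generic textbook sketch, not the paper's echelon-specific expressions.

```python
from math import exp

def poisson_pmf(mu, x):
    """P(X = x) for X ~ Poisson(mu), computed iteratively."""
    p = exp(-mu)
    for k in range(1, x + 1):
        p *= mu / k
    return p

def loss(mu, y, tail=200):
    """First-order loss function G(y) = E[(X - y)^+], sum truncated at y + tail."""
    return sum((x - y) * poisson_pmf(mu, x) for x in range(y + 1, y + tail))

def expected_backorders(mu, r, q):
    """Average backorders for an (r, Q) policy: average the loss function
    over the uniformly distributed inventory position {r+1, ..., r+Q}."""
    return sum(loss(mu, y) for y in range(r + 1, r + q + 1)) / q
```

For example, with mu = 1, r = 0, and Q = 1 this reduces to G(1) = e^(-1), roughly 0.368 expected backorders.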
A continuous-review, two-echelon inventory system with one central warehouse and an arbitrary number of non-identical retailers is considered in this study. Retailers face independent Poisson demands and apply standard (r, nQ) policies. Filled orders at the central warehouse must be consolidated into loads before shipping to the retailer level. New modeling options for the backorder processing and load-building processes are considered. Employing simulation, a set of experiments is performed to illustrate how different processing rules for the backlogging and load-building queues affect the lead time experienced at the retailer level. Simulation results indicate that there are cases where considerable improvements can be gained from using different processing rules in the backlogging and load-building queues.
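As a minimal illustration of why load-building rules matter, the sketch below computes the extra delay each order incurs under one hypothetical rule: hold orders until a full load of fixed size accumulates, then ship the whole load. This is not one of the paper's processing rules, and leftover orders that never fill a load are simply ignored.

```python
def load_building_delays(arrival_times, load_size):
    """Delay added to each order by waiting for its load to fill.
    arrival_times must be sorted; a load ships when load_size orders accumulate."""
    delays = []
    n_full = len(arrival_times) - len(arrival_times) % load_size
    for start in range(0, n_full, load_size):
        batch = arrival_times[start:start + load_size]
        ship = batch[-1]                        # load completes at last arrival
        delays.extend(ship - t for t in batch)
    return delays
```

With orders arriving at times 0 through 5 and a load size of 3, the added delays are [2, 1, 0, 2, 1, 0]; a load size of 1 (ship immediately) adds no delay, which is the trade-off between consolidation savings and retailer lead time.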
This paper examines the robustness of lead time demand models for the continuous review (r, Q) inventory policy. A number of classic distributions (e.g. normal, lognormal, gamma, Poisson, and negative binomial) as well as distribution selection rules are examined under a wide variety of demand conditions. First, the models are compared to each other by assuming a known demand process and evaluating the errors associated with using a different model. Then, the models are examined using a large sample of simulated demand conditions. Approximation results for inventory performance measures (ready rate, expected number of backorders, and on-hand inventory levels) are reported. Results indicate that distribution selection rules have great potential for modeling the lead time demand.
►Robustness of lead time demand models is examined for the continuous review (r, Q) inventory policy. ►Novel strategies for selecting the most appropriate lead time demand distribution are introduced. ►Analytical and simulation evaluation are performed to examine classic distributions and selection rules. ►We find that distribution selection rules have great potential for modeling the lead time demand.
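One common family of selection rules picks a candidate distribution from the variance-to-mean ratio of the observed demand; the thresholds and candidate set below are illustrative assumptions, not the specific rules evaluated in the paper.

```python
def select_distribution(mean, variance, tol=0.05):
    """Pick a lead time demand model from the variance-to-mean ratio."""
    ratio = variance / mean
    if abs(ratio - 1.0) <= tol:
        return "poisson"            # variance roughly equals mean
    if ratio > 1.0:
        return "negative binomial"  # over-dispersed demand
    return "binomial"               # under-dispersed demand
```

For example, a sample with mean 10 and variance 25 is over-dispersed and would be modeled with a negative binomial distribution under this rule.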