•Studied a dynamic lot sizing problem with returns and hybrid products.
•Developed a metaheuristic algorithm to find near-optimal solutions efficiently.
•Explored the profitability conditions for producing hybrid products.
•The developed metaheuristic outperforms SA, VNS and SA_NL benchmarks by 2.1-2.5% on average.
•The proposed model performs well in medium-to-high holding cost environments.
This study addresses a variant of the dynamic lot sizing problem for a hybrid system with manufacturing and remanufacturing. In the system, manufactured and remanufactured products are produced on separate lines and sold in segmented markets. In addition to these two types of products, hybrid products are also produced in the system. Hybrids are used to meet excess manufactured-product demand and to integrate the two distinct lines. This study therefore investigates the profitability conditions for producing hybrid products. Using a variant of the dynamic lot sizing problem, called the dynamic lot sizing problem with returns and hybrids (DLSPRH), which is a constrained mixed-integer nonlinear programming problem, the performance of the system with hybrids is compared to that of the same system without hybrids. The DLSPRH is an NP-hard problem. A Genetic Algorithm based heuristic (GA_H) is proposed to solve the DLSPRH and its capacitated version from the literature. The performance of the algorithm is tested by comparing its results with Simulated Annealing (SA), Variable Neighborhood Search (VNS) and Simulated Annealing with Neighborhood List (SA_NL). Numerical experiments show that GA_H significantly outperforms the other metaheuristics: on average, GA_H performs 2.51%, 2.24% and 2.06% better than the SA, VNS and SA_NL algorithms, respectively. Another finding is that the system with hybrids performs well in medium-to-high holding cost environments, especially when remanufacturing demand is low. Additional managerial insights are also presented.
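The abstract gives no implementation details for GA_H. Purely as an illustrative sketch (not the authors' algorithm), a genetic algorithm for a lot sizing problem can evolve binary setup vectors, assuming a hypothetical total_cost(y) evaluator that prices any setup pattern y:

```python
import random

def genetic_lot_sizing(total_cost, horizon, pop_size=50, generations=200,
                       crossover_rate=0.9, mutation_rate=0.05):
    """Minimal GA over binary setup vectors y[t] (1 = produce in period t).

    `total_cost(y)` is an assumed, user-supplied evaluator returning the
    cost of the best plan consistent with setup pattern y.
    """
    pop = [[random.randint(0, 1) for _ in range(horizon)] for _ in range(pop_size)]
    best = min(pop, key=total_cost)
    for _ in range(generations):
        nxt = [best[:]]                           # elitism: keep the incumbent
        while len(nxt) < pop_size:
            # size-3 tournament selection of two parents
            p1, p2 = (min(random.sample(pop, 3), key=total_cost) for _ in range(2))
            if random.random() < crossover_rate:  # one-point crossover
                cut = random.randrange(1, horizon)
                child = p1[:cut] + p2[cut:]
            else:
                child = p1[:]
            for t in range(horizon):              # bit-flip mutation
                if random.random() < mutation_rate:
                    child[t] ^= 1
            nxt.append(child)
        pop = nxt
        best = min(pop, key=total_cost)
    return best, total_cost(best)
```

A real DLSPRH solver would additionally encode remanufacturing and hybrid production decisions and handle the nonlinear constraints; the loop above only shows the generic selection/crossover/mutation skeleton.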
•Develops a novel model to optimally solve the job shop rework-reprocessing problem.
•Proposes two algorithms that outperform the best of commonly used dispatch rules.
•The modified GA outperforms the MSTEPT algorithm by 6% to 10% for medium-sized job shops.
•The MSTEPT algorithm outperforms the modified GA by 6% to 10% for larger job shops.
•Both algorithms outperform an optimal solver in several of the instances tested.
Scheduling challenges are typical for electronics manufacturing services (EMS) providers. The rework and reprocessing of failed electronic components consume additional time on the production line, causing jobs to miss their due dates. This research proposes a mathematical model and a Modified Shortest Total Estimated Processing Time (MSTEPT) algorithm to minimize the Total Weighted Tardiness (TWT). It then develops a novel modified Genetic Algorithm approach to solve the scheduling problem with stochastic rework and reprocessing times. While the Genetic Algorithm has been developed as a methodology for scheduling problems in earlier research, the existing set of genes in the chromosomes of a regular Genetic Algorithm would not be able to handle jobs waiting to undergo reprocessing. The modified Genetic Algorithm in this research introduces the concept of priority genes, specifically encoded to handle jobs waiting to be reprocessed after they have been reworked. Experimental results indicate that the proposed modified GA outperforms the best of several commonly used dispatch rules in terms of solution quality. For small-to-medium-sized job shops, the proposed algorithm outperforms the results from the CPLEX® solver, as well as those from the MSTEPT algorithm.
With the dwindling supply of surface water, groundwater is increasingly being used as a source of fresh water in many cities across the world. Consequently, there is an increasing need to evaluate the groundwater potential of an area. Over the past few decades, Remote Sensing and GIS have been used for systematic investigations of the potential recharge of aquifers. As in major cities worldwide, the demand for water in Pune City is increasing every year and outstrips the supply of surface water. Artificial recharge techniques, especially rainwater harvesting (RWH), are being deployed globally to augment the supply of fresh water. This study delineated potential zones for artificial recharge across Pune City using Multi-Criteria Analysis and the Analytical Hierarchy Process (AHP). Groundwater recharge is directly influenced by surface characteristics such as rainfall, geology, soil type, Land Use/Land Cover (LULC), drainage, and lineaments/fractures. Hence, six such parameters, namely LULC, slope, soil texture, rainfall, drainage density, and geology, were considered to generate a groundwater recharge potential map. Based on the analysis, the study area was zoned into five classes, namely low, moderate, good, very good and high groundwater potential. About 45% of the city shows good to high potential for recharge. The results reveal that the high and good potential recharge zones lie in the western part of the city, whereas the central part (inner city) and the eastern part show medium to low potential for recharge. The results can help identify areas for recharge and formulate a framework for systematic recharge of the existing aquifers in the study area.
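For reference (a standard AHP computation, not code from the study), criterion weights for factors such as LULC, slope, and rainfall are commonly derived from a pairwise comparison matrix via its principal eigenvector, with a consistency check; a minimal sketch:

```python
import numpy as np

def ahp_weights(pairwise):
    """Derive AHP criterion weights from a pairwise comparison matrix.

    `pairwise[i][j]` is the judged importance of criterion i over j
    (Saaty's 1-9 scale). Weights come from the principal eigenvector;
    the consistency ratio (CR) should conventionally be below 0.1.
    """
    A = np.asarray(pairwise, dtype=float)
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()
    n = A.shape[0]
    ci = (eigvals[k].real - n) / (n - 1)           # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32, 8: 1.41}[n]
    return w, ci / ri

# Toy 3-criterion example (the study itself uses six criteria):
A = [[1,   3,   5],
     [1/3, 1,   3],
     [1/5, 1/3, 1]]
w, cr = ahp_weights(A)
print(w, cr)  # weights sum to 1; CR < 0.1 indicates acceptable consistency
```

The weighted criterion layers are then typically combined by weighted overlay in GIS to produce the recharge potential map.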
The objective of this research is to introduce a new machine learning ensemble approach, a hybridization of the Bagging ensemble (BE) and Logistic Model Trees (LMTree) named BE-LMTree, for improving the performance of landslide susceptibility models. The LMTree is a relatively new machine learning algorithm that has rarely been explored for landslide studies, whereas BE is an ensemble framework that has proven highly efficient for landslide modeling. The Upper Reaches Area of the Red River Basin (URRB) in the Northwest region of Viet Nam was employed as a case study. For this work, a GIS database for the URRB area was established, containing a total of 255 landslide polygons and eight predisposing factors, i.e., slope, aspect, elevation, land cover, soil type, lithology, distance to faults, and distance to rivers. The database was then used to construct and validate the proposed BE-LMTree model. The quality of the final BE-LMTree model was checked using a confusion matrix and a set of statistical measures. The results showed that the performance of the proposed BE-LMTree model is high, with a classification accuracy of 93.81% on the training dataset and a prediction capability of 83.4% on the validation dataset. When compared to the support vector machine model and the LMTree model, the proposed BE-LMTree model performs better; we therefore conclude that BE-LMTree could prove to be a new efficient tool for landslide modeling. This research could provide useful results for landslide modeling in landslide-prone areas.
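Logistic Model Trees are implemented in Weka (class LMT) rather than scikit-learn, so the sketch below only illustrates the bagging side of the hybrid with a decision tree as a stand-in base learner; the synthetic data (255 samples, 8 features, mirroring the database dimensions) is an assumption for runnability:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the GIS database: 255 samples, 8 predisposing factors.
X, y = make_classification(n_samples=255, n_features=8, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
                                                    random_state=0)

# Bagging ensemble: each member is fit on a bootstrap resample and the
# ensemble votes. A decision tree stands in for Weka's LMTree here.
# (`estimator` was named `base_estimator` in scikit-learn < 1.2.)
model = BaggingClassifier(estimator=DecisionTreeClassifier(), n_estimators=100)
model.fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))
```

Bagging reduces the variance of an unstable base learner, which is why it pairs well with tree-based models such as LMTree.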
As a central concept in fluid dynamics, stability is fundamental to understanding transitions from laminar to turbulent flow. In continuum flows, it is well established that a transition to turbulence can occur at subcritical Reynolds numbers, in contrast to theoretical predictions. In non-equilibrium molecular dynamics (NEMD), it has been widely observed that at a critical Reynolds number the fluid undergoes an ordering transition from an amorphous phase to a ‘string’ phase. Using the fluctuation theorem (FT) and the dissipation function, we generalize the classical continuum Reynolds-Orr equation to sheared molecular fluids by giving a natural description of the stochastic perturbations, i.e. fluctuations in shear stress. Via the Poincaré inequality, we arrive at a new stability criterion by providing a lower bound on the exponential decay of perturbations, which reduces to the classical continuum result in the limit of infinite system size. We investigate the nature of these velocity perturbations and the conditions necessary for growth in the kinetic energy of perturbations. We obtain a fluid-dependent estimate of the critical Reynolds number at which the fluid transitions to the string phase, thus providing a framework for generalizing classical continuum stability theories to the microscale.
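For context (the standard continuum result, not an equation quoted from the paper), the classical Reynolds-Orr equation tracks the kinetic energy E of a perturbation u' about a base flow U, and a Poincaré-type lower bound on the dissipation term yields exponential decay below a critical Reynolds number:

```latex
\frac{\mathrm{d}E}{\mathrm{d}t}
  = -\int_V \mathbf{u}' \cdot (\mathbf{u}' \cdot \nabla)\,\mathbf{U}\,\mathrm{d}V
    \;-\; \frac{1}{Re}\int_V \lVert \nabla \mathbf{u}' \rVert^2 \,\mathrm{d}V,
\qquad
E(t) \le E(0)\, e^{-\lambda t} \quad \text{for } Re < Re_c,
```

where λ > 0 depends on the Poincaré constant of the domain. The molecular generalization described in the abstract replaces the deterministic perturbation terms with statistics of the dissipation function, recovering this schematic form in the infinite-system limit.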
•Studied a scheduling problem with stochastic rework and reprocessing times in electronics manufacturing systems.
•Developed a mathematical model that captures different system parameters, including in-line and off-line reworks.
•Developed two greedy heuristics based on Total Estimated Processing Time (TEPT).
•Showed that the proposed algorithms outperform different dispatch rules in terms of solution quality and computation time.
This paper introduces a mathematical model for the scheduling problem with stochastic rework and reprocessing times, which is typical for electronics manufacturing services (EMS) providers. Since rework and reprocessing of jobs take additional time, resulting in missed due dates, this research aims to determine the optimal job sequence on the machines. A heuristic methodology is developed based on the Total Estimated Processing Time (TEPT), a linear combination of processing, rework, and reprocessing times. Jobs with different configurations of processing, rework, and reprocessing times, representing a High-Mix-Low-Volume (HMLV) setup, are tested on a single-machine job shop system to assess the effectiveness of the shortest-TEPT algorithm (STEPT). Improvements to the STEPT algorithm are made, and the modified STEPT algorithm (MSTEPT) is tested on a single-machine job shop setup with a larger number of jobs. Experimental results indicate that the proposed algorithm outperforms several commonly used dispatch rules in terms of solution quality and computation time. Experiments conducted on a multi-machine job shop setup with a larger number of jobs likewise indicate the superior performance of the MSTEPT algorithm.
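The abstract defines TEPT only as a linear combination of processing, rework, and reprocessing times; the following minimal sketch of a shortest-TEPT dispatch on a single machine uses illustrative combination weights and a toy TWT evaluation, all assumed rather than taken from the paper:

```python
def stept_sequence(jobs, w_proc=1.0, w_rework=1.0, w_reproc=1.0):
    """Sort jobs by Total Estimated Processing Time, shortest first.

    Each job is a dict with processing time `p`, expected rework time `r`,
    expected reprocessing time `q`, due date `d`, and tardiness weight `w`.
    The combination weights are illustrative assumptions.
    """
    def tept(j):
        return w_proc * j["p"] + w_rework * j["r"] + w_reproc * j["q"]
    return sorted(jobs, key=tept)

def total_weighted_tardiness(sequence):
    """Evaluate TWT for a single-machine sequence using expected times."""
    t, twt = 0.0, 0.0
    for j in sequence:
        t += j["p"] + j["r"] + j["q"]           # expected completion time
        twt += j["w"] * max(0.0, t - j["d"])    # weighted tardiness
    return twt

jobs = [{"p": 5, "r": 2, "q": 1, "d": 10, "w": 2},
        {"p": 3, "r": 4, "q": 2, "d": 8,  "w": 1}]
print(total_weighted_tardiness(stept_sequence(jobs)))
```

This is the classic shortest-processing-time idea extended to expected rework and reprocessing; the paper's MSTEPT refinements are not reproduced here.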
Purpose
– This paper aims to focus on integrating a lean framework in a high-mix-low-volume (HMLV) printed circuit board assembly (PCBA) environment to enhance current assembly processes and facility layouts. An HMLV PCBA environment is characterized by stochastic demand, a variety of products in terms of shapes and sizes, and different sequences of assembly and test operations, in addition to long cycle times and high fall-out rates.
Design/methodology/approach
– Preliminary analysis indicates that the push inventory control system led to longer cycle times, so various lean methodologies have been applied to enhance the assembly operations. In this research, Kanban sizes for different assembly lines are also estimated to integrate and implement a “pull system” into the lean framework (a standard sizing formula is sketched after this abstract). In addition, material movement and the facility layout have been studied to minimize work-in-process travel time. An “iterative-MAIC” approach has been applied to implement the lean principles.
Findings
– As a result, a lean manufacturing pilot line has been implemented to evaluate the effectiveness of the lean principles before rolling them out across the manufacturing floor. After the lean implementation, the cycle times of the pilot-line products decreased by 40 per cent and the number of defects decreased by 10-30 per cent, depending on the assembly process.
Originality/value
– There is limited literature that addresses lean transformation in an HMLV electronics manufacturing service provider handling several product types with different testing methodologies, frequent product revision changes and higher fall-out rates. Hence, in this research, lean manufacturing has been implemented in an HMLV PCBA environment, which has the challenges of varying demand with a mix of assembly and test operations for different product families.
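The paper's own Kanban sizing method is not given in the abstract; a commonly used formula (an assumption here, not the authors') sizes the number of kanbans N from the demand rate D, replenishment lead time L, safety factor α, and container capacity C:

```latex
N = \left\lceil \frac{D \, L \,(1 + \alpha)}{C} \right\rceil
```

For example, with D = 100 boards/day, L = 0.5 day, α = 0.1, and C = 20 boards per container, N = ⌈55/20⌉ = 3 kanbans.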
This paper develops a cost-efficient framework for flood vulnerability assessment at a local scale using a multi-parametric approach integrated with an Open Source Geographical Information System (GIS) and Open Remote Sensing data. The study focuses on generating a set of criteria covering three dimensions of flood vulnerability, namely exposure, sensitivity, and adaptive capacity (AC), in an index-based approach. These indicators were selected based on a robust analysis of the physical and socio-economic conditions of the study area. Flood exposure was generated from geomorphological and hydrological parameters integrated with flood water depth, distance to river channels, and the Modified Normalized Difference Water Index. Flood sensitivity was determined by the aggregation of local income, land use, poverty index, population density, and other parameters reflecting the socio-economic condition. The AC was evaluated based on the Normalized Difference Vegetation Index, the density of community service facilities, and other factors related to flood coping capacity. Finally, the flood vulnerability at the local scale was determined by integrating these contributing factors using an Analytical Hierarchy Process-based aggregation model. Results indicated that a total of 20 parameters influence the flood vulnerability of the research area. The findings also confirmed that, among the flood vulnerability indicators for Da Nang City, flood depth, land-use condition, and the drainage system are the key factors affecting the vulnerability level. The empirical assessment showed that the study area is significantly affected by flood vulnerability, with more than 60% of the area having a vulnerability level from moderate to very high. In addition, this paper points out that vulnerability research should be localized rather than always based on administrative units; this practice can make the decision-making process and adaptation plans more locally appropriate. Notably, this study attempted for the first time to evaluate the accuracy of the flood vulnerability map using field survey data and statistical reports on flood damage, which most previous studies have not done. The framework provides a valuable toolkit for flood management in data-scarce regions all over the world.
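The abstract does not spell out the aggregation formula; a typical AHP-weighted index of this kind (a schematic assumption, not necessarily the paper's exact model) combines normalized indicators x with AHP-derived weights w within each dimension and then composes the dimensions:

```latex
E = \sum_i w_i^{E}\, x_i^{E}, \qquad
S = \sum_j w_j^{S}\, x_j^{S}, \qquad
AC = \sum_k w_k^{AC}\, x_k^{AC},
\qquad
V = w_E\, E + w_S\, S - w_{AC}\, AC
```

The sign convention for adaptive capacity varies across studies: higher coping capacity lowers vulnerability, so AC enters either with a negative weight, as above, or through an inverted indicator scale.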
Spontaneous, random violations of the second law become relevant as length and/or time scales become very small. Modern statistical physics tells us that the second law then needs to be replaced by the fluctuation theorem and that, mathematically, the irreversible entropy evolves as a submartingale. This is illustrated for the Couette flow of a molecular fluid. At the continuum level, such phenomena lead to a framework of thermomechanics relying on stochastic (rather than deterministic) functionals of energy and entropy, which are applied in heat diffusion and thermoelasticity settings. Counterintuitive thermomechanical behaviors are also discussed in (1) the evolution of acceleration wavefronts of nanoscale thickness and (2) the fluid mechanics entering a permeability model of a poroelastic medium with nanoscale pores.
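For reference (the standard Evans-Searles form, not a formula quoted from the paper), the fluctuation theorem relates the probabilities of observing dissipation of opposite signs over a time t:

```latex
\frac{P(\Sigma_t = A)}{P(\Sigma_t = -A)} = e^{A},
```

where Σ_t is the dimensionless time-integrated dissipation function. Negative-dissipation trajectories thus occur with exponentially small but nonzero probability, and the second law is recovered on average since ⟨Σ_t⟩ ≥ 0.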
Accurate, high-resolution bathymetric data are a necessity for a wide range of coastal oceanographic research topics. Active sensing methods, such as ship-based soundings and Light Detection and Ranging (LiDAR), are expensive and time-consuming. Therefore, the significance of Satellite-Derived Bathymetry (SDB) has increased in the last ten years due to the availability of multi-constellation, multi-temporal, and multi-resolution remote sensing data as Open Data. Effective SDB algorithms have been proposed by many authors, but no ready-to-use software module has been available in a Geographical Information System (GIS) environment as yet. Hence, this study implements a Geographically Weighted Regression (GWR) based SDB workflow as a Geographic Resources Analysis Support System (GRASS) GIS module (i.image.bathymetry). Several case studies were carried out to examine the performance of the module with multi-constellation and multi-resolution satellite imagery for different study areas. The results indicate a strong correlation between SDB and reference depths. For instance, case study 1 (Puerto Rico, Northeastern Caribbean Sea) showed a coefficient of determination (R2) of 0.98 and a Root Mean Square Error (RMSE) of 0.61 m, case study 2 (Iwate, Japan) showed an R2 of 0.94 and an RMSE of 1.50 m, and case study 3 (Miyagi, Japan) showed an R2 of 0.93 and an RMSE of 1.65 m. The reference depths were acquired using LiDAR for case study 1 and an echo-sounder for case studies 2 and 3. Further, the estimated SDB has been used as one of the inputs for the Australian National University and Geoscience Australia (ANUGA) tsunami simulation model; the tsunami simulation results also show close agreement with post-tsunami survey data. The i.image.bathymetry module developed as part of this study is made available as an extension for Open Source GRASS GIS to facilitate wide use and future improvements.
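As background (the standard GWR estimator, not a formula specific to this module), geographically weighted regression fits location-specific coefficients by spatially weighted least squares, which lets the depth-reflectance relationship vary with local water and bottom conditions:

```latex
\hat{\boldsymbol{\beta}}(u_i, v_i)
  = \left( \mathbf{X}^{\top} \mathbf{W}(u_i, v_i)\, \mathbf{X} \right)^{-1}
    \mathbf{X}^{\top} \mathbf{W}(u_i, v_i)\, \mathbf{y},
```

where (u_i, v_i) is the location of pixel i, W(u_i, v_i) is a diagonal matrix of kernel weights that decay with distance from that location, X holds the spectral-band predictors, and y the reference depths used for calibration.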