Abstract
Given the upcoming High-Luminosity LHC Upgrade, the performance requirements for the trigger systems associated with the LHC experiments will increase due to the larger volume of data to be processed. One of the possibilities that the ATLAS Collaboration is evaluating for upgrading the software-based portion of its trigger system is the use of Graphics Processing Units (GPUs) as hardware accelerators. The present work focuses on the GPU acceleration of the Topological Clustering algorithm, which is used to reconstruct calorimeter showers by grouping cells according to their signal-to-noise ratio. A more GPU-parallelizable version of Topological Clustering, called Topo-Automaton Clustering, was implemented within AthenaMT, the software framework of the ATLAS trigger, and its results were compared to those of the standard CPU algorithm to ensure that physical validity is maintained. Time measurements suggest an average improvement of the event processing time by a factor between 3.5 and 5.5 (depending on the kind of event), though less than 20% of that time corresponds to the algorithm itself, suggesting that the main bottleneck lies in data transfers and conversions.
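The seed/grow/edge scheme behind Topological Clustering can be sketched as a breadth-first growth over cells ranked by signal-to-noise ratio. The sketch below is illustrative only: the 4/2/0 thresholds, the 2D grid, and 4-connectivity are simplifying assumptions standing in for the actual calorimeter geometry, and this is not the ATLAS or Topo-Automaton implementation.

```python
from collections import deque

def topo_cluster(snr, seed_t=4.0, grow_t=2.0, edge_t=0.0):
    """Grow clusters on a 2D grid of per-cell signal-to-noise ratios.

    Cells with |S/N| >= seed_t start a cluster; neighbours with
    |S/N| >= edge_t are absorbed, but only those with |S/N| >= grow_t
    keep expanding the cluster. Thresholds and 4-connectivity are
    illustrative assumptions, not the real calorimeter topology.
    """
    rows, cols = len(snr), len(snr[0])
    label = [[None] * cols for _ in range(rows)]
    clusters = []
    for r in range(rows):
        for c in range(cols):
            if label[r][c] is None and abs(snr[r][c]) >= seed_t:
                cid = len(clusters)
                members = [(r, c)]
                label[r][c] = cid
                queue = deque([(r, c)])
                while queue:
                    i, j = queue.popleft()
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ni, nj = i + di, j + dj
                        if 0 <= ni < rows and 0 <= nj < cols \
                                and label[ni][nj] is None \
                                and abs(snr[ni][nj]) >= edge_t:
                            label[ni][nj] = cid
                            members.append((ni, nj))
                            # only cells above the growth threshold expand further
                            if abs(snr[ni][nj]) >= grow_t:
                                queue.append((ni, nj))
                clusters.append(members)
    return clusters
```

The sequential scan over seeds is exactly what a cellular-automaton reformulation parallelizes: each cell can decide locally, per iteration, whether to adopt a neighbour's cluster label.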
Multi-stage assembly systems, where the demand for components depends on the market-driven demand for end products, are commonly encountered in practice. Production Planning and Control (PPC) systems for this production context include Kanban, Material Requirements Planning (MRP), Optimised Production Technology (OPT), and Demand Driven MRP (DDMRP). All four of these PPC systems are widely applied in practice, and the literature abounds on each of them. Yet studies comparing these systems are scarce and remain largely inconclusive. In response, this study uses simulation to assess the performance of all four PPC systems under different levels of bottleneck severity and due date tightness. Results show that MRP performs the worst, which can be explained by the enforcement of production start dates. Meanwhile, Kanban and DDMRP perform the best if there is no bottleneck. If there is a bottleneck, then DDMRP and OPT perform the best, with DDMRP realising lower inventory levels. If there is a severe bottleneck, then the performance results for DDMRP and OPT converge. This identification of contingency factors not only resolves some of the inconsistencies in the literature but also has important implications for the applicability of these four PPC systems in practice.
Abstract
Pollutants discharged by roads may impact water bodies and soils. The best method to characterise road runoff is monitoring, which is not always possible due to human or material constraints. Therefore, prediction tools can be a valuable method to manage road runoff discharges and protect the environment. The present work reviewed and evaluated international tools for road runoff quality prediction, in order to assess whether an existing tool could be suitable for wide usage by stakeholders in Europe. Four tools from the USA and Europe were selected and tested at 22 road sites located in regions with annual precipitation values ranging from 500 to 1,000 mm, across seven European countries. The results for the site median concentration (SMC) of total suspended solids (TSS), Zn, Cu, Pb and Cd showed coefficients of determination (R²) from 0.0004 to 0.2890 for the different pollutants and tools. It was concluded that none of the tools could predict road runoff pollutant concentrations, except in the country for which it had been calibrated. The findings support practitioners and researchers all over the world, pointing out directions and gaps to be filled regarding the management of road runoff discharges and the use of prediction tools.
Workload Control withholds orders from the shop floor in a backlog from which they are released to meet certain performance metrics. This release decision precedes the execution of orders at shop floor stations. For each station there are consequently three types of workload: indirect, released work that is still upstream of the station; direct, work that is currently at the station; and completed, work that is still on the shop floor but downstream of the station. Most release methods control an aggregate workload made up of some representation of at least two of these three workload types. Yet the core objective of Workload Control release methods relates to only one of the three types: to create a small, stable direct load in front of each station. Clearly, order release would be greatly simplified if only the direct load had to be considered. Using discrete event simulation, we show that Direct Workload Control leads to performance levels that match those of more complex and sophisticated approaches to Workload Control. Further, it greatly simplifies continuous order release, decentralising the release decision by allowing it to be executed at each gateway station. This has important implications for research and practice.
Highlights
Presents a new Workload Control release method that controls the direct load only.
The new method significantly simplifies workload calculations.
The new method can be decentralised with control exercised locally at gateway stations.
Simulation results demonstrate comparable performance to more sophisticated methods.
The new method improves the performance of large jobs.
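The release logic described above can be sketched as a simple check at each gateway station: release the next backlog order only while the gateway's direct load stays below its norm. The data shapes ('routing', 'ptimes'), the load accounting, and the norm values below are illustrative assumptions, not the paper's exact model.

```python
def try_release(backlog, direct_load, norms):
    """Continuous, decentralised release check for Direct Workload Control.

    Sketch under assumptions: direct_load maps each station to the sum of
    processing times of orders currently in front of it; an order is
    released at its gateway (the first station in its routing) whenever
    adding its processing time keeps the gateway within its norm.
    """
    released = []
    for order in list(backlog):  # iterate over a copy so we can remove
        gateway = order["routing"][0]
        if direct_load[gateway] + order["ptimes"][0] <= norms[gateway]:
            direct_load[gateway] += order["ptimes"][0]  # contribute to direct load
            backlog.remove(order)
            released.append(order)
    return released
```

Because each gateway consults only its own direct load, the decision needs no shop-wide aggregate workload, which is what allows it to be exercised locally.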
Purpose
Building on contingency theory, this paper aims to investigate the extent to which the “4Ps international adaptation strategy” and internationalization intensity shape the servitization–profitability relationship.
Design/methodology/approach
The authors use primary (survey) and secondary (archival) data to perform multiple regression analysis.
Findings
The results indicate a positive relationship between servitization and profitability, and international intensity strengthens this association. The effects, however, are not consistent across the 4Ps – the price international adaptation strategy strengthens the positive relationship between servitization and profitability, while product and place international adaptation strategies weaken that relationship.
Practical implications
The findings have implications for the role of international intensity and the 4Ps in the marketing servitization context.
Originality/value
The study provides guidance for small firms in realizing higher performance by leveraging the 4Ps in the servitization context. Counter to expectations, place and product adaptation lead to lower performance with increasing servitization, whereas price adaptation strengthens this relationship. The study adds to the international industrial management and marketing literature, providing evidence that contingency factors such as international marketing mix adaptation/standardization strategies moderate the servitization–profitability relationship.
Uncertainty has been shown to reduce the willingness to cooperate in various social dilemmas and to negatively affect prosocial behavior. However, some studies have shown that uncertainty does not always decrease prosocial behavior, depending on the type of uncertainty. More specifically, recent research has shown that prosocial behavior tends to increase under impact uncertainty, i.e. uncertainty about the consequences for others if they become infected. In addition, researchers have argued that intuition favors prosocial behavior while deliberation leads to selfish behavior. Our study explored how intuitive (time pressure) or deliberate (time delay) mental processing under outcome or impact uncertainty affects prosocial behavior in the context of the COVID-19 pandemic. Our sample consists of 496 participants, and we used a 4 (COVID-19 scenario: Control vs. Impact Uncertainty vs. Worst-Case vs. Indirect Transmission) by 2 (decision time: time delay vs. time pressure) between-subjects design. Results suggest that participants are more inclined to stay at home (prosocial intention) when forced to make their decisions intuitively rather than deliberately. Additionally, we found that uncertainty does not always decrease prosocial behavior: it seems that uncertainty does not affect prosocial intention in a scenario with a real infectious disease. These findings suggest that the distinction between outcome and impact uncertainty may depend on the realism of the experimental stimuli.
Material flow control mechanisms determine: (i) whether an order should be released onto the shop floor; and (ii) whether a station should be authorized to produce. Well-known approaches include Kanban, Drum-Buffer-Rope (DBR), Constant Work-in-Process (ConWIP), Paired-cell Overlapping Loops of Cards with Authorization (POLCA), Workload Control (WLC), and Control of Balance by Card Based Navigation (COBACABANA). The literature typically treats these approaches as competing, meaning studies argue for the superiority of one over another. However, a closer look reveals that existing mechanisms either focus on order release (ConWIP, DBR, WLC, and COBACABANA) or on production authorization (Kanban and POLCA). This study therefore calls for a paradigm shift and argues that the different mechanisms may play complementary rather than competing roles. Using simulation, we assess the performance of COBACABANA and POLCA in a high-variety make-to-order shop, a type of shop arguably in most need of material flow control given the importance of throughput times and delivery time adherence. Results demonstrate that COBACABANA outperforms POLCA, but the simultaneous adoption of both control mechanisms outperforms the use of either one in isolation. More specifically, adding POLCA production authorization to COBACABANA order release enables the superfluous direct load to be further reduced, resulting in shop floor throughput time reductions of between 15% and 26% while further reducing the percentage tardy and mean tardiness by up to 14%. Compared to no material flow control, the new combined mechanism realizes a reduction of almost 50% in the percentage tardy and more than 30% in mean tardiness.
POLCA is an important card-based control system for low volume, high variety production contexts. A job can only be produced at an upstream station if it has acquired a POLCA card that has returned ...from its downstream station. A common assumption in the POLCA literature is that cards are allocated to jobs as soon as they return to the upstream station. This dissects the queue in front of a station into jobs that have a card (and can be produced) and those that do not have a card (and cannot be produced). This artificially and prematurely constrains the dispatching decision, i.e. the decision concerning which job to produce next at a station. In response, this paper proposes integrating the card-allocation and dispatching decisions such that the allocation of POLCA cards to jobs is postponed until the dispatching decision is made. Simulation results demonstrate that this integrated approach does not improve performance under simple ERD dispatching, as is commonly applied in the POLCA literature. But when a more powerful rule is applied, percentage tardy and mean tardiness performance improve by more than 75% and 50%, respectively, for an integrated decision. Most importantly, results suggest that in production environments like the one considered in this study, the integrated approach dispenses with the use of POLCA altogether if a suitable priority rule is used.
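The integrated decision can be sketched as follows: instead of binding a returning card to a job on arrival, the card is allocated only when the station falls idle, to the highest-priority queued job whose card loop has a free card. The data shapes ('loop', 'due') and the priority function are illustrative assumptions, not the paper's simulation model.

```python
def dispatch(queue, free_cards, priority):
    """Integrated card-allocation-and-dispatching sketch for POLCA.

    queue      : jobs waiting at the station
    free_cards : free-card count per POLCA loop (e.g. "AB" for A->B)
    priority   : key function ranking jobs (lower value = more urgent)

    Scans jobs in priority order and starts the first one whose loop has
    a free card, allocating the card only at this moment.
    """
    for job in sorted(queue, key=priority):
        loop = job["loop"]  # card loop to the job's next station
        if free_cards.get(loop, 0) > 0:
            free_cards[loop] -= 1  # card allocated at dispatch time
            queue.remove(job)
            return job
    return None  # no startable job: wait for a card to return
```

Because the whole queue is ranked at dispatch time, a late-arriving urgent job can still be chosen first, which the eager card-on-return allocation would have prevented.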
Desorption electrospray ionization (DESI) mass spectrometry is an emerging technology for direct therapeutic drug monitoring in dried blood spots (DBS). Current DBS methods require manual application of small molecules as internal standards for absolute drug quantification. With industrial standardization in mind, we replaced the manual addition of the internal standard and built a three-layer setup for robust quantification of salicylic acid directly from DBS. We combined a dioctyl sodium sulfosuccinate weave facilitating sample spreading with a cellulose layer for the addition of isotope-labeled salicylic acid as internal standard and a filter paper for analysis of the standard-containing sample by DESI-MS. Using this setup, we developed a quantification method for salicylic acid from whole blood with a validated linear range from 10 to 2000 mg/L, a relative standard deviation (RSD) ≤ 14%, and coefficients of determination of 0.997. The limit of detection (LOD) was 8 mg/L and the lower limit of quantification (LLOQ) was 10 mg/L. Recovery rates in method verification by LC-MS/MS were 97 to 101% for blinded samples. Most importantly, a study in healthy volunteers after administration of a single dose of Aspirin provides evidence to suggest that the three-layer setup may enable individual pharmacokinetic and endpoint testing following blood collection by finger pricking by patients at home. Taken together, our data suggest that DBS-based quantification of drugs by DESI-MS on pre-manufactured three-layer cartridges may be a promising approach for future near-patient therapeutic drug monitoring.