In this paper, a new production, allocation, location, inventory holding, distribution, and flow problem for a sustainable-resilient health care network related to the COVID-19 pandemic under uncertainty is developed, integrating sustainability aspects and resiliency concepts. A multi-period, multi-product, multi-objective, multi-echelon mixed-integer linear programming (MILP) model is then formulated and designed for this network. Formulating a new MILP model to design a sustainable-resilient healthcare network during the COVID-19 pandemic and developing three hybrid meta-heuristic algorithms are the most important contributions of this research. A simulation approach is employed to estimate the required demand for medicines, and stochastic chance-constrained programming is proposed to cope with the uncertain parameters. Three meta-heuristic methods, multi-objective Teaching–Learning-Based Optimization (TLBO), Particle Swarm Optimization (PSO), and a Genetic Algorithm (GA), are applied to find Pareto solutions. Since meta-heuristic approaches are sensitive to their input parameters, the Taguchi approach is used to control and tune them. The quality of the Pareto frontiers obtained by the three methods on the experimental problems is compared using eight assessment metrics. To validate the model, a set of sensitivity analyses on important parameters and a real case study in the United States are provided. Based on the experimental results, computational times, and the eight assessment metrics, the proposed methodology works well for the considered problems. The results show that raising transportation costs steadily increases the total cost and the environmental impacts of sustainability, while the social responsibility of staff rises gradually between −20% and 0% but drops suddenly from 0% to +20%.
In terms of the resiliency of the proposed network, the trends climb slightly and steadily. The applications of this paper can be useful for hospitals, pharmacies, distributors, medicine manufacturers, and the Ministry of Health.
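The chance-constrained programming mentioned above has a standard deterministic equivalent when the uncertain parameter is normally distributed. The sketch below illustrates only that textbook conversion step, not the paper's actual formulation; the demand parameters and service level are hypothetical, and only the stdlib `statistics.NormalDist` is used:

```python
from statistics import NormalDist

def deterministic_capacity(mu, sigma, alpha):
    """Deterministic equivalent of the chance constraint
    P(x >= demand) >= alpha for demand ~ N(mu, sigma^2):
    x >= mu + z_alpha * sigma, with z_alpha the standard normal quantile."""
    z = NormalDist().inv_cdf(alpha)
    return mu + z * sigma

# e.g. mean medicine demand 100, std 15, 95% service level
capacity = deterministic_capacity(100, 15, 0.95)  # about 124.67
```

The same quantile trick turns each probabilistic constraint into a linear one, which is what keeps the resulting model an MILP.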
•This study focuses on visual and verbal cues in eWOM.
•We study the effect of visual and verbal cues on consumer intention and behavior.
•We draw upon Dual Coding Theory and use a mixed-method approach.
•Popularity and performance visual heuristics affect intentions and decisions.
•User-generated pictures also affect consumers' intentions and decisions.
Consumers increasingly use eWOM to make decisions about various products and services. However, few studies have investigated how different visual and verbal eWOM cues affect the intention and decision to visit tourist destinations and their attractions. The current study fills this gap by drawing on Dual Coding Theory and investigating the influence of verbal and visual eWOM cues on consumers’ intention and behavior. The findings of a field study and an experimental study revealed that eWOM mainly affects tourists’ intentions and decisions through visual cues. Specifically, popularity heuristics, performance visual heuristics, and user-generated pictures affect tourists’ intention and decision to visit a destination and its attractions. Interestingly, information quality did not affect tourists’ decisions. The study offers important theoretical and managerial implications.
This paper introduces a novel population-based bio-inspired meta-heuristic optimization algorithm, called Blood Coagulation Algorithm (BCA). BCA derives inspiration from the process of blood coagulation in the human body. The underlying concepts and ideas behind the proposed algorithm are the cooperative behavior of thrombocytes and their intelligent strategy of clot formation. These behaviors are modeled and utilized to underscore intensification and diversification in a given search space. A comparison with various state-of-the-art meta-heuristic algorithms over a test suite of 23 renowned benchmark functions reflects the efficiency of BCA. An extensive investigation is conducted to analyze the performance, convergence behavior and computational complexity of BCA. The comparative study and statistical test analysis demonstrate that BCA offers very competitive and statistically significant results compared to other eminent meta-heuristic algorithms. Experimental results also show the consistent performance of BCA in high dimensional search spaces. Furthermore, we demonstrate the applicability of BCA on real-world applications by solving several real-life engineering problems.
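The abstract does not give BCA's update equations, but the intensification/diversification loop common to such population-based meta-heuristics can be sketched generically. The update rule below is illustrative only, not BCA's actual thrombocyte-inspired model; the sphere benchmark is one of the standard test functions mentioned:

```python
import random

def metaheuristic_sketch(f, dim=5, pop=30, iters=200, seed=1):
    """Generic population-based minimization loop: each agent moves
    partly toward the best-known solution (intensification) plus a
    small random perturbation (diversification), and a move is kept
    only if it improves that agent. Illustrative only; not BCA."""
    rng = random.Random(seed)
    X = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop)]
    best = min(X, key=f)
    for _ in range(iters):
        for i, x in enumerate(X):
            cand = [xj + rng.random() * (bj - xj)   # pull toward best
                    + 0.1 * rng.gauss(0, 1)         # random exploration
                    for xj, bj in zip(x, best)]
            if f(cand) < f(x):                      # greedy acceptance
                X[i] = cand
        best = min(X, key=f)
    return best, f(best)

sphere = lambda v: sum(t * t for t in v)            # benchmark function
sol, val = metaheuristic_sketch(sphere)             # val approaches 0
```

Benchmark suites like the 23 functions used here compare algorithms by how quickly and reliably such loops drive `val` toward the known optimum.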
Software Defined Networking (SDN) marks a paradigm shift towards an externalized and logically centralized network control plane. A particularly important task in SDN architectures is that of controller placement, i.e., the positioning of a limited number of resources within a network to meet various requirements. These requirements range from latency constraints to failure tolerance and load balancing. In most scenarios, at least some of these objectives are competing, thus no single best placement is available and decision makers need to find a balanced trade-off. This work presents POCO, a framework for Pareto-based Optimal COntroller placement that provides operators with Pareto optimal placements with respect to different performance metrics. In its default configuration, POCO performs an exhaustive evaluation of all possible placements. While this is practically feasible for small and medium sized networks, realistic time and resource constraints call for an alternative in the context of large scale networks or dynamic networks whose properties change over time. For these scenarios, the POCO toolset is extended by a heuristic approach that is less accurate, but yields faster computation times. An evaluation of this heuristic is performed on a collection of real world network topologies from the Internet Topology Zoo. Utilizing a measure for quantifying the error introduced by the heuristic approach allows an analysis of the resulting trade-off between time and accuracy. Additionally, the proposed methods can be extended to solve similar virtual functions placement problems which appear in the context of Network Functions Virtualization (NFV).
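The Pareto-dominance filtering at the core of an exhaustive evaluation like POCO's default mode is a standard technique and can be sketched directly; the placement metric values below are hypothetical, not data from the paper:

```python
def pareto_front(points):
    """Keep placements not dominated by any other, with all metrics
    minimized: p dominates q if p <= q in every metric and p < q in
    at least one."""
    def dominates(p, q):
        return (all(a <= b for a, b in zip(p, q))
                and any(a < b for a, b in zip(p, q)))
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# hypothetical placements scored on (worst-case latency, load imbalance)
placements = [(4, 9), (5, 3), (7, 2), (6, 6), (8, 8)]
front = sorted(pareto_front(placements))  # [(4, 9), (5, 3), (7, 2)]
```

The exhaustive approach scores every possible placement this way; the heuristic extension trades completeness of this front for speed on large topologies.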
This paper presents an investigation of a simple generic hyper-heuristic approach upon a set of widely used constructive heuristics (graph coloring heuristics) in timetabling. Within the hyper-heuristic framework, a tabu search approach is employed to search for permutations of graph heuristics which are used for constructing timetables in exam and course timetabling problems. This underpins a multi-stage hyper-heuristic where the tabu search employs permutations upon a different number of graph heuristics in two stages. We study this graph-based hyper-heuristic approach within the context of exploring fundamental issues concerning the search space of the hyper-heuristic (the heuristic space) and the solution space. Such issues have not been addressed in other hyper-heuristic research. These approaches are tested on both exam and course benchmark timetabling problems and are compared with the fine-tuned bespoke state-of-the-art approaches. The results are within the range of the best results reported in the literature. The approach described here represents a significantly more generally applicable approach than the current state of the art in the literature. Future work will extend this hyper-heuristic framework by employing methodologies which are applicable on a wider range of timetabling and scheduling problems.
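The core idea, searching the heuristic space rather than the solution space, can be sketched minimally: tabu search over permutations of low-level constructive heuristics. The `evaluate` function, move rule, and parameters below are placeholders, not the paper's actual two-stage framework:

```python
import random

def tabu_over_permutations(heuristics, evaluate, iters=200, tabu_len=7, seed=0):
    """Tabu search in heuristic space: `evaluate(perm)` stands in for
    "construct a timetable by applying the heuristics in this order and
    return its penalty" (lower is better). Swap moves recently applied
    are held tabu for a few iterations."""
    rng = random.Random(seed)
    perm = list(heuristics)
    best, best_cost = perm[:], evaluate(perm)
    tabu = []                                   # recently used swap moves
    for _ in range(iters):
        i, j = rng.sample(range(len(perm)), 2)
        if (i, j) in tabu:
            continue
        cand = perm[:]
        cand[i], cand[j] = cand[j], cand[i]     # swap two heuristics
        if evaluate(cand) < evaluate(perm):     # accept improving swaps
            perm = cand
            tabu = (tabu + [(i, j)])[-tabu_len:]
        if evaluate(perm) < best_cost:
            best, best_cost = perm[:], evaluate(perm)
    return best, best_cost

# toy penalty: distance of each heuristic index from a target position
order, cost = tabu_over_permutations(
    [3, 1, 0, 2, 4], lambda p: sum(abs(v - i) for i, v in enumerate(p)))
```

The point of the design is generality: the tabu search never touches the timetable directly, so the same loop applies to any problem with a set of constructive heuristics.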
Consideration-set heuristics. Hauser, John R. Journal of Business Research, 08/2014, Volume 67, Issue 8.
Journal Article; Peer reviewed; Open access.
Consumers often choose products by first forming a consideration set and then choosing from among considered products. When there are many products to screen (or many features to evaluate), it is rational for consumers to use consider-then-choose decision processes and to do so with heuristic decision rules. Managerial decisions (product development, marketing communications, etc.) depend upon the ability to identify and react to consumers' heuristic consideration-set rules. We provide managerial examples and review the state-of-the-art in the theory and measurement of consumers' heuristic consideration-set rules. Advances in greedoid methods, Bayesian inference, machine-learning, incentive alignment, measurement formats, and unstructured direct elicitation make it feasible and cost-effective to understand, quantify, and simulate “what-if” scenarios for a variety of heuristics. These methods now apply to a broad set of managerial problems including applications in complex product categories with large numbers of product features and feature-levels.
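One of the simplest consideration-set rules in this literature, conjunctive screening followed by utility-maximizing choice, can be sketched as follows; the product data, feature names, and utility function are hypothetical:

```python
def consider_then_choose(products, must_haves, utility):
    """Consider-then-choose with a conjunctive screening heuristic:
    a product enters the consideration set only if it meets every
    must-have feature threshold; the final choice is the considered
    product with the highest utility. Returns None if nothing survives."""
    considered = [p for p in products
                  if all(p[feat] >= level
                         for feat, level in must_haves.items())]
    return max(considered, key=utility) if considered else None

# hypothetical product data
products = [
    {"name": "A", "battery": 9, "score": 6},
    {"name": "B", "battery": 5, "score": 9},   # screened out: battery < 7
    {"name": "C", "battery": 8, "score": 8},
]
choice = consider_then_choose(products, {"battery": 7}, lambda p: p["score"])
# choice["name"] == "C": B has the highest utility but fails screening
```

Note how screening changes the outcome: the utility-maximal product overall (B) is never considered, which is exactly why managers need to identify the screening rule, not just the utility function.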
Many contemporary accounts of human reasoning assume that the mind is equipped with multiple heuristics that could be deployed to perform a given task. This raises the question of how the mind determines when to use which heuristic. To answer this question, we developed a rational model of strategy selection, based on the theory of rational metareasoning developed in the artificial intelligence literature. According to our model people learn to efficiently choose the strategy with the best cost-benefit tradeoff by learning a predictive model of each strategy's performance. We found that our model can provide a unifying explanation for classic findings from domains ranging from decision-making to arithmetic by capturing the variability of people's strategy choices, their dependence on task and context, and their development over time. Systematic model comparisons supported our theory, and 4 new experiments confirmed its distinctive predictions. Our findings suggest that people gradually learn to make increasingly more rational use of fallible heuristics. This perspective reconciles the 2 poles of the debate about human rationality by integrating heuristics and biases with learning and rationality.
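The cost-benefit strategy selection the model describes can be illustrated with a bare running-average variant; the actual model learns feature-based performance predictions, and the strategy names and numbers below are hypothetical:

```python
def select_strategy(stats, time_cost=0.1):
    """Pick the strategy with the best estimated reward-minus-cost,
    where estimates are running averages of each strategy's observed
    accuracy (reward) and runtime. Simplest possible variant of
    learned strategy selection; illustrative only."""
    def score(s):
        return (sum(s["rewards"]) / len(s["rewards"])
                - time_cost * sum(s["times"]) / len(s["times"]))
    return max(stats, key=lambda name: score(stats[name]))

# hypothetical history: the exact strategy is accurate but slow
stats = {
    "exact":     {"rewards": [1.0, 1.0], "times": [8.0, 10.0]},
    "heuristic": {"rewards": [0.8, 0.9], "times": [1.0, 1.0]},
}
# select_strategy(stats) == "heuristic": 0.85 - 0.1 beats 1.0 - 0.9
```

The example captures the paper's central point: a fallible heuristic is the rational choice once the time cost of the exact strategy is priced in, and the selector gets better as its performance estimates accumulate.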
Heuristic evaluation is frequently employed to evaluate usability. While general heuristics are suitable to evaluate most user interfaces, there is still a need to establish heuristics for specific domains to ensure that their specific usability issues are identified. This paper presents a comprehensive review of 70 studies related to usability heuristics for specific domains. The aim of this paper is to review the processes that were applied to establish heuristics in specific domains and identify gaps in order to provide recommendations for future research and areas of improvement. The most urgent issue found is the deficiency of validation effort following the proposition of heuristics and the lack of robustness and rigour in the validation methods adopted. Whether domain-specific heuristics perform better or worse than general ones is inconclusive, due to the lack of validation quality and of clarity on how to assess the effectiveness of heuristics for specific domains. The lack of validation quality also hampers efforts to improve existing domain-specific heuristics, as their weaknesses are not addressed.
•Analytical review of 70 studies of domain-specific heuristics for usability evaluation.
•There is a deficiency of validation effort following heuristics proposition.
•It is inconclusive whether domain-specific heuristics are better than general ones.
•Less than 10% of the studies showed acceptable robustness and rigour.
•More than 80% of the studies used heuristics similar to Nielsen's.
Identifying structural damage is an essential task for ensuring the safety and functionality of civil, mechanical, and aerospace structures. In this study, the structural damage identification scheme is formulated as an optimization problem, and a new meta-heuristic optimization algorithm, called visible particle series search (VPSS), is proposed to tackle it. The proposed VPSS algorithm is inspired by the visibility graph technique, which is used primarily to convert a time series into a graph network. In the proposed VPSS algorithm, the population of candidate solutions is regarded as a particle series and is further mapped into a visibility graph network to obtain visible particles. The information captured from the visible particles is then utilized by the algorithm to seek the optimum solution over the search space. The general performance of the proposed VPSS algorithm is first verified on a set of mathematical benchmark functions, and, afterward, its ability to identify structural damage is assessed by conducting various numerical simulations. The results demonstrate the high accuracy, reliability, and computational efficiency of the VPSS algorithm for identifying the location and the extent of damage in structures.
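The visibility graph technique that inspires VPSS is well defined for time series (the natural visibility criterion of Lacasa et al.) and can be sketched directly; how VPSS then exploits the visible particles is not reproduced here:

```python
def visibility_graph(series):
    """Natural visibility graph of a time series: samples i and j see
    each other, and become connected nodes, if every sample strictly
    between them lies below the straight line joining
    (i, series[i]) and (j, series[j])."""
    n, edges = len(series), set()
    for i in range(n):
        for j in range(i + 1, n):
            if all(series[k] < series[j]
                   + (series[i] - series[j]) * (j - k) / (j - i)
                   for k in range(i + 1, j)):
                edges.add((i, j))
    return edges

# the middle valley is visible from both peaks, so all pairs connect
edges = sorted(visibility_graph([3, 1, 2]))  # [(0, 1), (0, 2), (1, 2)]
```

In VPSS terms, the "series" would be the population of candidate solutions ordered as a particle series; the resulting edges identify which particles are mutually visible and thus share search information.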
Registered Replication Report. Bouwmeester, S.; Verkoeijen, P. P. J. L.; Aczel, B. ... Perspectives on Psychological Science, 05/2017, Volume 12, Issue 3.
Journal Article; Peer reviewed; Open access.
In an anonymous 4-person economic game, participants contributed more money to a common project (i.e., cooperated) when required to decide quickly than when forced to delay their decision (Rand, Greene & Nowak, 2012), a pattern consistent with the social heuristics hypothesis proposed by Rand and colleagues. The results of studies using time pressure have been mixed, with some replication attempts observing similar patterns (e.g., Rand et al., 2014) and others observing null effects (e.g., Tinghög et al., 2013; Verkoeijen & Bouwmeester, 2014). This Registered Replication Report (RRR) assessed the size and variability of the effect of time pressure on cooperative decisions by combining 21 separate, preregistered replications of the critical conditions from Study 7 of the original article (Rand et al., 2012). The primary planned analysis used data from all participants who were randomly assigned to conditions and who met the protocol inclusion criteria (an intent-to-treat approach that included the 65.9% of participants in the time-pressure condition and 7.5% in the forced-delay condition who did not adhere to the time constraints), and we observed a difference in contributions of -0.37 percentage points compared with an 8.6 percentage point difference calculated from the original data. Analyzing the data as the original article did, including data only for participants who complied with the time constraints, the RRR observed a 10.37 percentage point difference in contributions compared with a 15.31 percentage point difference in the original study. In combination, the results of the intent-to-treat analysis and the compliant-only analysis are consistent with the presence of selection biases and the absence of a causal effect of time pressure on cooperation.