The whole frame of interconnections in complex networks hinges on a specific set of structural nodes, much smaller than the total size, which, if activated, would cause the spread of information to the whole network, or, if immunized, would prevent the diffusion of a large scale epidemic. Localizing this optimal, that is, minimal, set of structural nodes, called influencers, is one of the most important problems in network science. Despite the vast use of heuristic strategies to identify influential spreaders, the problem remains unsolved. Here we map the problem onto optimal percolation in random networks to identify the minimal set of influencers, which arises by minimizing the energy of a many-body system, where the form of the interactions is fixed by the non-backtracking matrix of the network. Big data analyses reveal that the set of optimal influencers is much smaller than the one predicted by previous heuristic centralities. Remarkably, a large number of previously neglected weakly connected nodes emerges among the optimal influencers. These are topologically tagged as low-degree nodes surrounded by hierarchical coronas of hubs, and are uncovered only through the optimal collective interplay of all the influencers in the network. The present theoretical framework may hold a larger degree of universality, being applicable to other hard optimization problems exhibiting a continuous transition from a known phase.
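One well-known heuristic that grew out of this optimal-percolation framework is Collective Influence (CI), which scores node i as (k_i − 1) times the sum of (k_j − 1) over the nodes j on the frontier of a ball of radius ℓ around i. A minimal sketch, assuming a plain adjacency-dict representation (the toy graph and function names are ours, not from the paper):

```python
# Collective Influence sketch: CI_l(i) = (k_i - 1) * sum_{j in frontier_l(i)} (k_j - 1)

def frontier(adj, node, radius):
    """Nodes at exactly `radius` hops from `node` (one BFS layer)."""
    seen, layer = {node}, {node}
    for _ in range(radius):
        layer = {v for u in layer for v in adj[u] if v not in seen}
        seen |= layer
    return layer

def collective_influence(adj, node, radius=2):
    """Score a node by its own degree and the degrees on its ball's frontier."""
    return (len(adj[node]) - 1) * sum(len(adj[j]) - 1
                                      for j in frontier(adj, node, radius))

# Toy "corona" graph: low-degree node 0 bridges two hubs (1 and 2),
# each hub surrounded by degree-1 leaves.
adj = {0: [1, 2], 1: [0, 3, 4, 5], 2: [0, 6, 7, 8],
       3: [1], 4: [1], 5: [1], 6: [2], 7: [2], 8: [2]}
top = max(adj, key=lambda n: collective_influence(adj, n, radius=1))
```

In this toy graph the low-degree bridge node 0 scores highest at radius 1, illustrating how weakly connected nodes surrounded by hubs can outrank the hubs themselves.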
Microfoundations have become an important theme in recent macromanagement research. However, the international management (IM) field is an exception to this. We document the lack of attention to microfoundations in IM research by focusing on knowledge sharing – a key IM research field – which we investigate by means of a keyword-based literature study of the leading IM and general management journals. We discuss possible reasons why microfoundations have so far met with less resonance in IM research. We point to the training and background of IM scholars as possible reasons. We also highlight the significance that IM scholars place on context and structure in explanation. These may be seen as contrary to a microfoundations perspective, a view that we show is incorrect. We end by identifying several microfoundational issues in IM research, calling for a sustained effort with respect to theory, heuristics, and empirics.
Research summary
Enterprises in low‐resource contexts often rely on bricolage (i.e., making do by applying resources at hand to new problems). However, bricolage has traditionally been regarded as a way to temporarily get by, potentially constraining growth if continued over time. This has been explained by factors such as limited development of learning competencies. Surprisingly, we encountered a social organization appearing to use bricolage to scale extensively into a variety of locations. This puzzling observation prompted our research question: Can bricolage be scaled, and if so, how and why? We embarked on a process study of this organization, leading to a novel conceptual model of scaling bricolage: as a low‐cost replication process of heuristics, enabling fit with a diversity of local environments, as well as cross‐unit learning.
Managerial summary
How do organizations emerge, survive, and scale in resource‐scarce environments? Traditional scaling models tend to rely on considerable financial resources and companies often struggle to adjust to diverse contexts. In contrast, we identified and studied an organization in Sub‐Saharan Africa that we argue used simple rules to scale bricolage—making the best out of what is at hand—successfully in diverse low‐resource contexts. Our paper provides a novel conceptual model of scaling bricolage: a low‐cost replication process of heuristics, enabling fit with a diversity of local environments, as well as cross‐unit innovation and learning.
Reports an error in "Evaluating categories from experience: The simple averaging heuristic" by Thomas K. A. Woiczyk and Gaël Le Mens (Journal of Personality and Social Psychology, 2021, Vol 121(4), 747-773). There was an error in Figure 7. In the two plots of the second row, the data previously labeled as "Equal" correspond to "Natural" and the data previously labeled "Natural" correspond to "Equal." The online version of this article has been corrected. (The following abstract of the original article appeared in record 2021-89474-001.) We analyze how people form evaluative judgments about categories based on their experiences with category members. Prior research suggests that such evaluative judgments depend on some experience average but is unclear about the specific kind of average. We hypothesized that evaluations of categories could be driven either by the simple average of experiences with the category or by the member average (the average of the evaluations of the category members, where the evaluation of a category member is the average of experiences with this particular member). Understanding whether evaluations of categories are driven by the simple average or the member average is important in settings where people obtain unbalanced numbers of observations about category members, such as when people form opinions about a social group and predominantly interact with just a few members of this group. Across nine studies (N = 1,966), we consistently found that evaluative judgments about categories were better explained by the simple average than by the member average. We call the underlying cognitive strategy the simple averaging heuristic. Collected evidence indicates that participants relied on simple averaging even in settings where normative principles required avoiding the use of this cognitive strategy, leading to systematic mistakes.
Our findings contribute to several areas of social cognition such as research on redundancy biases, information aggregation, social sampling, and norm perceptions. (PsycInfo Database Record (c) 2024 APA, all rights reserved)
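The two averages contrasted above are easy to state computationally. A minimal sketch, with hypothetical data chosen to show how they diverge under unbalanced sampling:

```python
def simple_average(obs_by_member):
    """Average over all experiences, pooled regardless of member."""
    pooled = [x for obs in obs_by_member.values() for x in obs]
    return sum(pooled) / len(pooled)

def member_average(obs_by_member):
    """Average of the per-member averages."""
    means = [sum(obs) / len(obs) for obs in obs_by_member.values()]
    return sum(means) / len(means)

# Unbalanced sampling: four encounters with member A, only one with B.
experiences = {"A": [8, 8, 8, 8], "B": [2]}
```

With balanced observations the two coincide; here they diverge (6.8 vs. 5.0), which is exactly the regime the studies probe.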
A distinctive feature of software‐defined networking (SDN) is a logically centralized control plane realized using multiple physical controllers. The placement of the controllers, the so‐called controller placement problem (CPP), is a crucial design issue. It influences network performance parameters such as latency, flow setup time, network availability, load balance of the controllers, and energy consumption. In this article, we illustrate the formulation of these CPP objectives. We categorize the CPP design solutions as either static or adaptive. In adaptive CPP, the proposed solutions dynamically adapt the number of controllers required and the switch-to-controller mapping to varying network traffic. We further differentiate adaptive CPP as wired or wireless. The optimization strategies adopted by the papers are analyzed and grouped into five categories: exact, heuristic, meta‐heuristic, clustering, and game theory. The merits and demerits of each approach are discussed. In conclusion, we outline the research challenges worth investigating.
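The "exact" category above can be illustrated with a brute-force search over placements against a single objective, average switch-to-controller latency. A minimal sketch; the latency matrix and the 4-switch line topology are illustrative assumptions, not from the survey:

```python
from itertools import combinations

def place_controllers(latency, k):
    """Exact CPP for one objective: choose k controller sites minimizing
    total switch-to-nearest-controller latency (brute force; small nets only)."""
    n = len(latency)
    best, best_cost = None, float("inf")
    for sites in combinations(range(n), k):
        cost = sum(min(latency[s][c] for c in sites) for s in range(n))
        if cost < best_cost:
            best, best_cost = sites, cost
    return best, best_cost / n  # chosen sites and average latency

# Hypothetical 4-switch line topology; latency = hop count.
lat = [[abs(i - j) for j in range(4)] for i in range(4)]
sites, avg_latency = place_controllers(lat, 2)
```

Exact enumeration is only tractable for small networks, which is why the surveyed literature turns to heuristic, meta-heuristic, clustering, and game-theoretic strategies at scale.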
This paper aims to tackle the Patient Admission Scheduling Problem (PASP) using the Discrete Flower Pollination Algorithm (DFPA), a new meta-heuristic optimization method based on plant pollination. PASP is one of the most important problems in the field of health care. It is a highly constrained combinatorial optimization problem of assigning patients to medical resources in a hospital, subject to predefined constraints, while maximizing patient comfort. While the flower pollination algorithm was designed for continuous optimization domains, a discretization of the algorithm has been carried out for application to the PASP. Various neighborhood structures have been employed to enhance the method and to explore more solutions in the search space. The proposed method has been tested on six instances of benchmark datasets for comparison against another algorithm using the same dataset. The proposed method is shown to be efficient in solving this scheduling problem.
•A discrete flower pollination algorithm is proposed for the PASP.
•A discretization procedure is invoked to address this kind of problem.
•Three neighborhood structures have been employed to enhance the method.
•The proposed method has been successful in obtaining promising feasible solutions.
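The abstract does not spell out the three neighborhood structures, but two generic moves commonly used for discrete assignment problems like the PASP can be sketched as follows (function names and the room-assignment encoding are our own assumptions):

```python
import random

def swap_rooms(assignment):
    """Neighborhood move 1: exchange the resources of two random patients."""
    a = assignment[:]
    i, j = random.sample(range(len(a)), 2)
    a[i], a[j] = a[j], a[i]
    return a

def reassign_patient(assignment, n_rooms):
    """Neighborhood move 2: move one random patient to a different room."""
    a = assignment[:]
    i = random.randrange(len(a))
    a[i] = random.choice([r for r in range(n_rooms) if r != a[i]])
    return a
```

A discrete search would repeatedly generate such candidates around the current solution and keep those that improve the constrained objective.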
It is broadly assumed that political elites (e.g. party leaders) regularly rely on heuristics in their judgments or decision-making. In this article, I aim to bring together and discuss the scattered literature on this topic. To address the current conceptual unclarity, I discuss two traditions on heuristics: (1) the heuristics and biases (H&B) tradition pioneered by Kahneman and Tversky and (2) the fast and frugal heuristics (F&F) tradition pioneered by Gigerenzer et al. I propose to concentrate on two well-defined heuristics from the H&B tradition—availability and representativeness—to empirically assess when political elites rely on heuristics and thereby understand better their judgments and decisions. My review of existing studies supports the notion that political elites use the availability heuristic and possibly the representativeness one for making complex decisions under uncertainty. It also reveals that besides this, we still know relatively little about when political elites use which heuristic and with what effect(s). Therefore, I end by proposing an agenda for future research.
Fast and frugal forecasting. Goldstein, Daniel G.; Gigerenzer, Gerd. International Journal of Forecasting, 10/2009, Volume 25, Issue 4. Journal article, peer reviewed, open access.
Simple statistical forecasting rules, which are usually simplifications of classical models, have been shown to make better predictions than more complex rules, especially when the future values of a criterion are highly uncertain. In this article, we provide evidence that some of the fast and frugal heuristics that people use intuitively are able to make forecasts that are as good as or better than those of knowledge-intensive procedures. We draw from research on the adaptive toolbox and ecological rationality to demonstrate the power of using intuitive heuristics for forecasting in various domains including sport, business, and crime.
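As a generic toy illustration of the point above (the article's own domains are sport, business, and crime), a one-cue persistence rule can beat a running-mean forecast on trending data. The series and function names are our own illustrative assumptions:

```python
def persistence_forecast(series):
    """Frugal rule: predict each next value as the last observed one."""
    return series[:-1]

def running_mean_forecast(series):
    """More knowledge-intensive baseline: predict the mean of all history."""
    return [sum(series[:t]) / t for t in range(1, len(series))]

def mae(forecasts, actuals):
    """Mean absolute error of a forecast sequence."""
    return sum(abs(f - a) for f, a in zip(forecasts, actuals)) / len(actuals)

# On a steadily trending series, the frugal persistence rule wins.
series = [1, 2, 3, 4, 5]
actuals = series[1:]
```

Here persistence attains an MAE of 1.0 versus 1.75 for the running mean: ignoring most of the history is ecologically rational when the environment drifts.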
•A comprehensive review of the flowshop literature in the last 10 years is given.
•The best heuristics and metaheuristics are identified.
•A comprehensive computational and statistical evaluation is provided.
•19 heuristics and 12 metaheuristics are compared.
•We identify the state-of-the-art and propose a method for comparing heuristics.
The permutation flowshop problem is a classic machine scheduling problem where n jobs must be processed on a set of m machines disposed in series and where each job must visit all machines in the same order. Many production scheduling problems resemble flowshops and hence it has generated much interest and had a big impact in the field, resulting in literally hundreds of heuristic and metaheuristic methods over the last 60 years. However, most methods proposed for makespan minimisation are not properly compared with existing procedures so currently it is not possible to know which are the most efficient methods for the problem regarding the quality of the solutions obtained and the computational effort required. In this paper, we identify and exhaustively compare the best existing heuristics and metaheuristics so the state-of-the-art regarding approximate procedures for this relevant problem is established.
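The objective these methods compete on is the makespan of a job permutation, computed by the standard completion-time recursion C[j][m] = max(C[j-1][m], C[j][m-1]) + p[j][m]. A minimal sketch (the two-job instance is our own illustration):

```python
def makespan(perm, p):
    """Makespan of a permutation in an n-job, m-machine flowshop.
    p[j][m] = processing time of job j on machine m."""
    prev = [0] * len(p[0])  # completion times of the previous job on each machine
    for job in perm:
        t, cur = 0, []
        for m, pt in enumerate(p[job]):
            t = max(t, prev[m]) + pt  # wait for both predecessors
            cur.append(t)
        prev = cur
    return prev[-1]

# Two jobs, two machines: the processing order already matters.
p = [[3, 2], [1, 4]]
```

Even this tiny instance has different makespans for its two permutations, which is what makes the n! search space hard and heuristic comparison meaningful.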
Recently, iterated greedy algorithms have been successfully applied to solve a variety of combinatorial optimization problems. This paper presents iterated greedy algorithms for solving the blocking flowshop scheduling problem (BFSP) with the makespan criterion. The main contributions of this paper can be summarized as follows. We propose a constructive heuristic to generate an initial solution. The constructive heuristic generates better results than those currently in the literature. We employ and adapt well-known speed-up methods from the literature for both insertion and swap neighborhood structures. In addition, an iteration jumping probability is proposed to change the neighborhood structure from insertion neighborhood to swap neighborhood. Generally speaking, the insertion neighborhood is much more effective than the swap neighborhood for permutation flowshop scheduling problems. Instead of considering the use of these neighborhood structures in a framework of the variable neighborhood search algorithm, two powerful local search algorithms are designed in such a way that the search process is guided by an iteration jumping probability determining which neighborhood structure will be employed. By doing so, it is shown that some additional enhancements can be achieved by employing the swap neighborhood structure with a speed-up method without jeopardizing the effectiveness of the insertion neighborhood. We also show that the performance of the iterated greedy algorithm significantly depends on the speed-up method employed. The parameters of the proposed iterated greedy algorithms are tuned through a design of experiments on randomly generated benchmark instances. Extensive computational results on Taillard's well-known benchmark suite show that the iterated greedy algorithms with speed-up methods are equivalent or superior to the best performing algorithms from the literature.
Ultimately, 85 out of 120 problem instances are further improved with substantial margins.
•A new constructive heuristic is proposed.
•Speed-up methods from the literature are successfully adapted.
•The IG algorithm is superior with the speed-up method; without it, performance is poor.
•We propose an iteration jumping probability to employ the swap neighborhood structure.
•Ultimately, 85 out of 90 problem instances are further improved.
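The iterated-greedy skeleton with an iteration jumping probability can be sketched as follows. This is a simplified illustration only: it uses the plain (non-blocking) flowshop makespan, omits the paper's speed-up methods and constructive heuristic, and all parameter names are our own assumptions:

```python
import random

def makespan(perm, p):
    """Plain flowshop makespan (the paper's blocking constraint is omitted
    here for brevity; this only illustrates the IG skeleton)."""
    prev = [0] * len(p[0])
    for job in perm:
        t, cur = 0, []
        for m, pt in enumerate(p[job]):
            t = max(t, prev[m]) + pt
            cur.append(t)
        prev = cur
    return prev[-1]

def insertion_neighbor(perm):
    """Remove a random job and reinsert it at a random position."""
    a = perm[:]
    job = a.pop(random.randrange(len(a)))
    a.insert(random.randrange(len(a) + 1), job)
    return a

def swap_neighbor(perm):
    """Exchange two random jobs."""
    a = perm[:]
    i, j = random.sample(range(len(a)), 2)
    a[i], a[j] = a[j], a[i]
    return a

def iterated_greedy(p, iters=100, d=2, jump_prob=0.1, seed=0):
    """Destruction-construction IG; `jump_prob` is the iteration jumping
    probability that selects the swap neighborhood instead of insertion."""
    random.seed(seed)
    best = cur = list(range(len(p)))
    best_cost = cur_cost = makespan(best, p)
    for _ in range(iters):
        # destruction: remove d random jobs
        partial = cur[:]
        removed = [partial.pop(random.randrange(len(partial))) for _ in range(d)]
        # construction: greedy best-position reinsertion
        for job in removed:
            partial = min((partial[:k] + [job] + partial[k:]
                           for k in range(len(partial) + 1)),
                          key=lambda s: makespan(s, p))
        # local search move chosen by the iteration jumping probability
        move = swap_neighbor if random.random() < jump_prob else insertion_neighbor
        cand = move(partial)
        cand, cost = min(((partial, makespan(partial, p)),
                          (cand, makespan(cand, p))), key=lambda x: x[1])
        if cost <= cur_cost:
            cur, cur_cost = cand, cost
        if cur_cost < best_cost:
            best, best_cost = cur[:], cur_cost
    return best, best_cost
```

The acceptance rule here is a simple non-worsening criterion; published IG variants typically use a simulated-annealing-style acceptance temperature instead.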