This paper addresses a partial inverse combinatorial optimization problem, called the partial inverse min–max spanning tree problem. For a given weighted graph G and a forest F of the graph, the problem is to modify the weights at minimum cost so that a bottleneck (min–max) spanning tree of G contains the forest. In this paper, the modifications are measured by the weighted Manhattan distance. The main contribution is to present two algorithms that solve the problem in polynomial time. This result is notable because the closely related partial inverse minimum spanning tree problem is proved to be NP-hard in the literature. Since both algorithms have the same worst-case complexity, some computational experiments are reported to compare their running times.
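The abstract's two algorithms are not reproduced here, but the central object is easy to illustrate: since every minimum spanning tree also minimizes the maximum edge weight, a min–max (bottleneck) spanning tree and its bottleneck value can be obtained with a plain Kruskal pass. A minimal sketch, with the graph represented as an edge list (an illustrative assumption, not the paper's data structure):

```python
def bottleneck_spanning_tree(n, edges):
    """Return (bottleneck_value, tree_edges) for a connected undirected graph.

    n     -- number of vertices, labelled 0..n-1
    edges -- list of (weight, u, v) tuples

    A minimum spanning tree minimizes the maximum edge weight as well,
    so Kruskal's algorithm yields a min-max spanning tree directly.
    """
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    tree, bottleneck = [], float("-inf")
    for w, u, v in sorted(edges):          # scan edges by increasing weight
        ru, rv = find(u), find(v)
        if ru != rv:                       # edge joins two components
            parent[ru] = rv
            tree.append((u, v, w))
            bottleneck = max(bottleneck, w)
            if len(tree) == n - 1:
                break
    return bottleneck, tree
```

The partial inverse problem then asks how cheaply the weights can be perturbed so that a tree of this kind is forced to contain the prescribed forest F.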
This paper investigates an inverse version of maximum capacity path problems. Its goal is to increase arc capacities under a budget constraint so that the maximum capacity among all paths from an origin to a destination is improved as much as possible. Two distinct cases are considered: fixed costs and linear costs. In the former, an algorithm is designed to solve the problem in polynomial time. In the latter, a two-phase polynomial-time approach is developed. The first phase applies the first algorithm as a subroutine to find a small interval containing the optimal value. The second phase converts the reduced problem into a minimum ratio path problem; a Secant–Newton hybrid algorithm is then proposed to obtain the exact optimal solution. Theoretical results as well as computational experiments are presented to guarantee the correctness and performance of the proposed algorithm.
In this paper, the generalized widest path problem (or generalized maximum capacity path problem), denoted the GWPP, is studied. The classical widest path problem is to find a path from a source s to a sink t with the highest capacity among all possible s-t paths. The GWPP additionally takes into account the presence of loss/gain factors on the arcs: it aims to find an s-t path that accounts for these factors while satisfying the capacity constraints. For solving the GWPP, three strongly polynomial-time algorithms are presented. Two of the algorithms only work in the case of losses; the first is less efficient than the second on a CPU, but it proves to be more efficient on large networks when parallelized on GPUs. The third algorithm handles the more general case of losses/gains on arcs. An example is considered to illustrate how each algorithm works, and experiments on large networks are conducted to compare the efficiency of the proposed algorithms.
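The effect of loss/gain factors can be illustrated independently of the three algorithms (which the abstract does not detail). On a single s-t path with arc capacities and multiplicative factors, the deliverable amount at t is the largest injection at s whose scaled flow respects every capacity. A small sketch under these assumptions:

```python
def path_throughput(arcs):
    """Maximum amount deliverable at the end of a single path.

    arcs -- list of (capacity, gamma) pairs in path order, where gamma
            is the multiplicative loss (< 1) or gain (> 1) factor
            applied when flow traverses the arc.

    If x units enter the path, the flow entering arc i equals
    x * gamma_1 * ... * gamma_{i-1}, and each such quantity must
    respect that arc's capacity. The amount arriving at the end is x
    times the product of all factors, for the largest feasible x.
    """
    prefix = 1.0           # product of gammas before the current arc
    x = float("inf")       # largest feasible injection at the source
    for cap, gamma in arcs:
        x = min(x, cap / prefix)
        prefix *= gamma
    return x * prefix      # amount arriving at the sink
```

With two lossy arcs of factor 0.5, for instance, the second arc's capacity binds on flow already halved by the first, which is exactly the interaction the GWPP must optimize over all s-t paths.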
A natural extension of the maximum flow problem is the generalized maximum flow problem, which takes gain and loss factors on arcs into account. This paper investigates an inverse problem corresponding to it: increase arc capacities at the least possible cost so that a prescribed flow becomes a maximum flow with respect to the modified capacities. The problem is referred to as the inverse generalized maximum flow problem (IGMF). At first, we present a fast method that determines whether the problem is feasible. Then, we develop an algorithm that solves the problem under max-type distances in O(mn log n) time. Furthermore, we prove that the problem is strongly NP-hard under sum-type distances and propose a heuristic algorithm to find a near-optimal solution to these NP-hard problems. Computational experiments show the accuracy and the efficiency of the algorithm.
Background: One of the important subjects in operations management, and the one investigated in this study, is the partitioning (districting) problem. This topic has recently received increasing attention from researchers in the field of healthcare management systems. It is important because planning improvements to the structure of the healthcare system is one of the most important management problems in any society. The goal of solving this problem is to district a society into several areas so that each area can fully cover its own health services.
Methods: This fundamental-applied study was conducted using the genetic algorithm, particle swarm optimization, and differential evolution to improve the current structures with regard to the existing health structure in Iran. Moreover, the health system's strategic model was applied to categorize the population regions into 10 partitions. Given the nature of the investigated problem, the objective function maximizes the amount of equilibrium in each district. The constraints include exclusive assignment and the absence of unusual assignments, where an unusual assignment is defined as a partition that is non-contiguous or contains holes.
Results: According to the obtained results, the particle swarm algorithm was the most efficient, while differential evolution was the least efficient. Nevertheless, the stated constraints were satisfied completely by all algorithms, which indicates that the modified algorithms generate solutions appropriately.
Conclusion: The results obtained from solving this problem can serve as a useful tool for improving the existing healthcare system in Iran.
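The "no unusual assignment" constraint above can be checked mechanically: a district is contiguous when the regions assigned to it form a single connected component of the region-adjacency graph. A minimal sketch of such a check, with the adjacency structure and region labels as illustrative assumptions (the study's own constraint handling is not specified):

```python
from collections import deque

def district_is_contiguous(regions, adjacency):
    """True if the given set of regions forms one connected district.

    regions   -- set of region identifiers assigned to one district
    adjacency -- dict mapping each region to its neighbouring regions

    Runs a breadth-first search restricted to the district's regions;
    the district is contiguous iff the search reaches all of them.
    """
    if not regions:
        return True
    start = next(iter(regions))
    seen, queue = {start}, deque([start])
    while queue:
        r = queue.popleft()
        for nb in adjacency.get(r, ()):
            if nb in regions and nb not in seen:
                seen.add(nb)
                queue.append(nb)
    return seen == regions
```

A metaheuristic such as particle swarm optimization can call a check of this kind to repair or penalize candidate partitions that violate contiguity.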
The maximum capacity path problem is to find a path connecting two given nodes in a network such that the minimum arc capacity on this path is maximized. The inverse maximum capacity path problem (IMCP) is to modify the capacities of the arcs as little as possible so that a given path becomes a maximum capacity path in the modified network. Two cases of IMCP are considered: the capacity of the given path is preserved or not. IMCP is studied and solved under both sum-type (e.g., weighted <inline-formula> <tex-math notation="LaTeX">l_{k} </tex-math></inline-formula> norms and the sum-type Hamming distance) and max-type distances (e.g., the weighted <inline-formula> <tex-math notation="LaTeX">l_{\infty } </tex-math></inline-formula> norm or the bottleneck Hamming distance). The obtained algorithms for IMCP are applied to solve a real road transportation network optimization problem.
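Before any capacity is modified, IMCP needs the maximum capacity value of the unmodified network; the given path is already optimal exactly when its own bottleneck capacity attains that value. A sketch of this check using a standard Dijkstra-style widest-path computation (the graph format is an illustrative assumption, and this is only the preprocessing step, not the paper's algorithms):

```python
import heapq

def widest_path_value(graph, s, t):
    """Maximum over s-t paths of the minimum arc capacity on the path.

    graph -- dict: node -> list of (neighbour, capacity) pairs,
             with positive capacities.

    Dijkstra variant: a max-heap keyed on the bottleneck capacity of
    the best path found so far to each node.
    """
    best = {s: float("inf")}
    heap = [(-float("inf"), s)]            # negated caps => max-heap
    while heap:
        neg_cap, u = heapq.heappop(heap)
        cap = -neg_cap
        if cap < best.get(u, 0):
            continue                       # stale heap entry
        if u == t:
            return cap
        for v, c in graph.get(u, ()):
            new_cap = min(cap, c)
            if new_cap > best.get(v, 0):
                best[v] = new_cap
                heapq.heappush(heap, (-new_cap, v))
    return 0

def path_is_maximum_capacity(graph, path):
    """Check whether `path` (a list of nodes) already attains the optimum."""
    caps = {}
    for u in graph:
        for v, c in graph[u]:
            caps[(u, v)] = c
    own = min(caps[(path[i], path[i + 1])] for i in range(len(path) - 1))
    return own >= widest_path_value(graph, path[0], path[-1])
```

When the check fails, the inverse problem must either raise the given path's bottleneck or lower competing paths' capacities, at minimum distance.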
Given a network G(V,A,c) and a collection of origin-destination pairs with prescribed values, the reverse shortest path problem is to modify the arc length vector c as little as possible under some bound constraints such that the shortest distance between each origin-destination pair is upper bounded by the corresponding prescribed value. It is known that the reverse shortest path problem is NP-hard even on trees when the arc length modifications are measured by the weighted sum-type Hamming distance. In this paper, we consider two special cases of this problem which are polynomially solvable. The first is the case with uniform lengths. It is shown that this case transforms to a minimum cost flow problem on an auxiliary network. An efficient algorithm is also proposed for solving this case under the unit sum-type Hamming distance. The second case considered is the problem without bound constraints. It is shown that this case reduces to a minimum cut problem on a tree-like network. Therefore, both cases studied can be solved in strongly polynomial time.
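Whether any modification is needed at all can be decided by a plain shortest-path computation: a pair is already satisfied when its current shortest distance does not exceed the prescribed value. A minimal sketch of this feasibility screen (the graph format and the Dijkstra subroutine are standard building blocks, not the paper's reductions):

```python
import heapq

def shortest_distance(graph, s, t):
    """Dijkstra's algorithm; graph is a dict node -> [(neighbour, length)]."""
    dist = {s: 0}
    heap = [(0, s)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                       # stale heap entry
        if u == t:
            return d
        for v, w in graph.get(u, ()):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return float("inf")

def pairs_needing_modification(graph, demands):
    """Origin-destination pairs whose prescribed upper bound is violated.

    demands -- list of (origin, destination, bound) triples.
    """
    return [(s, t) for s, t, bound in demands
            if shortest_distance(graph, s, t) > bound]
```

Only the pairs returned by this screen force a change to the length vector c; for them the paper's flow- and cut-based reductions determine the cheapest modification.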
Given a linear programming problem with objective coefficient vector c and a feasible solution x0 to this problem, the corresponding inverse linear programming problem is to modify the vector c as little as possible so that x0 becomes an optimal solution to the linear programming problem. The modifications can be measured by different distances. In this article, we consider the inverse linear programming problem under the bottleneck-type weighted Hamming distance. We propose an algorithm based on the binary search technique to solve the problem; at each iteration, the algorithm solves a linear programming problem. We also consider the inverse minimum cost flow problem as a special case of inverse linear programming problems and specialize the proposed method to solve it in strongly polynomial time. The specialized algorithm solves a shortest path problem at each iteration, and it is shown that its complexity improves on the previously known one.
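The binary-search scheme can be sketched generically: under a bottleneck-type objective, the optimal value is one of finitely many candidate costs, and feasibility is monotone in the allowed bottleneck. Here the per-iteration LP (or shortest-path) computation is abstracted into a callable oracle, which is an illustrative assumption rather than the paper's subroutine:

```python
def bottleneck_binary_search(candidates, feasible):
    """Smallest candidate value v with feasible(v) True, or None.

    candidates -- iterable of candidate bottleneck costs
    feasible   -- monotone oracle: if feasible(v) holds, then
                  feasible(v') holds for every v' >= v. In the inverse
                  LP setting this oracle would solve one linear
                  program (or shortest path problem) per call.
    """
    vals = sorted(set(candidates))
    lo, hi, answer = 0, len(vals) - 1, None
    while lo <= hi:
        mid = (lo + hi) // 2
        if feasible(vals[mid]):
            answer = vals[mid]   # feasible: try to shrink the bottleneck
            hi = mid - 1
        else:
            lo = mid + 1         # infeasible: allow a larger bottleneck
    return answer
```

Monotonicity is what makes the search sound: allowing a larger bottleneck budget can only enlarge the set of admissible modifications, so feasibility flips from False to True exactly once along the sorted candidates.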