The concept of hotlink assignment aims at reducing the navigation effort for users of a web directory or similar structure by inserting a limited number of additional hyperlinks called hotlinks. Given an access probability distribution over the leaves of the tree representing the web site, the goal of hotlink assignment algorithms is to minimize the expected path length between the root and the leaves.
We prove that this optimization problem is NP-hard, even if only one outgoing hotlink is allowed for each node. This answers a question that has been open since the first formulation of the problem by Bose et al. (Proceedings of the 11th International Symposium on Algorithms and Computation (ISAAC), 2000) nine years ago.
In this work we also investigate the model where hotlinks are only allowed to point at the leaves of the tree. We demonstrate that for this model optimal solutions can be computed in O(n²) time. Our algorithm L-OPT also operates in a more general setting, where the maximum number of outgoing hotlinks is specified individually for each node. The runtime is then O(n³). Experimental evaluation shows that L-OPT terminates in less than one second on problem instances with up to half a million nodes.
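The objective being optimized can be illustrated with a small sketch. The following Python function is our own illustration (the data layout and function name are not taken from L-OPT); it computes the expected root-to-leaf distance when every tree edge and every hotlink counts as one navigation step:

```python
from collections import deque

def expected_leaf_distance(children, probs, hotlinks=None, root=0):
    """Expected number of clicks from the root to a leaf.

    children: dict mapping a node to its list of tree children.
    probs:    dict mapping each leaf to its access probability.
    hotlinks: optional dict mapping a node to the single target of
              its outgoing hotlink (at most one per node).
    A leaf's distance is its BFS distance from the root in the graph
    formed by tree edges plus hotlinks.
    """
    hotlinks = hotlinks or {}
    adj = {}
    for u, cs in children.items():
        adj.setdefault(u, []).extend(cs)
    for u, v in hotlinks.items():
        adj.setdefault(u, []).append(v)

    dist = {root: 0}
    queue = deque([root])
    while queue:
        u = queue.popleft()
        for v in adj.get(u, []):
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)

    return sum(p * dist[leaf] for leaf, p in probs.items())
```

On a path 0 → 1 → 2 → 3 whose only leaf 3 is accessed with probability 1, a single hotlink from the root to that leaf reduces the expected distance from 3 to 1.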
We present the first experimental study of online packet buffering algorithms for network switches. We consider a basic scenario in which m queues of size B have to be maintained so as to maximize the packet throughput. For this model various online algorithms with competitive factors ranging between 1.5 and 2 have been developed in the literature. We first develop a new 2-competitive online algorithm, called HSFOD, which is especially designed to perform well under real-world conditions. In our experimental study we implemented all the proposed algorithms, including HSFOD, and tested them on packet traces from benchmark libraries. We evaluated the experimentally observed competitiveness, the running times, the memory requirements and the actual packet throughput of the strategies. The tests were executed for varying values of m and B as well as varying switch speeds. It turns out that greedy-like strategies and HSFOD perform best in practice.
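As a minimal sketch of this setting, the following Python simulation implements one greedy-like strategy ("always transmit from a longest nonempty queue"), assuming unit-size packets and one transmission per time step; HSFOD itself is not reproduced here, and the interface is our own illustration:

```python
def greedy_throughput(arrivals, m, B):
    """Simulate the greedy strategy on a switch with m FIFO queues
    of capacity B.

    arrivals: list over time steps; arrivals[t] is a list of queue
              indices, each receiving one unit-size packet at step t.
    Packets arriving at a full queue are lost.  One packet can be
    transmitted per step, taken from a longest nonempty queue.
    Returns the number of transmitted packets (the throughput).
    """
    queues = [0] * m
    sent = 0
    for step in arrivals:
        for q in step:
            if queues[q] < B:
                queues[q] += 1          # accept packet
            # else: queue full, packet is lost
        longest = max(range(m), key=lambda i: queues[i])
        if queues[longest] > 0:
            queues[longest] -= 1        # transmit one packet
            sent += 1
    return sent
```

For example, with m = 2 queues of capacity B = 1 and two simultaneous arrivals at queue 0, one packet is lost to buffer overflow and only two packets are transmitted overall.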
Due to longer lifespans in industrialized societies, cardiovascular diseases are becoming increasingly important for medical research. It has already been shown that the cell membrane-bound, non-selective TRPC6 ion channel is important in the pathogenesis of heart diseases. Among other things, it is permeable to calcium ions, which play a critical role in cardiac contraction and relaxation. The TRPC6 ion channel is a promising therapeutic target in the treatment of cardiovascular diseases. A deeper understanding of the physiological and pathophysiological role as well as the localization of TRPC6 in human cardiac tissue is the basis for new drug development. Although the TRPC6 channel has been detected in animal studies and at the mRNA level in humans, and sparse TRPC6 protein has been detected in humans, there are no systematic studies of TRPC6 protein detection in the human heart. For the first time, TRPC6 ion channel protein was detected immunohistochemically in human heart tissue from body donors in different structures, localizations, and histological layers, particularly in cardiomyocytes and intramuscular arterioles, just as TRPC6 expression has already been shown in animal models of the heart by other research groups. In the sense of the translational concept, this indicates a possible transferability of research results from animal models to humans.
This article describes an analytical framework for reasoning about the issue of tie breaking in algorithm theory. The core of this framework is the concept of robust algorithms. Such algorithms have the convenient property that an arbitrary set of degenerate cases can be ignored without loss of generality in proofs of many important properties, e.g., runtime, space requirements, feasibility, and competitive and approximation ratios. Here degeneracy is defined as a set of problem instances satisfying a certain property, where this property is independent of both the algorithm under consideration and the specific combinatorial problem structure.
It turns out that robustness is related to the tie breaking policy of algorithms in two different ways. On the one hand, the set of inputs where a tie actually occurs is typically (but not always) a degenerate case. On the other hand, we show that for any algorithm there is a way to break ties such that it becomes robust. In particular, robustness is guaranteed by any tie breaking strategy that can be interpreted as symbolic perturbation. For many typical algorithms the implicit use of symbolic perturbation is easily verified, so the consideration of degenerate cases can be avoided in their analysis. The concept of robustness also gives a theoretical explanation of why tie breaking often does not matter at all.
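A minimal sketch of tie breaking as symbolic perturbation (our own illustrative example, not taken from the article): breaking ties in an argmin by index behaves exactly as if value i had been perturbed to value_i + i·ε for an infinitesimal ε > 0, which lexicographic comparison on (value, index) realizes without any actual perturbation:

```python
def argmin_perturbed(values):
    """Index of the minimum value, with ties broken as if value i
    were symbolically perturbed to values[i] + i * eps for an
    infinitesimal eps > 0.  Lexicographic comparison on the pair
    (value, index) implements this perturbation exactly, so the
    result is deterministic on degenerate (tied) inputs."""
    return min(range(len(values)), key=lambda i: (values[i], i))
```

Because the perturbation is infinitesimal, non-tied inputs are decided by the values alone; only the degenerate cases are affected, which is precisely why such a policy yields robustness.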
Integrated operation of distribution grids for multiple energy carriers promises hitherto unused synergies in terms of efficient generation, storage, and consumption. A major obstacle to the investment in such systems is their increased complexity, as conventional tools and methods were not designed to capture all relevant technical and economic aspects of hybrid grids. To address this obstacle, this work proposes a methodology to systematically assess multi-carrier energy grids under a holistic scope. By adopting a simulation-based approach that relies on detailed technical and economic models, an efficient and precise evaluation of both short-term (operational) and long-term (strategic) aspects is supported. The methodology enables the assessment of system configurations, control strategies, business models, and regulatory conditions in one coherent approach. As a proof-of-concept, the new methodology is applied to a real-world use case of a hybrid thermal-electrical distribution grid in a central European city. The results are comprehensively discussed to showcase how the various aspects of hybrid energy systems are addressed. The outcomes also demonstrate how this methodology aids the involved stakeholders in understanding the associated risks and potentials, paving the way for early adopters to realize multi-carrier energy distribution grids.
• A holistic methodology to study multi-carrier energy systems is presented.
• Operational technical and strategic economic perspectives are considered.
• Established tools for different domains are coupled in a co-simulation framework.
• Multiple market participants and opposing objectives are taken into account.
• The methodology is demonstrated by a comprehensive study in a central European city.
We address generalized versions of the Huffman and Alphabetic Tree Problem where the cost caused by each individual leaf i, instead of being linear, depends on its depth in the tree via an arbitrary function. The objective is to minimize either the total cost or the maximum cost among all leaves. We review and extend the known results in this direction and devise a number of new algorithms and hardness proofs.
It turns out that the Dynamic Programming approach for the Alphabetic Tree Problem can be extended to arbitrary cost functions, resulting in an optimal algorithm using O(n⁴) time and O(n³) space. We identify classes of cost functions where the well-known trick of reducing the runtime by a factor of n via a “monotonicity” property can be applied.
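The interval-plus-depth dynamic program behind these bounds can be sketched as follows (a naive memoized version for the total-cost objective; the function name and interface are our own illustration, not the paper's implementation). With states (i, j, d) and a linear split, this gives the O(n⁴) time and O(n³) space stated above:

```python
from functools import lru_cache

def optimal_alphabetic_tree_cost(costs):
    """Minimum total cost of a binary alphabetic tree whose leaves
    appear in the given left-to-right order.

    costs[i] is an arbitrary function: costs[i](d) is the cost of
    leaf i if it ends up at depth d.  dp(i, j, d) is the optimal
    cost of a subtree over leaves i..j whose root sits at depth d.
    """
    n = len(costs)

    @lru_cache(maxsize=None)
    def dp(i, j, d):
        if i == j:                       # single leaf placed at depth d
            return costs[i](d)
        # try every split into a left subtree i..k and right k+1..j
        return min(dp(i, k, d + 1) + dp(k + 1, j, d + 1)
                   for k in range(i, j))

    return dp(0, n - 1, 0)
```

With the classical linear costs f_i(d) = w_i · d this reduces to the ordinary alphabetic tree problem; e.g., for weights (1, 2, 3) the optimum places the heaviest leaf highest, giving total cost 9.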
For the generalized Huffman Tree Problem we show that even the k-ary version can be solved by a generalized version of the Coin Collector Algorithm of Larmore and Hirschberg (Proc. SODA’90, pp. 310–318, 1990) when the cost functions are nondecreasing and convex. Furthermore, we give an O(n² log n) algorithm for the worst-case minimization variants of both the Huffman and the Alphabetic Tree Problem with nondecreasing cost functions.
Investigating the limits of computational tractability, we show that the Huffman Tree Problem in its full generality is inapproximable unless P = NP, no matter if the objective function is the sum of leaf costs or their maximum. The alphabetic version becomes NP-hard when the leaf costs are interdependent.
Transarterial embolization of branches of the hepatic artery with biocompatible 90Y-labeled microspheres (SIR-Spheres) is a local treatment modality for patients with liver tumors which has recently become available in Europe. The aim of this study was to evaluate the feasibility and efficacy of this selective internal radiation therapy (SIRT).
Twenty-three patients with nonresectable hepatic metastases or hepatocellular carcinoma nonresponding to polychemotherapy and/or other local treatment were treated with SIRT. SIR-Spheres (mean activity, 2270 MBq) were administered by gentle intra-arterial infusion in the hepatic artery. A follow-up was documented by fluorodeoxyglucose-positron emission tomography (FDG-PET), course of tumor markers, and computed tomography (CT).
Common minor side-effects were abdominal pain, nausea, and fever. Mild pancreatitis and peptic ulceration were each observed once. At the time of writing, all patients are still alive, with survival times ranging from 11 to 518 days since SIRT. Three-month follow-up investigations are available for 13 of 23 patients; in 10 of these 13 they show a marked decrease of FDG uptake, a drop of tumor markers, and unchanged or slightly decreased lesion size (CT). Two patients showed stable findings, while another patient showed progressive disease. Long-term follow-up investigations are available for 2 of 23 patients, showing hepatic and extrahepatic progression 6 and 9 months after SIRT, respectively.
Our initial experience confirms that SIRT is a promising local therapeutic approach for patients with nonresectable liver tumors that is feasible and has an acceptable toxicity profile. Prospective data comparing this treatment alone and in combination with other modalities are needed to determine whether long-term survival in this unfavorable stage of disease can be markedly improved.
The TRPC6 channel is permeable to calcium ions as well as other ions and plays an important role in the physiology and pathophysiology of vessels. Findings from animal and cell culture experiments have shown its involvement in important vascular processes such as the Bayliss effect or endothelium-mediated vasodilatation. Furthermore, the relevance of TRPC6 channels in humans has become apparent from diseases such as idiopathic pulmonary arterial hypertension, focal segmental glomerulosclerosis and atherosclerosis, amongst others. However, histological evidence systematically detecting TRPC6 channels in human vessels has not been provided to date. In this study, 40 vessel sections from nine body donors were obtained, processed and stained with a knockout-validated antibody against the TRPC6 protein using immunohistochemistry and western blotting. More than half of the samples yielded evidence of TRPC6 channel expression in the intima and adventitia. TRPC6 channels were detected in the tunica media in only one of 40 cases. TRPC6 detection in the human intima is consistent with several demonstrated physiological roles of TRPC6 channels in the vasculature, and the channel may also be involved in associated human diseases. The near absence of TRPC6 channels in the tunica media contrasts with a view based primarily on animal studies, from which its presence had been assumed.
We study the problem of minimizing the weighted average number of queries needed to identify an initially unknown object in a poset. We show that for general posets there cannot be any o(log n)-approximation algorithm unless NP ⊆ TIME(n^O(log log n)). When the Hasse diagram of the partially ordered set has the structure of a tree, the problem is equivalent to the following tree search problem: in a given rooted tree T = (V, E) a node has been marked and we want to identify it. In order to locate the marked node, we can perform node queries. A node query at u asks whether the marked node lies in the subtree rooted at u. A function w : V → Z⁺ is given which defines the likelihood of each node being the marked one, and we seek the strategy that minimizes the expected number of queries. Prior to this paper the complexity of this problem had remained open.
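The query model can be made concrete with a small sketch (our own illustration; the names and the strategy are not from the paper). The naive top-down strategy below locates the marked node by querying children in order; it illustrates the query mechanism only and is not the optimal expected-cost strategy the paper studies:

```python
def make_oracle(tree, marked):
    """Build the query function for a known marked node (used to
    exercise a strategy): returns u -> 'is the marked node in the
    subtree rooted at u?'."""
    def descendants(u):
        out = {u}
        for c in tree.get(u, []):
            out |= descendants(c)
        return out
    return lambda u: marked in descendants(u)

def locate(tree, root, in_subtree):
    """Find the marked node using only node queries.

    tree:       dict mapping a node to its list of children.
    in_subtree: the query function (one call = one query).
    Returns (marked_node, number_of_queries).  Strategy: at each
    node, query its children left to right and descend into the
    first subtree that contains the marked node; if no child's
    subtree contains it, the current node is the marked one.
    """
    node, queries = root, 0
    while True:
        next_node = None
        for c in tree.get(node, []):
            queries += 1
            if in_subtree(c):
                next_node = c
                break
        if next_node is None:
            return node, queries
        node = next_node
```

The expected number of queries of a strategy is then the w-weighted average of the query counts over all possible marked nodes, which is exactly the quantity being minimized.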
We prove that the above tree search problem is NP-complete even for the class of trees with diameter at most 4. This yields a complete characterization of the complexity of the problem with respect to the diameter: for diameter at most 3 we show that the problem is polynomially solvable using a dynamic programming approach.
In addition we prove that the problem is NP-complete even for the class of trees of maximum degree at most 16. To the best of our knowledge, the only known result in this direction is that the tree search problem is solvable in O(|V| log |V|) time for trees with degree at most 2 (paths).
Our results sharply contrast with those for the variant of the problem where one is interested in minimizing the maximum number of queries. In fact, for this worst-case scenario, linear-time algorithms for finding an optimal search strategy are known (K. Onak, P. Parys, Generalization of binary search: searching in trees and forest-like partial orders, in: FOCS’06: Proceedings of the 47th Annual IEEE Symposium on Foundations of Computer Science, IEEE Computer Society, Washington, DC, USA, 2006, pp. 379–388; S. Mozes, K. Onak, O. Weimann, Finding an optimal tree searching strategy in linear time, in: SODA’08: Proceedings of the Nineteenth Annual ACM–SIAM Symposium on Discrete Algorithms, Society for Industrial and Applied Mathematics, Philadelphia, PA, USA, 2008, pp. 1096–1105).