Let G be a graph with a vertex colouring α, and let a and b be two colours. A connected component of the subgraph induced by the vertices coloured a or b is known as a Kempe chain. A colouring of G obtained from α by swapping the two colours on the vertices of a Kempe chain is said to have been obtained by a Kempe change. Two colourings of G are Kempe equivalent if one can be obtained from the other by a sequence of Kempe changes.
A conjecture of Mohar (2007) asserts that, for k≥3, all k-colourings of a k-regular graph that is not complete are Kempe equivalent. It was later shown that all 3-colourings of a cubic graph that is neither K4 nor the triangular prism are Kempe equivalent. In this paper, we prove that the conjecture holds for every k≥4. We also discuss the implications of this result for the validity of the Wang–Swendsen–Kotecký algorithm for the zero-temperature antiferromagnetic Potts model.
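The definitions above lend themselves to a direct implementation. The following is a minimal Python sketch of a single Kempe change, found by breadth-first search over the subgraph induced by the two colours; all names here are ours for illustration, not from the paper.

```python
from collections import deque

def kempe_change(adj, colouring, v, a, b):
    """Return a new colouring with colours a and b swapped on the
    Kempe (a,b)-chain containing vertex v.

    adj: dict mapping each vertex to an iterable of its neighbours.
    colouring: dict mapping each vertex to its colour (left unmodified).
    """
    if colouring[v] not in (a, b):
        return dict(colouring)  # v is not in the (a,b)-subgraph; nothing to do
    new = dict(colouring)
    seen = {v}
    queue = deque([v])
    while queue:
        u = queue.popleft()
        new[u] = b if colouring[u] == a else a  # swap the two colours
        for w in adj[u]:
            # Extend the chain only through vertices coloured a or b.
            if w not in seen and colouring[w] in (a, b):
                seen.add(w)
                queue.append(w)
    return new
```

Note that when the chain does not span the whole (a,b)-subgraph, only its own component is swapped, which is exactly why a Kempe change preserves properness.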
In the Token Jumping problem we are given a graph G = (V, E) and two independent sets S and T of G, each of size k ≥ 1. The goal is to determine whether there exists a sequence ⟨S₀, S₁, …, S_ℓ⟩ of independent sets in G such that, for every i, |S_i| = k, S_i is an independent set, S = S₀, S_ℓ = T, and |S_i Δ S_{i+1}| = 2. In other words, if we view each independent set as a collection of tokens placed on a subset of the vertices of G, then the problem asks for a sequence of independent sets that transforms S into T by individual token jumps which maintain the independence of the sets. This problem is known to be PSPACE-complete on very restricted graph classes, e.g., planar graphs of bounded degree and graphs of bounded bandwidth. A closely related problem is the Token Sliding problem, where instead of allowing a token to jump to any vertex of the graph we require that a token slides along an edge of the graph. Token Sliding is also known to be PSPACE-complete on the aforementioned graph classes. We investigate the parameterized complexity of both problems on several graph classes, focusing on the effect of excluding certain cycles from the input graph. In particular, we show that both Token Sliding and Token Jumping are fixed-parameter tractable on C₄-free bipartite graphs when parameterized by k. For Token Jumping, we in fact show that the problem admits a polynomial kernel on {C₃, C₄}-free graphs. In the case of Token Sliding, we also show that the problem admits a polynomial kernel on bipartite graphs of bounded degree. We believe both of these results to be of independent interest. We complement these positive results by showing that, for any constant p ≥ 4, both problems are W[1]-hard on {C₄, …, C_p}-free graphs and that Token Sliding remains W[1]-hard even on bipartite graphs.
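The search implicit in the problem definition can be sketched directly. The following illustrative Python function (our own naming, not the paper's algorithm) decides Token Jumping by breadth-first search over k-sized independent sets; it is exponential in general and only feasible on tiny instances.

```python
from collections import deque
from itertools import combinations

def token_jumping(n, edges, S, T):
    """Decide whether independent set S can be transformed into T by
    single token jumps, every intermediate set remaining independent.
    Brute-force BFS over k-subsets of the n vertices."""
    E = {frozenset(e) for e in edges}

    def independent(s):
        return all(frozenset((u, v)) not in E for u, v in combinations(s, 2))

    S, T = frozenset(S), frozenset(T)
    assert independent(S) and independent(T)
    seen = {S}
    queue = deque([S])
    while queue:
        cur = queue.popleft()
        if cur == T:
            return True
        for u in cur:              # token to move
            for v in range(n):     # destination vertex (a jump, not a slide)
                if v in cur:
                    continue
                nxt = (cur - {u}) | {v}
                if nxt not in seen and independent(nxt):
                    seen.add(nxt)
                    queue.append(nxt)
    return False
```

On the 4-cycle with tokens on one diagonal pair, no token can move at all, which gives a small negative instance; on the 4-vertex path the analogous instance is positive.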
Human activities in the oceans are increasing and can result in additional mortality for many marine Protected, Endangered or Threatened Species (PETS). It is necessary to implement ambitious measures that aim to restore biodiversity at all nodes of marine food webs and to manage removals resulting from anthropogenic activities. We developed a stochastic surplus production model (SPM) linking abundance and removal processes under the assumption that variations in removals reflect variations in abundance. We then considered several 'harvest' control rules, including two candidate rules derived from this SPM (which we call 'Anthropogenic Removals Threshold', or ART), to manage removals of PETS. The two candidate rules hinge on the estimation of a stationary removal rate. We compared these candidate rules to existing control rules (potential biological removal and a fixed-percentage rule) in three scenarios: (i) a base scenario in which unbiased but noisy data are available, (ii) a scenario in which abundance is overestimated, and (iii) a scenario in which abundance is underestimated. The different rules were tested, within a management strategy evaluation framework, on a simulated dataset with life-history parameters close to those of a small-sized cetacean species of conservation interest in the North-East Atlantic, the harbour porpoise. The effectiveness of the rules was assessed using performance metrics such as the time needed to reach the conservation objectives, the removal limits obtained with the rules, and the temporal autocorrelation in removal limits. Most control rules were robust to biases in the data and reached the conservation objectives with removal limits of similar magnitude when averaged over time. However, one of the candidate rules (ART) displayed greater alignment with policy requirements for PETS, such as minimizing removals over time.
Summary
The article by Tian et al. (Appl. Stoch. Models Bus. Ind., 2023) takes an interesting look at the use of non-informative priors adapted to several censoring processes, which are common in reliability. It proposes a continuum of modelling approaches that go as far as defining weakly informative priors to overcome the well-known shortcomings of frequentist approaches to problems involving highly censored samples. In this commentary, I make some critical remarks and propose to link this work to a more generic vision of what a relevant Bayesian elicitation in reliability could be, taking advantage of recent theoretical and applied advances. Through tools such as approximate posterior priors and prior equivalent sample sizes, illustrated with simple reliability models, I suggest methodological avenues for formalizing the elicitation of informative priors in an auditable, defensible way. By allowing a clear modulation of subjective information, this may respond to the authors' primary concern of constructing weakly informative priors, and to a more general concern for precaution in Bayesian reliability.
Recoloring graphs via tree decompositions Bonamy, Marthe; Bousquet, Nicolas
European Journal of Combinatorics, vol. 69, March 2018. Journal article; peer-reviewed; open access.
Let k be an integer. Two vertex k-colorings of a graph are adjacent if they differ on exactly one vertex. A graph is k-mixing if any proper k-coloring can be transformed into any other through a sequence of adjacent proper k-colorings.
Jerrum proved that any graph is k-mixing if k is at least the maximum degree plus two. We first improve Jerrum’s bound using the Grundy number, which is the greatest number of colors in a greedy coloring. As a corollary, we obtain that any cograph G is (χ(G)+1)-mixing, and that for fixed χ(G) the shortest sequence between any two (χ(G)+1)-colorings is at most linear in the number of vertices. We additionally argue that while cographs are exactly the P4-free graphs, the result cannot be extended to P5-free graphs.
Any graph is (tw+2)-mixing, where tw denotes the treewidth of the graph (Cereceda 2006). We prove that the shortest sequence between any two (tw+2)-colorings is at most quadratic (which is optimal up to a constant factor), a problem left open in Bonamy et al. (2012).
All the proofs are constructive and lead to polynomial-time recoloring algorithms: given two colorings, we can exhibit in polynomial time a sequence transforming one into the other.
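For very small graphs, k-mixing can be checked directly by exploring the reconfiguration graph of proper k-colorings. The following Python sketch is ours, purely for illustration; it is a brute-force check, exponential in the number of vertices, and not the recoloring algorithm of the paper.

```python
from collections import deque
from itertools import product

def is_k_mixing(n, edges, k):
    """Check whether every proper k-coloring of the graph on vertices
    0..n-1 can be transformed into every other by recoloring one vertex
    at a time, each intermediate coloring staying proper."""
    def proper(c):
        return all(c[u] != c[v] for u, v in edges)

    colourings = [c for c in product(range(k), repeat=n) if proper(c)]
    if not colourings:
        return True  # vacuously mixing
    # BFS from one proper coloring; mixing iff everything is reached.
    start = colourings[0]
    seen = {start}
    queue = deque([start])
    while queue:
        c = queue.popleft()
        for v in range(n):
            for col in range(k):
                if col == c[v]:
                    continue
                nc = c[:v] + (col,) + c[v + 1:]  # recolor one vertex
                if proper(nc) and nc not in seen:
                    seen.add(nc)
                    queue.append(nc)
    return len(seen) == len(colourings)
```

Standard small examples behave as expected: a single edge is not 2-mixing (its two proper 2-colorings are both frozen), K3 is not 3-mixing, and K3 is 4-mixing, consistent with Jerrum's Δ+2 bound.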
Let k and d be positive integers such that k≥d+2. Consider two k-colourings of a d-degenerate graph G. Can we transform one into the other by recolouring one vertex at each step while maintaining a proper colouring at every step? Cereceda et al. answered this question in the affirmative and exhibited a recolouring sequence of exponential length. However, Cereceda conjectured that there should exist one of quadratic length.
The k-reconfiguration graph of G is the graph whose vertices are the proper k-colourings of G, with an edge between two colourings if they differ on exactly one vertex. Cereceda's conjecture can be reformulated as follows: the diameter of the (d+2)-reconfiguration graph of any d-degenerate graph on n vertices is O(n²). So far, there is no proof of a polynomial upper bound on the diameter, even for d=2.
In this paper, we prove that the diameter of the k-reconfiguration graph of a d-degenerate graph is O(n^(d+1)) for k≥d+2. Moreover, we prove that if k≥(3/2)(d+1) then the diameter of the k-reconfiguration graph is quadratic, improving the previous bound of k≥2d+1. We also show that the 5-reconfiguration graph of planar bipartite graphs has quadratic diameter, confirming Cereceda's conjecture for this class of graphs.
This article describes the preparation of a clinical study of a connected medical device designed to help prevent lymphedema in patients treated for breast cancer after axillary lymph node dissection (ALND), and to alert them to any manifestation or resurgence so as to prevent its aggravation. A simulator of the entire physical process, called a digital twin, was developed for this purpose in a hardware-in-the-loop framework. Statistical calibration of its input parameters, by stochastic inversion and using sensitivity studies, led to the establishment of one or more measurement protocols allowing the signal to be captured on a mobile device (phone or tablet) and physically significant signal breaks to be detected. The measured signal makes it possible to report quickly on a worsening of the patient's condition and to warn the therapists within a very reasonable period of time. The general methodology of this work seems to us easily reproducible in the preparation of other kinds of clinical studies for connected devices that aim to develop measurement protocols limiting the often significant cost of such studies. One immediate prospect of this work is the initiation of a clinical study on different patients who have undergone surgery for breast cancer, after the robustness of the prototype's design has been improved to take into account the uncertainties affecting the measurements.
Uncertain information on the input parameters of computer models is usually handled by treating these parameters as random, described by marginal distributions and a dependence structure. In numerous real-world applications, while information is mainly provided by marginal distributions, typically from samples, little is really known about the dependence structure itself. Faced with this incomplete or missing information, risk studies that make use of these computer models are often conducted by assuming independence of the input variables, at the risk of including irrelevant situations. This approach is especially common when reliability functions are treated as black-box models. Such analyses remain weakened in the absence of in-depth model exploration, at the possible price of a strong misestimation of risk. Considering the frequent case where the reliability output is a quantile, this article provides a methodology to improve risk assessment by exploring a set of pessimistic dependencies using a copula-based strategy. In dimensions greater than two, a greedy algorithm is provided to build input regular vine copulas reaching a minimum quantile, to which a reliability admissible limit value can be compared, by selecting pairwise components with a sensitive influence on the result. The strategy is tested on toy models and a real industrial case study. The results highlight that current approaches can yield non-conservative results.
Age estimates, typically determined by counting periodic growth increments in the calcified structures of vertebrates, are the basis of the population dynamics models used to manage exploited or threatened species. In fisheries research, the use of otolith growth rings as an indicator of fish age has increased considerably in recent decades. However, otolith readings include various sources of uncertainty. Current ageing methods, which convert an average count of rings into an age, only provide periodic age estimates in which the range of uncertainty is fully ignored. In this study, we describe a hierarchical model for estimating individual ages from repeated otolith readings. The model was developed within a Bayesian framework to explicitly represent the sources of uncertainty associated with age estimation, to allow for individual variation, and to incorporate expert knowledge on the parameters. The performance of the proposed model was examined through simulations, and the model was then coupled to a two-stanza somatic growth model to evaluate the impact of the age estimation method on the age composition of commercial fisheries catches. We illustrate our approach using sagittal otoliths of Indian Ocean yellowfin tuna collected through large-scale mark-recapture experiments. The simulations suggested that the ageing-error model was able to estimate the ageing biases and provide accurate age estimates regardless of the age of the fish. Coupled with the growth model, the approach appeared suitable for modeling the growth of Indian Ocean yellowfin and is consistent with the findings of previous studies. The simulations also showed that the choice of ageing method can strongly affect growth estimates, with subsequent implications for the age-structured data used as inputs to population models. Finally, our modeling approach proved particularly useful for propagating the uncertainty around age estimates into the growth estimation process, and it can be applied to any study relying on age estimation.
The northern Gulf of St. Lawrence (NGSL) stock of Atlantic cod (Gadus morhua), historically the second largest cod population in the Western Atlantic, underwent a severe collapse in the early 1990s and is currently considered endangered by the Committee on the Status of Endangered Wildlife in Canada. As for many fish populations around the world that are heavily exploited or overfished, urgent management actions in the form of recovery plans are needed to restore this stock to sustainable levels. Stochastic projections based on a statistical population model incorporating predation were conducted over a period of 30 years (2010-2040) to assess the expected outcomes of alternative fishing strategies for stock recovery under different scenarios of harp seal (Pagophilus groenlandicus) abundance and environmental conditions. This sensitivity study shows that water temperature is key to the rebuilding of the NGSL cod stock. Model projections suggest that maintaining the current management practice under cooler water temperatures is likely to keep the species in an endangered status. Under current or warmer conditions in the Gulf of St. Lawrence, partial recovery might only be achieved through significant reductions in both fishing and predation pressure. In the medium term, a management strategy that reduces catch could be favoured over a complete moratorium so as to minimize socio-economic impacts on the industry.