This paper develops some basic principles to study autocatalytic networks and exploit their structural properties in order to characterize their inherent fundamental limits and tradeoffs. In a dynamical system with autocatalytic structure, the system's output is necessary to catalyze its own production. Our study has been motivated by a simplified model of a glycolysis pathway. First, the properties of this class of pathways are investigated through a network model consisting of a chain of enzymatically catalyzed intermediate reactions coupled with an autocatalytic component. We explicitly derive a hard limit on the minimum achievable L2-gain disturbance attenuation and a hard limit on the network's minimum required output energy. Then, we show how these hard limits lead to fundamental tradeoffs between the transient and steady-state behavior of the network and its net production.
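As a point of reference for the terms used above (this fixes notation only, and is not the paper's specific bound), the L2-gain from a disturbance d to an output y is

\[
\gamma \;=\; \sup_{d \neq 0} \frac{\|y\|_2}{\|d\|_2},
\qquad
\|y\|_2^2 = \int_0^\infty \lvert y(t) \rvert^2 \, dt ,
\]

and a "hard limit" on disturbance attenuation is a strictly positive lower bound \(\gamma \ge \gamma^{*} > 0\) that holds for every admissible feedback, so no control design can attenuate disturbances below \(\gamma^{*}\).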
Motivated partly by a control-theoretic viewpoint, we propose a game-theoretic model of contention control, called the random access game. We characterize Nash equilibria of random access games, study their dynamics, and propose distributed algorithms (strategy evolutions) to achieve Nash equilibria. This provides a general analytical framework capable of modeling a large class of system-wide quality-of-service (QoS) models via the specification of per-node utility functions, in which system-wide fairness or service differentiation can be achieved in a distributed manner as long as each node executes a contention resolution algorithm designed to achieve the Nash equilibrium. We thus propose a novel medium access method, derived from carrier sense multiple access/collision avoidance (CSMA/CA), in which a distributed strategy update mechanism drives the network to the Nash equilibrium of the random access game. The resulting method adapts to a continuous contention measure called conditional collision probability, stabilizes the network into a steady state that achieves optimal throughput with targeted fairness (or service differentiation), and decouples contention control from the handling of failed transmissions. Beyond guiding medium access control design, the random access game model also provides an analytical framework for understanding the equilibrium and dynamic properties of different medium access protocols.
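A minimal sketch of the kind of distributed strategy update described above, under a toy slotted-Aloha contention model; the utility function, step size, and node count are illustrative assumptions, not the paper's parameters:

```python
import numpy as np

def conditional_collision_prob(p, i):
    """Probability that node i's transmission collides, i.e., that at least
    one other node transmits in the same slot (toy slotted-Aloha model)."""
    others = np.delete(p, i)
    return 1.0 - np.prod(1.0 - others)

def gradient_play(p, marginal_utility, step=0.05, iters=5000):
    """Each node nudges its access probability toward the point where its
    marginal utility equals its conditional collision probability; a fixed
    point of this update is a Nash equilibrium of the random access game."""
    p = p.copy()
    for _ in range(iters):
        for i in range(len(p)):
            gamma_i = conditional_collision_prob(p, i)
            p[i] = np.clip(p[i] + step * (marginal_utility(p[i]) - gamma_i),
                           1e-3, 1.0 - 1e-3)
    return p

# Illustrative concave utility U(p) = a*p - p**2/2, so U'(p) = a - p.
p_star = gradient_play(np.full(4, 0.5), lambda p: 0.3 - p)
print(p_star)  # symmetric equilibrium access probabilities
```

Each node only needs its own utility and a locally observable contention measure, which is what makes the update fully distributed.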
Cerebral blood flow (CBF) supports brain metabolism. Diseases impair CBF, and pharmacological agents modulate CBF. Many techniques measure CBF, but phase contrast (PC) MR imaging through the four arteries supplying the brain is rapid and robust. However, technician error, patient motion, or tortuous vessels degrade the quality of measurements of the internal carotid (ICA) or vertebral (VA) arteries. We hypothesized that total CBF could be imputed from measurements in subsets of these four feeding vessels without excessive penalties in accuracy. We analyzed PC MR imaging from 129 patients, artificially excluded 1 or more vessels to simulate degraded imaging quality, and developed models of imputation for the missing data. Our models performed well when at least one ICA was measured, yielding R² values of 0.998-0.990, normalized root mean squared error values of 0.044-0.105, and intra-class correlation coefficients of 0.982-0.935. These models were therefore comparable or superior to the test-retest variability of CBF measured by PC MR imaging. Our imputation models allow retrospective correction for corrupted blood vessel measurements when measuring CBF and can guide prospective CBF acquisitions.
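A minimal sketch of this style of imputation, regressing total CBF on the vessels that remain measurable; the per-vessel flow distributions and the linear model form are illustrative assumptions, not the study's fitted models:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Hypothetical per-vessel flows (mL/min) for complete four-vessel exams:
# columns = [right ICA, left ICA, right VA, left VA].
flows = rng.normal([250.0, 250.0, 100.0, 100.0], [40.0, 40.0, 25.0, 25.0],
                   size=(129, 4))
total_cbf = flows.sum(axis=1)

# Simulate degraded imaging quality by "losing" the left VA, then regress
# total CBF on the remaining three vessels using the complete cases.
observed = [0, 1, 2]
model = LinearRegression().fit(flows[:, observed], total_cbf)

# Impute total CBF for a new exam whose left VA measurement is corrupted.
degraded_exam = flows[:1, observed]
print(model.predict(degraded_exam))
```

In practice one such model would be fit per missing-vessel pattern, and accuracy assessed against the held-out true four-vessel totals.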
BACKGROUND: Liver iron concentration (LIC) measured by MRI has become the clinical reference standard for managing iron overload in chronically transfused patients. Transverse relaxivity (R2 or R2*) measurements are converted to LIC units using empirically derived calibration curves.
HYPOTHESIS: That flip angle (FA) error due to B1 spatial heterogeneity causes significant LIC quantitation error. B1 scale (b1, the ratio of achieved to prescribed FA) variation is a major problem at 3 T which could reduce the accuracy of transverse relaxivity measurements.
STUDY TYPE: Prospective.
SUBJECTS: Forty-seven subjects with chronic transfusional iron overload undergoing clinically indicated LIC assessment.
FIELD STRENGTH/SEQUENCE: 1.5 T/3 T dual-repetition-time B1 mapping sequence.
ASSESSMENT: We quantified the average and standard deviation of b1 in the right and left lobes of the liver from B1 maps acquired at 1.5 T and 3 T. The impact of b1 variation on spin echo LIC estimates was determined using a Monte Carlo model.
STATISTICAL TESTS: Mean, median, and standard deviation in whole liver and right and left lobes; two-sided t-test between whole-liver b1 means.
RESULTS: Average b1 within the liver was 99.3% ± 12.3% at 1.5 T versus 69.6% ± 14.6% at 3 T and was independent of iron burden (P < 0.05). Monte Carlo simulations demonstrated that b1 variation systematically increased R2 estimates at lower LIC (<~25 mg/g at 1.5 T, <~15 mg/g at 3 T) but flattened or even inverted the R2-LIC relationship at higher LIC (≥~25 mg/g at 1.5 T, ≥~15 mg/g at 3 T); changes in the R2-LIC relationship were symmetric with respect to over- and under-excitation and were similar at 1.5 T and 3 T (for the same R2 value). The R2*-LIC relationship was independent of b1.
DATA CONCLUSION: Spin echo R2 measurement of LIC at 3 T is error-prone without correction for b1 errors. The impact of b1 error on current 1.5 T spin echo-based techniques for LIC quantification is large enough to introduce measurable intersubject variability, but the in vivo effect size needs a dedicated validation study.
LEVEL OF EVIDENCE: 2.
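A minimal Monte Carlo sketch of how a reduced B1 scale can bias a spin echo R2 fit; the echo times, noise level, and the magnitude-noise-floor mechanism are illustrative assumptions and do not reproduce the study's actual signal model:

```python
import numpy as np

rng = np.random.default_rng(1)
TE = np.arange(3e-3, 25e-3, 3e-3)   # echo times (s); hypothetical protocol
R2_TRUE = 200.0                      # true transverse relaxation rate (1/s)
NOISE_SD = 0.01                      # per-channel noise, relative to M0 = 1

def spin_echo_signal(b1, te, r2):
    # Idealized spin-echo amplitude sin(alpha)*sin(beta/2)**2*exp(-TE*R2),
    # with excitation alpha = b1*90deg and refocusing beta = b1*180deg.
    alpha, beta = b1 * np.pi / 2.0, b1 * np.pi
    return np.sin(alpha) * np.sin(beta / 2.0) ** 2 * np.exp(-te * r2)

r2_estimates = []
for b1 in rng.normal(0.696, 0.146, size=5000):   # 3 T b1 values from the study
    s = spin_echo_signal(b1, TE, R2_TRUE)
    # Magnitude (Rician-like) noise creates a floor that the weakened,
    # mis-flipped signal decays into, biasing the log-linear R2 fit.
    noise = rng.normal(0.0, NOISE_SD, (2, TE.size))
    s_mag = np.hypot(s + noise[0], noise[1])
    r2_estimates.append(-np.polyfit(TE, np.log(s_mag), 1)[0])

print(np.mean(r2_estimates), "vs true", R2_TRUE)
```

Rerunning with the 1.5 T distribution (mean 0.993, SD 0.123) shows how much less of the fitted-R2 bias survives when b1 stays near unity.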
Building on a recent effort that combines a first-principles approach to modeling router-level connectivity with a more pragmatic use of statistics and graph theory, we show in this paper that an improved understanding of the Internet's physical infrastructure is possible by viewing its physical connectivity as an annotated graph that delivers raw connectivity and bandwidth to the upper layers in the TCP/IP protocol stack, subject to practical constraints (e.g., router technology) and economic considerations (e.g., link costs). More importantly, by relying on data from Abilene, a Tier-1 ISP, and the Rocketfuel project, we provide empirical evidence in support of the proposed approach and its consistency with networking reality. To illustrate its utility, we: 1) show that our approach provides insight into the origin of high variability in measured or inferred router-level maps; 2) demonstrate that it easily accommodates the incorporation of additional objectives of network design (e.g., robustness to router failure); and 3) discuss how it complements ongoing community efforts to reverse-engineer the Internet.
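A minimal sketch of what an "annotated graph" subject to a router technology constraint might look like; the capacities, link set, and the single aggregate-capacity check are hypothetical stand-ins for the constraints the abstract refers to:

```python
# Annotated router-level graph: edges carry bandwidth, and each router is
# subject to a technology constraint (here, a single hypothetical cap on
# the total bandwidth it can terminate: many slow links or a few fast ones).
ROUTER_CAPACITY_GBPS = 40.0

links = {
    ("r1", "r2"): 10.0, ("r1", "r3"): 10.0, ("r1", "r4"): 2.5,
    ("r2", "r3"): 10.0, ("r3", "r4"): 2.5,
}

def terminated_bandwidth(router):
    return sum(bw for ends, bw in links.items() if router in ends)

routers = {r for ends in links for r in ends}
for r in sorted(routers):
    ok = terminated_bandwidth(r) <= ROUTER_CAPACITY_GBPS
    print(f"{r}: {terminated_bandwidth(r):.1f} Gb/s terminated, feasible={ok}")
```

Candidate topologies that violate such per-router constraints can be rejected as physically implausible, which is one way degree-bandwidth annotations discriminate among graphs with identical raw connectivity.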
This paper aims to design a congestion control system that scales gracefully with network capacity, providing high utilization, low queueing delay, dynamic stability, and fairness among users. The focus is on developing decentralized control laws at end-systems and routers at the level of fluid-flow models that can provably satisfy such properties in arbitrary networks, and subsequently on approximating these features through practical packet-level implementations. Two families of control laws are developed. The first, "dual", control law achieves the first three objectives for arbitrary networks and delays, but is forced to constrain the resource allocation policy. We subsequently develop a "primal-dual" law that overcomes this limitation and allows sources to match their steady-state preferences at a slower time-scale, provided a bound on round-trip times is known. We develop two packet-level implementations of this protocol, using 1) ECN marking, and 2) queueing delay as means of communicating the congestion measure from links to sources. We demonstrate, using ns-2 simulations, the stability of the protocol and its equilibrium features in terms of utilization, queueing, and fairness under a variety of scaling parameters.
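For concreteness, a generic primal-dual fluid-flow law of the kind the abstract refers to (the standard form from the congestion control literature, not necessarily the paper's exact gains): source i adapts its rate \(x_i\) to the aggregate price \(q_i\) along its route, while link l adapts its price \(p_l\) to the excess of aggregate rate \(y_l\) over capacity \(c_l\):

\[
\dot{x}_i = \kappa_i \big( U_i'(x_i) - q_i \big),
\qquad
\dot{p}_l = \frac{\gamma_l}{c_l} \big( y_l - c_l \big)^{+}_{p_l},
\]

where \(q_i = \sum_{l \in r(i)} p_l\), \(y_l = \sum_{i : l \in r(i)} x_i\), and \((\cdot)^{+}_{p_l}\) is the projection keeping prices nonnegative. At equilibrium \(U_i'(x_i^*) = q_i^*\), which is the sense in which sources match their steady-state preferences.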
Aim
After environmental disasters, species with large population losses may need urgent protection to prevent extinction and support recovery. Following the 2019–2020 Australian megafires, we estimated population losses and recovery in fire‐affected fauna, to inform conservation status assessments and management.
Location
Temperate and subtropical Australia.
Time period
2019–2030 and beyond.
Major taxa
Australian terrestrial and freshwater vertebrates; one invertebrate group.
Methods
From > 1,050 fire‐affected taxa, we selected 173 whose distributions substantially overlapped the fire extent. We estimated the proportion of each taxon’s distribution affected by fires, using fire severity and aquatic impact mapping, and new distribution mapping. Using expert elicitation informed by evidence of responses to previous wildfires, we estimated local population responses to fires of varying severity. We combined the spatial and elicitation data to estimate overall population loss and recovery trajectories (a schematic version of this calculation is given after this abstract), and thus indicate potential eligibility for listing as threatened, or uplisting, under Australian legislation.
Results
We estimate that the 2019–2020 Australian megafires caused, or contributed to, population declines that make 70–82 taxa eligible for listing as threatened; and another 21–27 taxa eligible for uplisting. If so‐listed, this represents a 22–26% increase in Australian statutory lists of threatened terrestrial and freshwater vertebrates and spiny crayfish, and uplisting for 8–10% of threatened taxa. Such changes would cause an abrupt worsening of underlying trajectories in vertebrates, as measured by Red List Indices. We predict that 54–88% of 173 assessed taxa will not recover to pre‐fire population size within 10 years/three generations.
Main conclusions
We suggest the 2019–2020 Australian megafires have worsened the conservation prospects for many species. Of the 91 taxa recommended for listing/uplisting consideration, 84 are now under formal review through national processes. Improving predictions of taxon vulnerability with empirical data on population responses, reducing the likelihood of future catastrophic events, and mitigating their impacts on biodiversity are critical.
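Schematically, and under the simplifying assumption that losses combine additively across severity classes (an illustration, not the study's exact estimation procedure), the overall proportional population loss for a taxon can be written as

\[
\hat{L} \;=\; \sum_{s} w_s \, m_s ,
\]

where \(w_s\) is the proportion of the taxon's distribution burnt at fire severity class s (with \(w_0\) the unburnt proportion and \(m_0 = 0\)) and \(m_s\) is the expert-elicited proportional local population loss at severity s; recovery trajectories then project the surviving fraction \(1 - \hat{L}\) forward in time.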
Feedback regulation is pervasive in biology at both the organismal and cellular level. In this article, we explore the properties of a particular biomolecular feedback mechanism called antithetic integral feedback, which can be implemented using the binding of two molecules. Our work develops an analytic framework for understanding the hard limits, performance tradeoffs, and architectural properties of this simple model of biological feedback control. Using tools from control theory, we show that there are simple parametric relationships that determine both the stability and the performance of these systems in terms of speed, robustness, steady-state error, and leakiness. These findings yield a holistic understanding of the behavior of antithetic integral feedback and contribute to a more general theory of biological control systems.
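For readers unfamiliar with the motif, the standard antithetic integral feedback model from the literature couples two mutually annihilating controller species \(z_1, z_2\) to a process output \(x_\ell\):

\[
\dot{z}_1 = \mu - \eta\, z_1 z_2,
\qquad
\dot{z}_2 = \theta\, x_\ell - \eta\, z_1 z_2 ,
\]

so that \(\tfrac{d}{dt}(z_1 - z_2) = \mu - \theta x_\ell\). At any stable steady state the integral forces \(x_\ell^* = \mu/\theta\), independent of the process parameters, which is precisely the integral action that the binding of the two molecules implements.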
• Feedback control is an essential component of biomolecular systems
• The design of feedback systems necessarily imposes performance tradeoffs
• We use control theory to study an important class of molecular feedback motifs
• Our work provides a map between biochemical parameters and circuit performance
While feedback regulation is pervasive at every level of biology, it has proven difficult to design synthetic biomolecular feedback systems that match the performance found in nature. The recently developed antithetic integral feedback motif provides a promising mechanism for the implementation of robust control of molecular processes. Our work applies mathematical tools from control theory to this motif, with the goal of taking steps towards the development of a coherent theoretical framework to guide the design of synthetic feedback networks. We characterize the stability and performance tradeoffs of the network, clarifying the relationship between low-level biomolecular rate parameters and high-level system performance (e.g., speed, robustness, tracking error). While these observations can be taken separately, we highlight that a mathematical result known as Bode’s integral theorem provides a unifying framework for considering the fundamental constraints on feedback control systems.
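The unifying result mentioned above can be stated briefly: for a sensitivity function \(S = (1 + L)^{-1}\) with open-loop transfer function \(L(s)\) of relative degree at least two, Bode's integral theorem gives

\[
\int_0^{\infty} \ln \lvert S(j\omega) \rvert \, d\omega \;=\; \pi \sum_{k} \operatorname{Re}(p_k),
\]

where the sum runs over unstable open-loop poles \(p_k\) (and is zero when \(L\) is stable). Any reduction of sensitivity over one frequency band must therefore be paid for by amplification elsewhere, the "waterbed effect" that underlies the speed-robustness tradeoffs characterized in this work.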