The seminar will provide an overview of the space debris environment, an update on the ELSA-d mission currently in orbit, and a discussion of upcoming technology plans for furthering sustainable development across all orbits.
Long-term space sustainability demands effective management and limitation of the growing orbital debris population. Achieving this requires first an improved quantification of environmental risk before proper mitigation can be formulated. A significant contributor to this risk is the creation of debris that is large enough to damage another spacecraft upon impact, small enough to be difficult to track, and prohibitively expensive to remove due to its quantity and orbital distribution. Currently, the most effective way to reduce the long-term risk of orbital debris is to prevent its generation in the first place. In line with this, the United States Federal Communications Commission (FCC) requires, as part of its license application processes, compliance with regulations designed to limit the likelihood of debris-generating events, such as a limit on the probability of collision with large objects [1]. In this context, a "large object" is any object exceeding 10 cm in diameter, and the legacy maximum allowable probability of collision is 1.0e-3 per satellite, assessed over its lifetime. To show compliance with this regulation, satellite operators commonly use NASA's Debris Assessment Software (DAS) tool. Historically, DAS has provided adequate means for assessing lifetime collision risk for single-satellite missions operating in low Earth orbit (LEO). However, given the rapid increase in the number of spacecraft and large constellations being launched into orbit, it is prudent to re-evaluate both the regulatory requirement itself and the method by which compliance is determined. This paper explores the origin and intent of the requirement and assesses its efficacy through multiple case studies representing different spacecraft in various orbits. In each case study, the likelihood and impact dimensions of catastrophic fragmentation risk are computed for each spacecraft and then compared to the corresponding DAS outputs.
Analysis results show that the likelihood of catastrophic fragmentation consistently exceeds DAS predictions, and that the environmental impact may be greatly underestimated for spacecraft more massive than 50 kg.
We introduce a methodology for estimating the risk posed to the space environment by a spacecraft over an arbitrary period of time under a risk mitigation strategy, expressed in terms of aggregate collision probability. Our methodology enables estimation of residual risk and maneuver frequency, where residual risk is defined conceptually as the risk to a spacecraft that remains even after adherence to a risk mitigation strategy. The key parameters affecting residual risk for a general risk mitigation strategy are the risk mitigation maneuver (RMM) threshold, the risk reduction factor, and the maneuver execution time. We present an analytic result on the per-satellite residual risk necessary to ensure that the total aggregate collision probability of a satellite constellation of arbitrary size remains below a target value. This approach offers a more complete model of spacecraft safety and potential risk to the space environment by studying more than just the RMM threshold, which has historically served as a common benchmark in the space situational awareness and regulatory compliance literature. Our analysis shows that the RMM threshold is only one of several factors with significant effects on residual risk, and we provide evidence that the RMM threshold alone is an incomplete indicator of the actual risk posed to the space environment by an operational spacecraft. We demonstrate the effectiveness of this methodology with numerical results that model realistic satellites and satellite constellations, and we draw key insights from this analysis that will aid in managing the safety and sustainability of the space environment and inform risk mitigation strategies for large satellite constellations.
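The relationship between per-satellite residual risk and constellation-wide aggregate collision probability can be sketched under one simplifying assumption that is ours, not necessarily the paper's: collision events are statistically independent across satellites. The function names below are illustrative.

```python
# Sketch of the aggregate-risk relation, assuming independence of
# per-satellite collision events (an assumption of this sketch).

def aggregate_collision_probability(p_sat: float, n_sats: int) -> float:
    """Probability of at least one collision across a constellation of
    n_sats satellites, each carrying residual risk p_sat."""
    return 1.0 - (1.0 - p_sat) ** n_sats

def required_residual_risk(p_target: float, n_sats: int) -> float:
    """Per-satellite residual risk needed so the constellation-wide
    aggregate collision probability stays below p_target."""
    return 1.0 - (1.0 - p_target) ** (1.0 / n_sats)

# Example: the legacy 1.0e-3 per-satellite limit applied naively to a
# hypothetical 1,000-satellite constellation.
print(aggregate_collision_probability(1.0e-3, 1000))  # ~0.632
print(required_residual_risk(1.0e-3, 1000))           # ~1.0e-6
```

The example illustrates the abstract's motivation: a per-satellite limit that is adequate for single-satellite missions can imply a near-two-thirds probability of at least one collision when aggregated over a large constellation.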
This paper describes the development of an empirical model to forecast epidemics of Ross River virus (RRV) disease in Brisbane, Australia, using the multivariate seasonal auto-regressive integrated moving average (SARIMA) technique. We obtained computerized data on notified RRV disease cases, climate, high tide, and population sizes in Brisbane for the period 1985-2001 from the Queensland Department of Health, the Australian Bureau of Meteorology, the Queensland Department of Transport, and the Australian Bureau of Statistics, respectively. The SARIMA model was developed and validated by dividing the data file into two data sets: the data between January 1985 and December 2000 were used to construct the model, and those between January and December 2001 to validate it. The SARIMA models show that monthly precipitation (beta = 0.004, P = 0.031) was significantly associated with RRV transmission. However, there was no significant association between the other climate variables (e.g., temperature, relative humidity, and high tides) and RRV transmission. The predicted values in the model were generally consistent with the actual values (root mean square percentage error = 0.94%). Therefore, this model may have applications as a decision-support tool in disease control and risk-management planning programs.
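The train/validate split and the root mean square percentage error (RMSPE) metric reported above can be illustrated with a small stdlib-only sketch; the function name and the monthly case counts below are hypothetical, not the paper's data.

```python
import math

def rmspe(actual, predicted):
    """Root mean square percentage error, in percent.
    Months with zero actual cases are skipped to avoid division by zero."""
    terms = [((a - p) / a) ** 2 for a, p in zip(actual, predicted) if a != 0]
    return 100.0 * math.sqrt(sum(terms) / len(terms))

# Hypothetical monthly case counts for a 12-month validation year
# (January-December 2001 in the paper's design).
actual    = [30, 42, 55, 38, 25, 18, 12, 10, 14, 20, 26, 33]
predicted = [31, 41, 54, 39, 25, 18, 12, 10, 14, 20, 26, 33]
print(round(rmspe(actual, predicted), 2))  # → 1.5
```

A small RMSPE, such as the 0.94% reported in the abstract, indicates that forecast errors are tiny relative to the observed case counts.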
We develop and exemplify application of new classes of dynamic models for time series of nonnegative counts. Our novel univariate models combine dynamic generalized linear models for binary and conditionally Poisson time series with dynamic random effects for over-dispersion. These models estimate dynamic regression coefficients in both the binary and nonzero count components. Sequential Bayesian analysis allows fast, parallel analysis of sets of decoupled time series. New multivariate models then enable information sharing in contexts where data at a more highly aggregated level provide more incisive inferences on shared patterns such as trends and seasonality. A novel multiscale approach, one new example of the concept of decouple/recouple in time series, enables information sharing across series. This incorporates cross-series linkages while insulating parallel estimation of univariate models, and hence enables scalability in the number of series. The major motivating context is supermarket sales forecasting. Detailed examples drawn from a case study in multistep forecasting of sales of a number of related items showcase forecasting of multiple series, with discussion of forecast accuracy metrics, comparisons with existing methods, and broader questions of probabilistic forecast assessment.
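One standard building block behind sequential Bayesian analysis of count series is conjugate updating of a Poisson rate with a discount factor that inflates uncertainty between time steps. The sketch below is a deliberately simplified gamma-Poisson filter of that flavor, not the paper's full DGLM with binary components and random effects; the power-discount evolution step is an assumption of this sketch.

```python
def gamma_poisson_filter(counts, delta=0.95, a0=1.0, b0=1.0):
    """Sequentially update a Gamma(a, b) posterior on a Poisson rate.
    Between observations, the discount factor delta decays both shape
    and rate, widening the prior while preserving its mean (a simple
    power-discount evolution, assumed here for illustration)."""
    a, b = a0, b0
    means = []
    for y in counts:
        a, b = delta * a, delta * b      # evolve: inflate variance
        a, b = a + y, b + 1.0            # update: conjugate Poisson step
        means.append(a / b)              # filtered posterior mean rate
    return means

# Hypothetical daily sales counts for a single item.
rates = gamma_poisson_filter([3, 5, 4, 8, 9, 12])
print([round(r, 2) for r in rates])
```

Because each series is filtered independently, many items can be processed in parallel, which is the "decoupled" half of the decouple/recouple idea described above.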
Historically, autosomal recessive 5q-linked spinal muscular atrophy (SMA) has been the leading inherited cause of infant death. SMA is caused by the absence of the SMN1 gene, and SMN1 gene replacement therapy, onasemnogene abeparvovec-xioi, was approved by the Food and Drug Administration in May 2019. Approval included all children with SMA age <2 years without end-stage weakness. However, gene transfer with onasemnogene abeparvovec-xioi has only been studied in children age ≤8 months.
In this article, we report key safety and early outcome data from the first 21 children (age 1-23 months) treated in the state of Ohio.
In children age ≤6 months, gene transfer was well tolerated. In this young group, serum transaminase (aspartate aminotransferase and alanine aminotransferase) elevations were modest and not associated with γ-glutamyl transpeptidase elevations. Initial prednisolone administration matched that given in the clinical trials. In older children, elevations in aspartate aminotransferase, alanine aminotransferase, and γ-glutamyl transpeptidase were more common and required a higher dose of prednisolone, but all were without clinical symptoms. Nineteen of 21 (90%) children experienced an asymptomatic drop in platelets in the first week after treatment that recovered without intervention. Of the 19 children with repeated outcome assessments, 11% (n = 2) experienced stabilization and 89% (n = 17) experienced improvement in motor function.
In this population, with thorough screening and careful post-gene transfer management, replacement therapy with onasemnogene abeparvovec-xioi is safe and shows promise for early efficacy.
We present new Bayesian methodology for consumer sales forecasting. Focusing on the multi-step-ahead forecasting of daily sales of many supermarket items, we adapt dynamic count mixture models for forecasting individual customer transactions, and introduce novel dynamic binary cascade models for predicting counts of items per transaction. These transaction–sales models can incorporate time-varying trends, seasonality, price, promotion, random effects and other outlet-specific predictors for individual items. Sequential Bayesian analysis involves fast, parallel filtering on sets of decoupled items, and is adaptable across items that may exhibit widely varying characteristics. A multi-scale approach enables information to be shared across items with related patterns over time in order to improve prediction, while maintaining scalability to many items. A motivating case study in many-item, multi-period, multi-step-ahead supermarket sales forecasting provides examples that demonstrate improved forecast accuracy on multiple metrics, and illustrates the benefits of full probabilistic models for forecast accuracy evaluation and comparison.
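The binary cascade idea, predicting an item count through a sequence of conditional "at least one more?" probabilities, can be sketched as follows. The cascade probabilities here are fixed illustrative numbers; in the paper's models they are dynamic and covariate-dependent.

```python
def cascade_count_pmf(probs):
    """Turn cascade probabilities into a pmf over counts 0..len(probs).
    probs[k] = P(count >= k+1 | count >= k). The cascade is truncated:
    all remaining mass lands on count == len(probs)."""
    pmf = []
    survive = 1.0                        # running P(count >= k)
    for p in probs:
        pmf.append(survive * (1.0 - p))  # P(count == k): stop here
        survive *= p                     # continue down the cascade
    pmf.append(survive)                  # truncation at the last level
    return pmf

# Illustrative cascade: P(>=1)=0.6, P(>=2 | >=1)=0.4, P(>=3 | >=2)=0.2
pmf = cascade_count_pmf([0.6, 0.4, 0.2])
print([round(q, 3) for q in pmf])  # → [0.4, 0.36, 0.192, 0.048]
```

Modeling each conditional step as a binary outcome lets the cascade reuse standard binary dynamic models while still producing a full predictive distribution over item counts.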
Uncertainty in the Pan-Arctic Ice-Ocean Modeling and Assimilation System (PIOMAS) Arctic sea ice volume record is characterized. A range of observations and approaches, including in situ ice thickness measurements, ICESat-retrieved ice thickness, and model sensitivity studies, yields a conservative estimate for October Arctic ice volume uncertainty of 1.35 × 10³ km³ and an uncertainty of the ice volume trend over the 1979–2010 period of 1.0 × 10³ km³ decade⁻¹. A conservative estimate of the trend over this period is −2.8 × 10³ km³ decade⁻¹. PIOMAS ice thickness estimates agree well with ICESat ice thickness retrievals (<0.1 m mean difference) for the area for which submarine data are available, while differences outside this area are larger. PIOMAS spatial thickness patterns agree well with ICESat thickness estimates, with pattern correlations above 0.8. PIOMAS appears to overestimate thin ice thickness and underestimate thick ice, yielding a smaller downward trend than apparent in reconstructions from observations. PIOMAS ice volume uncertainties and trends are examined in the context of climate change attribution and the declaration of record minima. The distribution of 32-year trends in a preindustrial coupled model simulation shows no trends comparable to those seen in the PIOMAS retrospective, even when the trend uncertainty is accounted for. Attempts to label September minima as new record lows are sensitive to modeling error. However, the September 2010 ice volume anomaly did in fact exceed the previous 2007 minimum by a large enough margin to establish a statistically significant new record.
Key Points
Uncertainty of the modeled Arctic sea ice volume and its trend