The High Accuracy Satellite Drag Model (HASDM) is the operational thermospheric density model used by the US Space Force Combined Space Operations Center. By using real‐time data assimilation, HASDM can provide density estimates with increased accuracy over other empirical models. With historical HASDM density data being released publicly for the first time, we can analyze the data to compare dominant modes of variability in the upper atmosphere as modeled by HASDM and the Jacchia‐Bowman 2008 Empirical Thermospheric Density Model (JB2008), the Jacchia‐family model upon which density corrections are made as part of the HASDM framework. This model comparison is conducted through principal component analysis (PCA), which shows the increased variability of the HASDM dataset. We highlight HASDM's ability to capture the movement of lighter species during solar minimum conditions, unlike many empirical models. We then compare density from both models to CHAllenging Minisatellite Payload (CHAMP) and Gravity Recovery and Climate Experiment (GRACE) accelerometer‐derived density estimates. This comparison shows that HASDM more closely matches the accelerometer‐derived densities, with a mean absolute difference of 30.93% relative to the CHAMP and GRACE‐A estimates. The comparison also reveals improved representation of cooling mechanisms due to NO and CO2 in the HASDM database.
Plain Language Summary
Many density models for the upper atmosphere have been developed using data from various sources to fit parametric equations. These are classified as empirical models, which make up a large portion of those used in operations. While they are generally reliable, there are conditions where their accuracy falls short, such as during geomagnetic storms or during periods of low solar activity. The High Accuracy Satellite Drag Model (HASDM) uses observations from calibration satellites to make real‐time corrections (a process known as Dynamic Calibration of the Atmosphere) to density outputs from an empirical model, making it far more robust. In this paper, we analyze a 20‐year dataset of HASDM outputs and compare it to the baseline JB2008 model, upon which data are assimilated. By leveraging a mathematical tool called principal component analysis, we identify features in the HASDM dataset that are not modeled by JB2008, such as the movement of lighter species during solar minimum conditions. We also perform a case study where we compare both model outputs to high‐fidelity density observations from different satellites during two unique periods. Through this comparison, we find evidence that the HASDM dataset captures the effects of NO and CO2 cooling mechanisms.
Key Points
We conduct principal component analysis of the High Accuracy Satellite Drag Model (HASDM) and JB2008 databases covering almost two solar cycles
HASDM models the movement of lighter species during solar minimum conditions
HASDM exhibits the ability to capture/model NO and CO2 cooling mechanisms
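The PCA step described above can be illustrated with a minimal sketch. The density matrix here is a random stand‐in for a gridded density database (rows are snapshots in time, columns are flattened grid cells); the grid size and log transform are illustrative assumptions, not the paper's exact preprocessing.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical stand-in for a gridded density database:
# rows are 3-hourly snapshots, columns are flattened grid cells.
density = rng.lognormal(mean=-30.0, sigma=0.5, size=(500, 36 * 24))

# PCA via SVD of the mean-removed (here log-transformed) data matrix.
log_density = np.log10(density)
anomaly = log_density - log_density.mean(axis=0)
U, S, Vt = np.linalg.svd(anomaly, full_matrices=False)

# Fraction of total variance captured by each principal component;
# the rows of Vt are the spatial modes of variability.
explained = S**2 / np.sum(S**2)
print(explained[:3])
```

Comparing the leading spatial modes (rows of `Vt`) and their time series (columns of `U`) between two databases is one way to contrast the variability each model represents.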
The community has leveraged satellite accelerometer data sets in previous years to estimate neutral mass density and exospheric temperatures. We utilize derived temperature data and optimize a nonlinear machine‐learned (ML) regression model to improve upon the performance of the linear EXospheric TEMPeratures on a PoLyhedrAl gRid (EXTEMPLAR) model. The newly developed EXTEMPLAR‐ML model allows for exospheric temperature predictions at any location with a single model and provides performance improvements over its predecessor. We achieve reductions in mean absolute error of 2 K on an independent test set while providing similar error standard deviation values. Comparing the performance of both EXTEMPLAR models and the Naval Research Laboratory Mass Spectrometer and Incoherent Scatter radar Extended model (NRLMSISE‐00) across different solar and geomagnetic activity levels shows that EXTEMPLAR‐ML has the lowest mean absolute error across 80% of conditions tested. A spatial error study demonstrated that EXTEMPLAR‐ML has the lowest mean absolute error for over 60% of the polyhedral grid cells on the test set. Like EXTEMPLAR, our model's outputs can be utilized by NRLMSISE‐00 (exclusively) to more closely match satellite accelerometer‐derived densities. We conducted 10 case studies comparing the accelerometer‐derived temperature and density estimates from four satellites to NRLMSISE‐00, EXTEMPLAR, and EXTEMPLAR‐ML during major storm periods. These comparisons show that EXTEMPLAR‐ML generally has the best performance of the three models during storms. We use principal component analysis on EXTEMPLAR‐ML outputs to verify the physical response of the model to its drivers.
Plain Language Summary
Density in the upper atmosphere is highly variable and difficult to model. Empirical density models often rely on temperature profile predictions to determine species and mass densities. One of three key parameters in determining the temperature profiles is the asymptotic value at the top of the thermosphere called the exospheric temperature. By using temperatures derived from satellite acceleration measurements, we develop a machine‐learned global temperature model called EXospheric TEMPeratures on a PoLyhedrAl gRid Machine Learned (EXTEMPLAR‐ML). We achieve a 2 K reduction in mean absolute error on the independent test set relative to the model's predecessor. Additional analyses showed that EXTEMPLAR‐ML was more accurate than linear EXTEMPLAR across a majority of conditions and grid locations. We also look at temperatures and densities along satellite orbits during 10 major geomagnetic storms from the 21st century. In this study, we see major improvements over a significant empirical model called NRLMSISE‐00 and the linear predecessor to EXTEMPLAR‐ML. We leveraged a mathematical decomposition tool on the model outputs to assess its internal formulation. This shows that EXTEMPLAR‐ML is most heavily driven by solar activity and the seasons.
Key Points
We develop a nonlinear global model for exospheric temperature prediction called EXospheric TEMPeratures on a PoLyhedrAl gRid Machine Learned (EXTEMPLAR‐ML)
We leverage principal component analysis to improve our understanding of the EXTEMPLAR‐ML temperature formulation
EXTEMPLAR‐ML shows increased accuracy relative to satellite observations across a majority of conditions, locations, and during geomagnetic storms
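The kind of nonlinear regression underlying a model like EXTEMPLAR‐ML can be sketched with a toy example. Everything here is an illustrative assumption, not the actual EXTEMPLAR‐ML architecture or training data: two synthetic drivers (a solar proxy and a geomagnetic index) are mapped to a made‐up exospheric temperature, and a one‐hidden‐layer network is fit by gradient descent.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for training data: a solar proxy (F10.7-like) and a
# geomagnetic index (ap-like) mapped to a temperature (K) with a nonlinear
# interaction term, plus measurement-like noise.
X = rng.uniform([70.0, 0.0], [250.0, 100.0], size=(2000, 2))
T = 500.0 + 3.0 * X[:, 0] + 1.5 * X[:, 1] + 0.01 * X[:, 0] * X[:, 1]
T += rng.normal(0.0, 5.0, size=T.shape)

# Standardize inputs and target, then fit a one-hidden-layer tanh network
# with full-batch gradient descent on the mean squared error.
Xs = (X - X.mean(0)) / X.std(0)
Tm, Tsd = T.mean(), T.std()
y = (T - Tm) / Tsd

W1 = rng.normal(0.0, 0.5, (2, 16)); b1 = np.zeros(16)
W2 = rng.normal(0.0, 0.5, (16, 1)); b2 = np.zeros(1)
lr = 0.05
for _ in range(3000):
    H = np.tanh(Xs @ W1 + b1)                 # hidden activations
    err = (H @ W2 + b2).ravel() - y           # prediction residual
    gW2 = H.T @ err[:, None] / len(y)
    gb2 = err.mean(keepdims=True)
    dH = err[:, None] @ W2.T * (1.0 - H**2)   # backprop through tanh
    gW1 = Xs.T @ dH / len(y)
    gb1 = dH.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

pred = (np.tanh(Xs @ W1 + b1) @ W2 + b2).ravel() * Tsd + Tm
mae = np.abs(pred - T).mean()                  # mean absolute error in K
```

The interaction term is the point of the exercise: a linear model in the two drivers cannot represent it, whereas the hidden layer can, which mirrors the advantage a nonlinear ML model has over a linear predecessor.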
Obtaining accurate predictions of the neutral density in the thermosphere has been a long‐standing problem. During geomagnetic storms the auroral heating in the polar ionospheres quickly raises the temperature of the thermosphere, resulting in higher neutral densities that exert a greater drag force on objects in low Earth orbit. Rapid increases and decreases in the temperature and density may occur within a couple of days. A key parameter in the thermosphere is the total amount of nitric oxide (NO). The production of NO is accelerated by the auroral heating, and since NO is an efficient radiator of thermal energy, higher concentrations of this molecule accelerate the rate at which the thermosphere cools. This paper describes an improved technique that calculates changes in the global temperature of the thermosphere. Starting from an empirical model of the Poynting flux into the ionosphere, a set of differential equations derives the minimum, global value of the exospheric temperature, which can be used in a neutral density model to calculate the global values. The relative variations in NO content are used to obtain more accurate cooling rates. Comparisons with the global rate of NO emissions that are measured with the Sounding of the Atmosphere using Broadband Emission Radiometry instrument show that there is very good agreement with the predicted values. The NO emissions correlate highly with the total auroral heating that has been integrated over time. We also show that the NO emissions are highly correlated with thermospheric temperature, as well as indices of solar extreme ultraviolet radiation.
Key Points
The thermosphere cools most rapidly following periods of strong auroral heating
Nitric oxide emissions have high correlations with the heating and temperature
Accounting for cooling by nitric oxide is required for accurate model predictions
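The heating-then-accelerated-cooling behavior described above can be sketched as a toy relaxation model. These are not the paper's actual differential equations or coefficients; the heating pulse, NO production rate, and time constants below are illustrative assumptions only.

```python
import numpy as np

# Toy model: exospheric temperature T driven by a Poynting-flux-like heating
# pulse Q(t), relaxing back to a baseline at a rate that increases with an
# NO-content proxy, since more NO radiates thermal energy away faster.
dt = 0.1                                      # time step (hours)
t = np.arange(0.0, 120.0, dt)                 # a 5-day window
Q = np.where((t > 24.0) & (t < 36.0), 1.0, 0.0)  # 12-hour storm heating pulse

T0 = 750.0        # baseline exospheric temperature (K)
T = T0
no_proxy = 0.0    # relative NO content, produced by heating, slowly decaying
T_hist = []
for q in Q:
    no_proxy += dt * (0.5 * q - no_proxy / 48.0)   # NO builds up, decays (~48 h)
    tau = 12.0 / (1.0 + no_proxy)                  # more NO -> shorter cooling time
    T += dt * (300.0 * q - (T - T0) / tau)         # heating minus radiative cooling
    T_hist.append(T)
T_hist = np.array(T_hist)
```

The qualitative behavior matches the abstract's point: the temperature spikes during the heating pulse, and because the storm also raised the NO proxy, the post-storm cooling is faster than the quiet-time relaxation would be.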
Machine learning (ML) models are universal function approximators and—if used correctly—can summarize the information content of observational data sets in a functional form for scientific and engineering applications. A benefit to ML over parametric models is that there are no a priori assumptions about particular basis functions which can potentially limit the phenomena that can be modeled. In this work, we develop ML models on three data sets: the Space Environment Technologies High Accuracy Satellite Drag Model (HASDM) density database, a spatiotemporally matched data set of outputs from the Jacchia‐Bowman 2008 Empirical Thermospheric Density Model (JB2008), and an accelerometer‐derived density data set from CHAllenging Minisatellite Payload (CHAMP). These ML models are compared to the Naval Research Laboratory Mass Spectrometer and Incoherent Scatter radar (NRLMSIS 2.0) model to study the presence of post‐storm cooling in the middle thermosphere. We find that both NRLMSIS 2.0 and JB2008‐ML do not account for post‐storm cooling and consequently perform poorly in periods following strong geomagnetic storms (e.g., the 2003 Halloween storms). Conversely, HASDM‐ML and CHAMP‐ML do show evidence of post‐storm cooling indicating that this phenomenon is present in the original data sets. Results show that density reductions up to 40% can occur 1–3 days post‐storm depending on the location and strength of the storm.
Plain Language Summary
Machine learning (ML) models are valuable universal function approximators and—if used correctly—can provide scientific information related to the data set used for fitting. A benefit to ML over other common models is that there are no background functions limiting what phenomena it can represent. In this work, we develop ML models on three data sets: the Space Environment Technologies High Accuracy Satellite Drag Model (HASDM) density database, a spatiotemporally matched data set of outputs from the Jacchia‐Bowman 2008 Empirical Thermospheric Density Model (JB2008), and an accelerometer‐derived density data set from CHAllenging Minisatellite Payload (CHAMP). These ML models are compared to the Naval Research Laboratory Mass Spectrometer and Incoherent Scatter radar (NRLMSIS 2.0) model to study the presence of post‐storm cooling in the upper atmosphere. We find that both NRLMSIS 2.0 and JB2008‐ML do not account for post‐storm cooling and perform poorly in periods following strong geomagnetic storms. Conversely, HASDM‐ML and CHAMP‐ML do show evidence of post‐storm cooling indicating that this phenomenon is present in the original data sets.
Key Points
Machine learning is used to develop models from unique density data sets
We compared model predictions along the CHAllenging Minisatellite Payload (CHAMP) orbit during the 2003 Halloween storms
We find that models developed on CHAMP and High Accuracy Satellite Drag Model density data can capture density depletion in the post‐storm period
An accurate estimation of upper atmospheric densities is crucial for precise orbit determination (POD) and prediction of low Earth orbit satellites, as well as for scientific studies of the Earth's atmosphere. However, densities estimated using satellite tracking data are always uncertain up to the drag‐coefficient assumed in the inversion method. This work develops a new framework to simultaneously estimate the density and drag‐coefficient for satellites with a time‐varying attitude. We do so by leveraging Fourier drag‐coefficient models, previously developed by the authors, and physical models of the drag‐coefficient. The method is tested with synthetic data for different geomagnetic activities, altitude levels, and errors in the gas‐surface interaction parameters. We report an improvement of up to 70% in density estimates for the simulations. Finally, POD data from Spire satellites are used for validation. An improvement of around 29% is obtained in the filter density estimates over NRLMSISE‐00 and 49% over JB2008 compared to the High Accuracy Satellite Drag Model densities.
Plain Language Summary
With the rapidly increasing number of Earth‐orbiting satellites, accurate monitoring of the satellite positions has become crucial for collision avoidance purposes. One of the major sources of error in the tracking of LEO satellites is the force that the tenuous atmosphere exerts on satellites, known as atmospheric drag. Modeling the drag force is complicated due to the uncertainties in the atmospheric density and the drag‐coefficient—a parameter that governs the interactions between the atmosphere and the satellite surface. In this work, we propose a method to obtain corrections to both the density and the drag‐coefficient from satellite tracking data, thus improving the tracking accuracy in the process.
Key Points
We provide a new method to estimate accurate local atmospheric densities at sub‐orbital cadence
The biases in density and drag‐coefficient are decorrelated using the time variations induced in the latter by attitude changes
The reduction in density bias is validated using Precision Orbit Determination data from Spire satellites
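The idea of a Fourier drag‐coefficient model can be sketched as follows. The series form is generic; the truncation order and coefficient values below are illustrative placeholders, not the authors' fitted values.

```python
import numpy as np

# Illustrative truncated Fourier series for a drag coefficient that varies
# with a satellite attitude angle theta. In practice the coefficients a0,
# a_k, b_k would be fit to a physical gas-surface interaction model; the
# values here are made up for demonstration.
a0 = 2.2
a = np.array([0.3, 0.05])
b = np.array([0.1, -0.02])

def cd_fourier(theta):
    """Evaluate C_D(theta) = a0 + sum_k [a_k cos(k theta) + b_k sin(k theta)]."""
    k = np.arange(1, len(a) + 1)
    return a0 + np.sum(a * np.cos(np.outer(theta, k))
                       + b * np.sin(np.outer(theta, k)), axis=-1)

theta = np.linspace(0.0, 2.0 * np.pi, 8, endpoint=False)
cd = cd_fourier(theta)
```

The attitude‐driven oscillation of `C_D` around its mean value `a0` is what lets a filter separate drag‐coefficient error from density error: the former varies with attitude while the latter does not.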
The EXospheric TEMperatures on a PoLyhedrAl gRid (EXTEMPLAR) method predicts the neutral densities in the thermosphere. The performance of this model has been evaluated through a comparison with the Air Force High Accuracy Satellite Drag Model (HASDM). The Space Environment Technologies (SET) HASDM database that was used for this test spans the 20 years 2000 through 2019, containing densities at 3 hr time intervals at 25 km altitude steps, and a spatial resolution of 10° latitude by 15° longitude. The upgraded EXTEMPLAR that was tested uses the newer Naval Research Laboratory MSIS 2.0 model to convert global exospheric temperature values to neutral density as a function of altitude. The revision also incorporated time delays that varied as a function of location, between the total Poynting flux in the polar regions and the exospheric temperature response. The density values from both models were integrated on spherical shells at altitudes ranging from 200 to 800 km. These sums were compared as a function of time. The results show an excellent agreement at temporal scales ranging from hours to years. The EXTEMPLAR model performs best at altitudes of 400 km and above, where geomagnetic storms produce the largest relative changes in neutral density. In addition to providing an effective method to compare models that have very different spatial resolutions, the use of density totals at various altitudes presents a useful illustration of how the thermosphere behaves at different altitudes, on time scales ranging from hours to complete solar cycles.
Plain Language Summary
A recently developed computer model predicts the mass density of atoms and molecules in the upper atmosphere, in the region known as the thermosphere. Changes in this “neutral density” following geomagnetic storms can perturb the orbits of the many satellites in this region, leading to imprecise knowledge of their paths and risk of collisions. This model uses measurements of the solar wind and the embedded magnetic field to predict the level of heating in the upper atmosphere, and the resulting expansion of the atmosphere to higher altitudes. In order to test the capabilities of the new model, its calculations were compared with density values derived by an Air Force data assimilation system based on radar tracking of multiple objects in Earth orbit over a 20‐year period. The results of this comparison show an excellent agreement, particularly at the higher altitudes where geomagnetic storms have the greatest influence.
Key Points
Thermosphere neutral densities from the EXospheric TEMperatures on a PoLyhedrAl gRid (EXTEMPLAR) model are compared with the SET HASDM density database for a 20‐year time period
The use of mean densities on spherical shells at several altitudes is an effective way to compare the models
The EXTEMPLAR model performs well at altitudes of 400 km and above where geomagnetic storms produce the largest changes in neutral density
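The shell‐averaging comparison described above can be sketched in a few lines, assuming each model provides density on a regular latitude/longitude grid at a fixed altitude (the 10° by 15° resolution matches the HASDM grid described in the abstract; the density values themselves are random placeholders).

```python
import numpy as np

# Area weight for a regular latitude/longitude grid: cell area on a sphere
# is proportional to cos(latitude).
lat = np.deg2rad(np.arange(-85, 86, 10))     # 10-degree latitude bin centers
lon = np.deg2rad(np.arange(0, 360, 15))      # 15-degree longitude bin centers
w = np.cos(lat)[:, None] * np.ones((1, lon.size))

# Placeholder density grids (kg/m^3) standing in for two models' outputs
# on the same spherical shell at one epoch.
rng = np.random.default_rng(2)
rho_a = 1e-12 * (1.0 + 0.1 * rng.standard_normal((lat.size, lon.size)))
rho_b = 1e-12 * (1.0 + 0.1 * rng.standard_normal((lat.size, lon.size)))

def shell_mean(rho, w):
    """Area-weighted mean density over one spherical shell."""
    return np.sum(rho * w) / np.sum(w)

ratio = shell_mean(rho_a, w) / shell_mean(rho_b, w)
```

Repeating this at each epoch and altitude gives the time series of shell totals that the paper compares, and the area weighting is what makes grids of different resolutions comparable.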
The Mg II index: A proxy for solar EUV. Viereck, Rodney; Puga, Lawrence; McMullin, Donald. Geophysical Research Letters, 1 April 2001, Volume 28, Issue 7. Journal Article, Peer‐reviewed, Open Access.
This paper shows that the Mg II core‐to‐wing ratio is a better proxy for solar Extreme Ultraviolet (EUV) radiation between 25 and 35 nm than is the F10.7 index. The He II 30.4 nm solar emission, by itself, is an important source of energy for the upper atmosphere. We will compare the NOAA Mg II Index and the F10.7 Index to the He II 30.4 nm data taken with the CELIAS/Solar EUV Monitor (SEM) on the Solar and Heliospheric Observatory (SOHO).
Ionizing radiation at aircraft and commercial suborbital spaceflight altitudes is driven by space weather and is a health concern for crew and passengers. We compare the response functions of two radiation detectors that were exposed to four different ground‐based laboratory radiation fields as well as flown alongside each other on aircraft. The detectors were a tissue equivalent proportional counter (TEPC) and a Teledyne silicon micro dosimeter chip that was integrated into an Automated Radiation Measurements for Aerospace Safety Flight Module (ARMAS FM). Both detectors were flown onboard commercial and research aircraft. In addition, both detectors were exposed to neutrons at the Los Alamos Neutron Science Center, protons at Loma Linda University Medical Center, 56Fe particles at the NASA Space Radiation Laboratory, and a gamma radiation source at Lawrence Livermore National Laboratory. The response of each of these instruments as well as derived dosimetric quantities are compared for each radiation exposure, and the ratio for converting ARMAS absorbed dose in silicon to an estimated absorbed dose in tissue is obtained. This process resulted in the first definitive calibration of a silicon‐based detector like ARMAS to a TEPC. In particular, with seven flights of both instruments together, the ARMAS‐derived dose in tissue was then validated with the TEPC‐measured dose in tissue and these results are reported. This work provides a method for significantly improving the accuracy of radiation measurements relevant to human tissue safety using a silicon detector that is easy to deploy and can report data in real time.
Key Points
Measurements from a radiation detector based on the Teledyne uDOS001 were cross‐calibrated to measurements from a TEPC microdosimeter
Results from parallel exposure of these instruments in 4 different radiation fields were utilized to create a new cross‐calibration method
Excellent agreement on seven airline flight measurements was found between the TEPC and the Teledyne uDOS001 using the new method
The Nowcast of Atmospheric Ionizing Radiation for Aviation Safety climatological model and the Automated Radiation Measurements for Aerospace Safety (ARMAS) statistical database are presented as polynomial fit equations. Using equations based on altitude, L shell, and geomagnetic conditions, an effective dose rate for any location in a galactic cosmic ray (GCR) environment can be calculated. A subset of the ARMAS database is represented by a second polynomial fit equation for the GCR plus probable relativistic energetic particle (REP; Van Allen belt REP) effective dose rates within a narrow band of L shells with altitudinal and geomagnetic dependency. Solar energetic particle events are not considered in this study since our databases do not contain these events. This work supports a suggestion that there may be a REP contribution having an effect at aviation altitudes. The ARMAS database is rich in Western Hemisphere observations for L shells between 1.5 and 5; there have been many cases of enhanced radiation events possibly related to effects from radiation belt particles. Our work identifies that the combined effects of an enhanced radiation environment in this L shell range are typically 15% higher than the GCR background. We also identify applications for the equations representing the Nowcast of Atmospheric Ionizing Radiation for Aviation Safety and ARMAS databases. They include (i) effective dose rate climatology in comparison with measured weather variability and (ii) climatological and statistical weather nowcasting and forecasting. These databases may especially help predict the radiation environment for regional air traffic management, for airport overflight operations, and for air carrier route operations of individual aircraft.
Plain Language Summary
Analytical functions have been constructed that represent the Nowcast of Atmospheric Ionizing Radiation for Aviation Safety model and Automated Radiation Measurements for Aerospace Safety measurement databases of the radiation environment at commercial aviation altitudes; the functions enable global climatological and statistical weather nowcasting and forecasting of aviation radiation hazards during quiet to active geomagnetic conditions.
Key Points
Analytical functions have been constructed that represent the NAIRAS model and ARMAS measurement databases of the radiation environment
These functions enable global climatological and statistical weather forecasting of aviation radiation hazards
These functions represent the GCR radiation component and the GCR plus probable REP radiation components
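Evaluating a polynomial fit of this kind is straightforward once the coefficients are tabulated. The coefficient values, scaling, and truncation below are illustrative placeholders, not the published NAIRAS/ARMAS fit values, and geomagnetic dependence is omitted for brevity.

```python
# Hypothetical sketch of a polynomial-fit dose-rate surface in altitude (km)
# and L shell. Keys are (altitude power, L-shell power); the coefficient
# values are made up for illustration (nominal units: uSv/hr).
coeffs = {
    (0, 0): 0.5,    # constant term
    (1, 0): 0.02,   # linear in scaled altitude
    (0, 1): 0.3,    # linear in L shell
    (1, 1): 0.05,   # altitude / L-shell cross term
    (0, 2): -0.02,  # quadratic in L shell
}

def dose_rate(alt_km, L):
    """Evaluate the polynomial fit at one altitude and L-shell value."""
    x = alt_km / 1000.0   # scale altitude so coefficients are well-conditioned
    return sum(c * x**i * L**j for (i, j), c in coeffs.items())

# Effective dose rate at a typical cruise altitude and mid-latitude L shell.
rate = dose_rate(11.0, 3.0)
```

A nowcasting application would simply evaluate such a function along a flight path, substituting the appropriate coefficient set for the prevailing geomagnetic conditions.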
Hubble Space Telescope (HST) Wide‐Field Planetary Camera 2 (WFPC 2) images of Jupiter's aurora have been obtained close in time with Galileo ultraviolet spectrometer (UVS) spectra and in situ particles, fields, and plasma wave measurements between June 1996 and July 1997, overlapping Galileo orbits G1, G2, G7, G8, and C9. This paper presents HST images of Jupiter's aurora as a first step toward a comparative analysis of the auroral images with the in situ Galileo data. The WFPC 2 images appear similar to earlier auroral images, with the main ovals at similar locations to those observed over the preceding 2 years, and rapidly variable emissions poleward of the main ovals. Further examples have been observed of the equatorward surge of the auroral oval over 140–180° longitude as this region moves from local morning to afternoon. Comparison of the WFPC 2 reference auroral ovals north and south with the VIP4 planetary magnetic field model suggests that the main ovals map along magnetic field lines exceeding 15 RJ, and that the Io footprint locations have lead angles of 0–10° from the instantaneous magnetic projection. There was an apparent dawn auroral storm on June 23, 1996, and projections of the three dawn storms imaged with HST to date demonstrate that these appear consistently along the WFPC 2 reference oval. Auroral emissions have been consistently observed from Io's magnetic footprints on Jupiter. Possible systematic variations in brightness are explored, within the observed factor‐of‐6 variations in brightness with time. Images are also presented marked with expected locations of any auroral footprints associated with the satellites Europa and Ganymede, with localized emissions observed at some times but not at other times.