Context.
Differentiating between a true exoplanet signal and residual speckle noise is a key challenge in high-contrast imaging (HCI). Speckles result from a combination of fast, slow, and static wavefront aberrations introduced by atmospheric turbulence and instrument optics. While wavefront control techniques developed over the last decade have shown promise in minimizing fast atmospheric residuals, slow and static aberrations such as non-common path aberrations (NCPAs) remain a key limiting factor for exoplanet detection. NCPAs are not seen by the wavefront sensor (WFS) of the adaptive optics (AO) loop, which is why they are difficult to correct.
Aims.
We propose to improve the identification and rejection of slow and static speckles in AO-corrected images. The algorithm, known as Direct Reinforcement Wavefront Heuristic Optimisation (DrWHO), frequently compensates static and quasi-static aberrations (including NCPAs) to boost image contrast. It is applicable to general-purpose AO systems as well as HCI systems.
Methods.
By changing the WFS reference at every iteration of the algorithm (every few tens of seconds), DrWHO shifts the convergence point of the AO loop so that it also compensates for static and slow aberrations. References are calculated using an iterative lucky-imaging approach, in which each iteration updates the WFS reference, ultimately favoring high-quality focal-plane images, as sketched below.
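To make the iteration concrete, here is a minimal Python sketch of one DrWHO update, assuming synchronized batches of focal-plane frames and WFS measurements are available as NumPy arrays. The function name, the peak-flux scoring proxy, and the 10% selection fraction are illustrative assumptions, not the instrument code.

```python
import numpy as np

def drwho_iteration(focal_frames, wfs_frames, keep_fraction=0.1):
    """One DrWHO iteration (illustrative sketch, not the SCExAO code).

    focal_frames : (N, H, W) array of short-exposure focal-plane images
    wfs_frames   : (N, P) array of synchronized WFS measurements
    keep_fraction: fraction of "lucky" frames retained (assumed value)
    """
    # Score each focal-plane frame; peak flux is used here as a simple
    # image-quality proxy (the paper uses a flux concentration metric).
    scores = focal_frames.max(axis=(1, 2))

    # Lucky imaging: keep only the best frames of the batch.
    n_keep = max(1, int(keep_fraction * len(scores)))
    best = np.argsort(scores)[-n_keep:]

    # Average the WFS measurements associated with the lucky frames;
    # this average becomes the new WFS reference for the next iteration,
    # pulling the AO loop toward states that produce better images.
    new_reference = wfs_frames[best].mean(axis=0)
    return new_reference
```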
Results.
We validated this concept through both numerical simulations and on-sky testing on the SCExAO instrument at the 8.2-m Subaru Telescope. Simulations show rapid convergence towards the correction of 82% of the NCPAs. On-sky tests were performed over a 10-min run in the visible (750 nm). We introduced a flux concentration (FC) metric to quantify point spread function (PSF) quality and measured a 15.7% improvement compared to the pre-DrWHO image.
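One plausible form of such a flux concentration metric is sketched below: the fraction of total flux inside a small aperture centred on the PSF peak. The exact FC definition used on SCExAO may differ, and the `radius` parameter is an assumption.

```python
import numpy as np

def flux_concentration(image, radius=5):
    """Illustrative PSF-quality metric: fraction of the total flux that
    falls inside a small circular aperture around the brightest pixel.
    Higher values indicate a tighter, better-corrected PSF."""
    image = np.asarray(image, dtype=float)
    # Locate the PSF core at the brightest pixel.
    cy, cx = np.unravel_index(np.argmax(image), image.shape)
    # Build a circular aperture mask of the assumed radius (in pixels).
    y, x = np.indices(image.shape)
    aperture = (y - cy) ** 2 + (x - cx) ** 2 <= radius ** 2
    return image[aperture].sum() / image.sum()
```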
Conclusions.
The DrWHO algorithm is a robust focal-plane wavefront sensing calibration method that has been successfully demonstrated on-sky. It does not rely on a model and does not require wavefront sensor calibration or linearity. It is compatible with different wavefront control methods, and can be further optimized for speed and efficiency. The algorithm is ready to be incorporated in scientific observations, enabling better PSF quality and stability during observations.
The first on-sky results obtained by CANARY, the multi-object adaptive optics (MOAO) demonstrator, are analysed. The data were recorded at the William Herschel Telescope at the end of September 2010. We describe the command and calibration algorithms used during the run and present the observing conditions. The processed data are slope buffers recorded with the MOAO loop engaged or disengaged, comprising the synchronised measurements of the four natural guide star (NGS) wavefront sensors running in parallel, together with near-infrared (IR) images. We describe the method we use to establish the error budget of CANARY. We are able to evaluate the tomographic and open-loop errors, with median values of around 216 nm and 110 nm respectively. In addition, we identify an unexpected residual quasi-static field aberration term with a mean value of 110 nm. We present the detailed error budget analysed for three sets of data on three different asterisms. We compare the experimental budgets with numerically simulated ones and demonstrate good agreement. We also find good agreement between the error budget computed from the slope buffers and the Strehl ratio measured on the IR images, ranging between 10% and 20% at 1530 nm. These results make us confident in our ability to establish the error budget of future MOAO instruments.
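As an illustration of how a tomographic error term can be extracted from such slope buffers, the sketch below compares the on-axis truth-sensor slopes with the tomographic prediction formed from the off-axis NGS measurements. The array shapes, the `slopes_to_nm` conversion factor, and the function name are assumptions for the sketch, not the CANARY pipeline.

```python
import numpy as np

def tomographic_error_nm(truth_slopes, offaxis_slopes, reconstructor,
                         slopes_to_nm):
    """Illustrative tomographic-error estimate from synchronized slope
    buffers: RMS of the difference between on-axis (truth) slopes and
    the tomographic prediction from the off-axis NGS sensors.

    truth_slopes   : (T, S) on-axis truth-sensor slopes over T frames
    offaxis_slopes : (T, 3*S) concatenated off-axis NGS slopes
    reconstructor  : (S, 3*S) tomographic reconstruction matrix
    slopes_to_nm   : assumed scalar converting slope-space RMS to nm
    """
    # Predict the on-axis slopes from the off-axis measurements.
    predicted = offaxis_slopes @ reconstructor.T
    # The residual, converted to nm of wavefront, is the tomographic
    # error term of the budget.
    residual = truth_slopes - predicted
    return slopes_to_nm * np.sqrt(np.mean(residual ** 2))
```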
Aims. We characterise the properties of stars, dust, and gas, and their spatial distribution, in the central region of the Seyfert 2 galaxy NGC 1068. Methods. Our study is based on near-infrared (YJH, 0.95−1.65 μm, R = 350) long-slit spectroscopy observations of the central region of NGC 1068 at a 0.4″ spatial resolution. We decomposed the observed continuum emission into three components: hot dust, stars, and scattered light from the central engine. We measured their contributions at various distances from the nucleus. We also measured fluxes and Doppler shifts for the emission lines in our spectrum to probe the physical conditions of the narrow-line region. Results. Dust and stars are the main sources of continuum emission, but scattered light from the central engine was also detected in the very central region. Together, these three components reproduce the observed continuum well. The dust emission is compatible with an 830 K blackbody. It was only detected in the very central region and is not spatially resolved. The stellar content is ubiquitous. It harbours a 250 pc cusp centred on the nucleus, superimposed on a young stellar background. The spectrum of the cusp is consistent with a 120 Myr old single stellar population. Finally, the emission lines exhibit a significant Doppler shift that is consistent with a radial outflow from the nucleus in a biconical structure. The Fe II behaviour strongly differs from that of the other lines, indicating that it arises from a different structure.
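The three-component continuum decomposition described above can be illustrated with a short non-negative least-squares fit, assuming the components are a blackbody (hot dust), a stellar template, and a scattered-light template sampled on a common wavelength grid. The template inputs, the fixed 830 K dust temperature, and the function names are assumptions for the sketch.

```python
import numpy as np
from scipy.optimize import nnls

def blackbody(wavelength_um, T):
    """Planck function B_lambda in arbitrary units."""
    h, c, k = 6.626e-34, 2.998e8, 1.381e-23
    lam = wavelength_um * 1e-6
    return 1.0 / (lam ** 5 * (np.exp(h * c / (lam * k * T)) - 1.0))

def decompose_continuum(wavelength_um, observed, stellar_template,
                        scattered_template, T_dust=830.0):
    """Decompose an observed continuum into hot dust (blackbody),
    stars, and scattered AGN light via non-negative least squares.
    Templates are assumed sampled on the same wavelength grid."""
    components = np.column_stack([
        blackbody(wavelength_um, T_dust),
        np.asarray(stellar_template, dtype=float),
        np.asarray(scattered_template, dtype=float),
    ])
    # Normalize each column so the fitted weights are comparable.
    components /= components.max(axis=0)
    weights, _ = nnls(components, observed)
    return weights  # relative contribution of each component
```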
Adaptive optics provides real-time correction of wavefront disturbances on ground-based telescopes. Optimizing control and performance is a key issue for ever more demanding instruments on ever larger telescopes, which are affected not only by atmospheric turbulence but also by vibrations, wind shake, and tracking errors. Linear quadratic Gaussian (LQG) control achieves optimal correction when provided with a temporal model of the disturbance. We present in this paper the first on-sky results of Kalman filter based LQG control with vibration mitigation, obtained on the CANARY instrument at the Nasmyth platform of the 4.2-m William Herschel Telescope. The results demonstrate a clear performance improvement for full LQG compared with standard integrator control, and assess the additional improvement brought by vibration filtering with a tip-tilt model identified from on-sky data, thus validating the strategy retained for the SPHERE instrument at the VLT.
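The heart of such LQG control is a discrete Kalman filter driven by a temporal model of the disturbance. The sketch below shows one predict/update step and a standard AR(2) model of a tip-tilt vibration; the frequency, damping, and loop-rate values are placeholders, not the parameters identified on CANARY.

```python
import numpy as np

def kalman_predict_update(x, P, y, A, C, Q, R):
    """One step of a discrete Kalman filter (the core of LQG control).

    x, P : state estimate and covariance
    y    : new (tip or tilt) WFS measurement
    A, C : state-transition and observation matrices
    Q, R : process and measurement noise covariances
    """
    # Predict the state forward with the disturbance model.
    x = A @ x
    P = A @ P @ A.T + Q
    # Update the prediction with the new measurement.
    S = C @ P @ C.T + R
    K = P @ C.T @ np.linalg.inv(S)
    x = x + K @ (y - C @ x)
    P = (np.eye(len(x)) - K @ C) @ P
    return x, P

# A vibration of frequency f and damping k can be modeled as an AR(2)
# process, x_t = a1*x_{t-1} + a2*x_{t-2} + noise, with
# a1 = 2*k*cos(2*pi*f*dt) and a2 = -k**2 (companion form below).
f, k, dt = 30.0, 0.995, 1.0 / 150.0   # assumed vibration and loop rate
a1, a2 = 2 * k * np.cos(2 * np.pi * f * dt), -k ** 2
A = np.array([[a1, a2], [1.0, 0.0]])  # state transition
C = np.array([[1.0, 0.0]])            # WFS observes the current state
```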
Context. A new and challenging adaptive optics (AO) system, called multi-object adaptive optics (MOAO), has been successfully demonstrated on-sky for the first time at the 4.2 m William Herschel Telescope, Canary Islands, Spain, at the end of September 2010. Aims. This system, called CANARY, is aimed at demonstrating the feasibility of MOAO in preparation for a future multi-object near-infrared (IR) integral field unit spectrograph to equip extremely large telescopes for analysing the morphology and dynamics of high-z galaxies. Methods. CANARY compensates for atmospheric turbulence with a deformable mirror driven in open loop and controlled through a tomographic reconstruction by three widely separated off-axis natural guide star (NGS) wavefront sensors, which also operate in open loop. We compared the performance of conventional closed-loop AO, MOAO, and ground-layer adaptive optics (GLAO) by analysing both IR images and simultaneous wavefront measurements. Results. In the H band, Strehl ratios of 0.20 are measured with MOAO, compared with 0.25 for closed-loop AO in fairly similar seeing conditions (r0 ≈ 15 cm at 0.5 μm). As expected, MOAO performs at a level intermediate between GLAO and closed-loop AO.
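One way such a tomographic reconstructor can be built from synchronized slope buffers, in the spirit of the learn-and-apply approach used on CANARY, is a minimum mean square error estimator formed from measured slope covariances. The sketch below is an assumption-laden illustration; the regularization constant, array shapes, and function name are placeholders.

```python
import numpy as np

def learn_tomographic_reconstructor(onaxis_slopes, offaxis_slopes,
                                    noise_reg=1e-3):
    """Illustrative MMSE tomographic reconstructor learned from
    synchronized slope buffers: R = C_on,off @ inv(C_off,off).

    onaxis_slopes  : (T, S) slopes measured toward the science direction
    offaxis_slopes : (T, M) concatenated off-axis NGS slopes
    noise_reg      : assumed Tikhonov regularization of the covariance
    """
    T = len(onaxis_slopes)
    # Empirical cross- and auto-covariances of the slope measurements.
    c_on_off = onaxis_slopes.T @ offaxis_slopes / T
    c_off_off = offaxis_slopes.T @ offaxis_slopes / T
    c_off_off += noise_reg * np.eye(c_off_off.shape[0])
    return c_on_off @ np.linalg.inv(c_off_off)

# In open loop, DM commands then follow from the predicted on-axis
# slopes: commands = command_matrix @ (R @ offaxis_measurements).
```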
High-performance computing (HPC) achieved an astonishing three orders of magnitude of performance improvement per decade for three decades, thanks to hardware technology scaling that yielded an exponential improvement in the rate of floating-point executions, though this has slowed in the most recent decade. Captured in the Top500 list, this hardware evolution cascaded through the software stack, triggering changes at all levels, including the redesign of numerical linear algebra libraries. HPC simulations on massively parallel systems are often driven by matrix computations, whose rate of execution depends on their floating-point precision. We highlight the implications for HPC applications of mixed-precision (MP) computations, referred to by Jack Dongarra, the 2021 ACM A.M. Turing Award Laureate, as "responsibly reckless" matrix algorithms. Introduced 75 years ago, long before the advent of HPC architectures, MP numerical methods turn out to be paramount for increasing the throughput of traditional and artificial intelligence (AI) workloads beyond riding the wave of the hardware alone. Reducing precision trades away some accuracy for performance (the reckless behavior), but only in noncritical segments of the workflow (the responsible behavior), so that the accuracy requirements of the application can still be satisfied. MP methods offer a valuable performance/accuracy knob and, just as in AI, they are now indispensable in the pursuit of knowledge and discovery in simulations. In particular, we illustrate the MP impact on three representative HPC applications related to seismic imaging, climate/environment geospatial predictions, and computational astronomy.
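A classic example of "responsibly reckless" mixed precision is iterative refinement for linear systems: factorize once in float32 for speed, then refine the solution with float64 residuals until double-precision accuracy requirements are met. The sketch below uses SciPy's LU routines; the matrix size and iteration count are arbitrary choices for the demonstration.

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

def mixed_precision_solve(A, b, iters=5):
    """Mixed-precision iterative refinement: the expensive factorization
    runs in float32 (the "reckless" part), while residuals and updates
    are accumulated in float64 (the "responsible" part)."""
    # One cheap low-precision LU factorization of A.
    lu, piv = lu_factor(A.astype(np.float32))
    # Initial low-precision solve, promoted to float64.
    x = lu_solve((lu, piv), b.astype(np.float32)).astype(np.float64)
    for _ in range(iters):
        r = b - A @ x                        # residual in float64
        dx = lu_solve((lu, piv), r.astype(np.float32))
        x += dx.astype(np.float64)           # correction in float64
    return x

# Usage: solve a random well-conditioned system and check the error.
A = np.random.default_rng(0).standard_normal((500, 500))
b = A @ np.ones(500)
x = mixed_precision_solve(A, b)
print(np.linalg.norm(x - 1.0))  # should be near double-precision level
```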