The early classification of university students according to their potential academic performance can be a useful strategy to mitigate failure, to promote the achievement of better results, and to better manage resources in higher education institutions. This paper proposes a two-stage model, supported by data mining techniques, that uses the information available at the end of the first year of students' academic path to predict their overall academic performance. Unlike most literature on educational data mining, academic success is inferred from both the average grade achieved and the time taken to conclude the degree. Furthermore, this study proposes to segment students based on the dichotomy between the evidence of failure or high performance at the beginning of the degree program and the performance levels predicted by the model. A data set of 2459 students, spanning the years 2003 to 2015, from a European engineering school of a public research university, is used to validate the proposed methodology. The empirical results demonstrate the ability of the proposed model to predict the students' performance level with an accuracy above 95% at an early stage of the students' academic path. Random forests are found to be superior to the other classification techniques considered (decision trees, support vector machines, naive Bayes, bagged trees and boosted trees). Together with the prediction model, the suggested segmentation framework represents a useful tool to delineate the optimal strategies to apply in order to promote higher performance levels and mitigate academic failure, overall increasing the quality of the academic experience provided by a higher education institution.
• We propose a two-stage model for early prediction of students' overall academic success.
• We introduce a new academic performance metric and use single and ensemble data mining techniques.
• We introduce a student segmentation approach based on evidence and predictions.
• A data set of about 2500 students is used to validate the proposed methodology.
• The proposed model shows high predictive power (accuracy above 95%).
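The paper does not include source code; the sketch below only illustrates what the second-stage classification step could look like with a random forest, the best-performing technique reported above. The file name, feature columns and target label are hypothetical placeholders, not the study's actual variables.

```python
# Minimal sketch of a second-stage classifier (not the authors' code).
# Feature and target names are hypothetical; the paper uses records
# available at the end of the students' first year.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

df = pd.read_csv("students_first_year.csv")  # hypothetical file
X = df[["first_year_avg_grade", "credits_completed", "failed_courses"]]
y = df["performance_level"]  # label combining final grade and time to degree

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0, stratify=y
)
clf = RandomForestClassifier(n_estimators=500, random_state=0)
clf.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```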
The estimation of atmospheric turbulence parameters is of relevance for the following: (a) site evaluation and characterization; (b) prediction of the point spread function; (c) live assessment of error budgets and optimization of adaptive optics performance; (d) optimization of fringe trackers for long baseline optical interferometry. The ubiquitous deployment of Shack-Hartmann wavefront sensors in large telescopes makes them central for atmospheric turbulence parameter estimation via adaptive optics telemetry. Several methods for the estimation of the Fried parameter and outer scale have been developed, most of which are based on the fitting of Zernike polynomial coefficient variances reconstructed from the telemetry. The non-orthogonality of Zernike polynomial derivatives introduces modal cross coupling, which affects the variances. Furthermore, the finite resolution of the sensor introduces aliasing. In this article the impact of these effects on atmospheric turbulence parameter estimation is addressed with simulations. It is found that cross-coupling is the dominant bias. An iterative algorithm to overcome it is presented. Simulations are conducted for typical ranges of the outer scale (4-32 m), Fried parameter (10 cm) and noise in the variances (signal-to-noise ratio of 10 and above). It is found that, using the algorithm, both parameters are recovered with sub-per cent accuracy.
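The article's iterative algorithm itself is not reproduced here; the sketch below only illustrates the general fixed-point idea of alternating between removing a model of the cross-coupling bias and refitting the parameters. The functions model_variances and coupling_bias are assumed inputs (e.g. tabulated from simulation) and are not defined here.

```python
# Sketch of an iterative de-biasing loop (not the article's exact algorithm).
# model_variances(r0, L0) -> ideal Zernike variances for given parameters;
# coupling_bias(r0, L0)   -> bias added by derivative cross-coupling.
# Both are assumed to be supplied, e.g. from simulation.
import numpy as np
from scipy.optimize import curve_fit

def fit_turbulence(measured, model_variances, coupling_bias,
                   p0=(0.10, 16.0), n_iter=10, tol=1e-4):
    """Estimate (r0, L0) from measured variances, removing coupling bias."""
    r0, L0 = p0
    modes = np.arange(measured.size)  # dummy abscissa for curve_fit
    for _ in range(n_iter):
        corrected = measured - coupling_bias(r0, L0)
        (r0_new, L0_new), _ = curve_fit(
            lambda _m, r, L: model_variances(r, L), modes, corrected,
            p0=(r0, L0))
        converged = (abs(r0_new - r0) < tol * r0 and
                     abs(L0_new - L0) < tol * L0)
        r0, L0 = r0_new, L0_new
        if converged:
            break
    return r0, L0
```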
Abstract
Shack–Hartmann wavefront sensing relies on accurate spot centre measurement. Several algorithms have been developed with this aim, mostly focused on precision, i.e. minimizing random errors. In the solar and extended-scene community, the importance of the accuracy (bias error due to peak-locking, quantization, or sampling) of the centroid determination was identified and solutions were proposed. But these solutions only allow partial bias corrections. To date, no systematic study of the bias error has been conducted. This article bridges the gap by quantifying the bias error for different correlation peak-finding algorithms and types of sub-aperture images, and by proposing a practical solution to minimize its effects. Four classes of sub-aperture images (point source, elongated laser guide star, crowded field, and solar extended scene) together with five types of peak-finding algorithms (1D parabola, centre of gravity, Gaussian, 2D quadratic polynomial, and pyramid) are considered, in a variety of signal-to-noise conditions. The best-performing peak-finding algorithm depends on the sub-aperture image type, but none is satisfactory with respect to both bias and random errors. A practical solution is proposed that relies on the antisymmetric response of the bias to the sub-pixel position of the true centre. The solution decreases the bias by a factor of ∼7, to values of ≲ 0.02 pix. The computational cost is typically twice that of current cross-correlation algorithms.
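For illustration, the sketch below implements two of the peak-finding approaches named above, the centre of gravity and the 1D parabola interpolation, applied to a 2D correlation map. It is a generic textbook version, not the article's implementation, and it assumes the peak does not sit on the map border.

```python
# Two generic peak-finding approaches on a correlation map (illustrative,
# not the article's code). The peak is assumed to be an interior pixel.
import numpy as np

def centre_of_gravity(img):
    """Intensity-weighted centroid (x, y) of a 2D map."""
    y, x = np.indices(img.shape)
    total = img.sum()
    return (x * img).sum() / total, (y * img).sum() / total

def parabola_peak(img):
    """Sub-pixel peak via separate 1D parabola fits through the maximum."""
    j, i = np.unravel_index(np.argmax(img), img.shape)
    # 1D parabola vertex: offset = 0.5*(f[-1]-f[+1]) / (f[-1]-2*f[0]+f[+1])
    dx = 0.5 * (img[j, i-1] - img[j, i+1]) \
        / (img[j, i-1] - 2*img[j, i] + img[j, i+1])
    dy = 0.5 * (img[j-1, i] - img[j+1, i]) \
        / (img[j-1, i] - 2*img[j, i] + img[j+1, i])
    return i + dx, j + dy
```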
Scalar field effects on the orbit of S2 star
Amorim, A; Bauböck, M; Benisty, M ...
Monthly Notices of the Royal Astronomical Society, 11/2019, Volume 489, Issue 4
Journal Article
Peer reviewed
Open access
Abstract
Precise measurements of the S-stars orbiting Sgr A* have set strong constraints on the nature of the compact object at the centre of the Milky Way. The presence of a black hole in that region is well established, but the nature of its neighbouring environment is still an open question. In that respect, the existence of dark matter in that central region may be detectable through its strong signatures on the orbits of stars: the main effect is a Newtonian precession which will affect the overall pericentre shift of S2, the latter being a target measurement of the GRAVITY instrument. The exact nature of this dark matter (e.g. stellar dark remnants or diffuse dark matter) is unknown. This article assumes it to be a scalar field of toroidal distribution, associated with ultralight dark matter particles, surrounding the Kerr black hole. Such a field is a form of ‘hair’ expected in the context of superradiance, a mechanism that extracts rotational energy from the black hole. Orbital signatures for the S2 star are computed and shown to be detectable by GRAVITY. The scalar field can be constrained because the variation of orbital elements depends both on the relative mass of the scalar field to the black hole and on the field mass coupling parameter.
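For orientation, the dominant relativistic contribution to the S2 pericentre shift targeted by GRAVITY is the first-order Schwarzschild advance per orbit, Δφ = 6πGM / (c²a(1−e²)). The quick evaluation below uses approximate, illustrative values for the Sgr A* mass and the S2 orbital elements; the scalar-field and dark-matter effects discussed in the abstract are additional perturbations on top of this baseline.

```python
# Back-of-envelope Schwarzschild pericentre advance for an S2-like orbit.
# Orbital elements are approximate round numbers, for illustration only.
import math

G = 6.674e-11          # m^3 kg^-1 s^-2
c = 2.998e8            # m/s
M = 4.0e6 * 1.989e30   # ~4 million solar masses (Sgr A*)
a = 1000 * 1.496e11    # semi-major axis ~1000 au, in metres
e = 0.88               # S2 eccentricity (approx.)

dphi = 6 * math.pi * G * M / (c**2 * a * (1 - e**2))  # radians per orbit
print(f"pericentre advance: {math.degrees(dphi) * 60:.1f} arcmin per orbit")
# prints roughly 11 arcmin per orbit, consistent with the measured value
```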
Assessing the quality of aperture synthesis maps is relevant for benchmarking image reconstruction algorithms, for the scientific exploitation of data from optical long-baseline interferometers, and for the design or upgrade of new and existing interferometric imaging facilities. Although metrics have been proposed in these contexts, no systematic study has been conducted on the selection of a robust metric for quality assessment. This article addresses the question: what is the best metric to assess the quality of a reconstructed image? It starts by considering several metrics and selecting a few based on general properties. Then, a variety of image reconstruction cases are considered. The observational scenarios are phase closure and phase referencing at the Very Large Telescope Interferometer (VLTI), for combinations of two, three, four and six telescopes. End-to-end image reconstruction is accomplished with the MiRA software, and several merit functions are put to the test. It is found that convolution by an effective point spread function is required for proper image quality assessment. The effective angular resolution of the images is superior to the naive expectation based on the maximum frequency sampled by the array; this is due to the prior information used in the aperture synthesis algorithm and to the nature of the objects considered. The l1-norm is the most robust of all the metrics considered because, being linear, it is less sensitive to image smoothing by high regularization levels. For the cases considered, this metric allows the implementation of automatic quality assessment of reconstructed images, with a performance similar to human selection.
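As an illustration of the winning metric as described above (the l1-norm of the residual between the reconstruction and the truth image convolved with an effective PSF), a minimal sketch follows. Flux normalization and image registration are simplified here, and the function name is an invention for this example, not the article's.

```python
# Sketch of the image-quality metric described above: convolve the truth
# image with an effective PSF before taking the l1-norm of the residual.
# (Illustrative; registration is assumed and flux matching is simplified.)
import numpy as np
from scipy.signal import fftconvolve

def l1_metric(reconstructed, truth, effective_psf):
    """l1 distance between a reconstruction and the PSF-convolved truth."""
    reference = fftconvolve(truth, effective_psf, mode="same")
    reference *= reconstructed.sum() / reference.sum()  # match total flux
    return np.abs(reconstructed - reference).sum()
```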
The shortage of petroleum reserves and the increase in CO2 emissions have raised global concerns and highlighted the importance of adopting sustainable energy sources. Second-generation ethanol made from lignocellulosic materials is considered to be one of the most promising fuels for vehicles. The giant snail Achatina fulica is an agricultural pest whose biotechnological potential has remained largely unexplored. Here, the composition of the microbial population within the crop of this invasive land snail, as well as key genes involved in various biochemical pathways, have been explored for the first time. In a high-throughput approach, 318 Mbp of 454-Titanium shotgun metagenomic sequencing data were obtained. The predominant bacterial phylum found was Proteobacteria, followed by Bacteroidetes and Firmicutes. Viruses, Fungi, and Archaea were present to lesser extents. The functional analysis reveals a variety of microbial genes that could assist the host in the degradation of recalcitrant lignocellulose, detoxification of xenobiotics, and synthesis of essential amino acids and vitamins, contributing to the adaptability and wide-ranging diet of this snail. More than 2,700 genes encoding glycoside hydrolase (GH) domains and carbohydrate-binding modules were detected. When we compared GH profiles, we found an abundance of sequences coding for oligosaccharide-degrading enzymes (36%), very similar to those from wallabies and giant pandas, as well as many novel cellulase and hemicellulase coding sequences, which points to this model as a remarkable potential source of enzymes for the biofuel industry. Furthermore, this work is a major step toward the understanding of the unique genetic profile of the land snail holobiont.
Point source detection algorithms play a pivotal role across diverse applications, influencing fields such as astronomy, biomedical imaging, environmental monitoring, and beyond. This article reviews the algorithms used for space imaging applications from ground and space telescopes. The main difficulties in detection arise from the incomplete knowledge of the impulse function of the imaging system, which depends on the aperture, atmospheric turbulence (for ground-based telescopes), and other factors, some of which are time-dependent. Incomplete knowledge of the impulse function decreases the effectiveness of the algorithms. In recent years, deep learning techniques have been employed to mitigate this problem and have the potential to outperform more traditional approaches. The success of deep learning techniques in object detection has been observed in many fields, and recent developments can further improve the accuracy. However, deep learning methods are still in the early stages of adoption and are used less frequently than traditional approaches. In this review, we discuss the main challenges of point source detection, as well as the latest developments, covering both traditional and current deep learning methods. In addition, we present a comparison between the two approaches to better demonstrate the advantages of each methodology.
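As a concrete example of the traditional approaches that such reviews contrast with deep learning, the sketch below implements a simple matched-filter detector: cross-correlate the image with the (assumed known) PSF and keep local maxima above a noise threshold. This is a generic textbook baseline, not an algorithm taken from the article, and the global-statistics threshold is a deliberate simplification.

```python
# Minimal matched-filter point source detection (a generic traditional
# baseline; assumes the PSF is approximately known and noise is uniform).
import numpy as np
from scipy.signal import fftconvolve
from scipy.ndimage import maximum_filter

def detect_sources(image, psf, n_sigma=5.0):
    """Return (row, col) of local maxima above n_sigma in the filtered map."""
    template = psf[::-1, ::-1]  # correlation = convolution with flipped PSF
    filtered = fftconvolve(image, template, mode="same")
    threshold = filtered.mean() + n_sigma * filtered.std()
    peaks = (filtered == maximum_filter(filtered, size=5)) \
        & (filtered > threshold)
    return np.argwhere(peaks)
```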
During the last two decades, the first generation of beam combiners at the Very Large Telescope Interferometer has proved the importance of optical interferometry for high-angular-resolution astrophysical studies in the near- and mid-infrared. With the advent of 4-beam combiners at the VLTI, the u−v coverage per pointing increases significantly, providing an opportunity to use reconstructed images as powerful scientific tools. Therefore, interferometric imaging is already a key feature of the new generation of VLTI instruments, as well as of other interferometric facilities like CHARA and JWST. It is thus imperative to account for the current image reconstruction capabilities and their expected evolution in the coming years. Here, we present a general overview of the current situation of optical interferometric image reconstruction, with a focus on new wavelength-dependent information, highlighting its main advantages and limitations. As an appendix, we include several cookbooks describing the usage and installation of several state-of-the-art image reconstruction packages. To illustrate the current capabilities of the software available to the community, we recovered chromatic images from simulated MATISSE data using the MCMC software SQUEEZE. With these images, we aim to show the importance of selecting good regularization functions and their impact on the reconstruction.
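Reconstruction packages such as MiRA and SQUEEZE cast imaging as the minimization of a penalized objective, f(x) = χ²(x) + μ R(x), where R is a regularization function like those whose selection is discussed above and μ sets its weight. The sketch below writes down this objective with total variation as one common choice of R; it is a generic illustration of the formulation, not the packages' internal code.

```python
# Sketch of the penalized objective minimized in aperture synthesis imaging:
#   f(x) = chi2(data | x) + mu * R(x)
# with R a regularizer such as total variation. (Generic illustration.)
import numpy as np

def total_variation(x):
    """Total variation of a 2D image: one common choice of regularizer."""
    gx = np.diff(x, axis=1)
    gy = np.diff(x, axis=0)
    return np.sqrt(gx[:-1, :]**2 + gy[:, :-1]**2).sum()

def objective(x, chi2, mu):
    """chi2: callable giving the data misfit of image x; mu: hyperparameter."""
    return chi2(x) + mu * total_variation(x)
```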
Circumstellar disks are vast expanses of dust that form around new stars in the earliest stages of their birth. Predicted by astronomers as early as the eighteenth century, they weren’t observed until the late twentieth century, when interstellar imaging technology enabled us to see nascent stars hundreds of light years away. Since then, circumstellar disks have become an area of intense study among astrophysicists, largely because they are thought to be the forerunners of planetary systems like our own—the possible birthplaces of planets. This volume brings together a team of leading experts to distill the most up-to-date knowledge of circumstellar disks into a clear introductory volume. Understanding circumstellar disks requires a broad range of scientific knowledge, including chemical processes, the properties of dust and gases, hydrodynamics and magnetohydrodynamics, radiation transfer, and stellar evolution—all of which are covered in this comprehensive work, which will be indispensable for graduate students, seasoned researchers, or even advanced undergrads setting out on the study of planetary evolution.