Context.
Multi-wavelength light curves in long-term campaigns show that, for several blazars, the radio emission occurs with a significant delay with respect to the γ-ray band, with timescales ranging from weeks to years. Such observational evidence has long been a matter of debate, and is usually interpreted as a signature of the γ-ray emission originating upstream in the jet, with the emitting region becoming radio transparent at larger scales.
Aims.
In this paper, we show, by means of self-consistent numerical modelling, that the adiabatic expansion of a relativistic blob can explain these delays, reproducing lags compatible with the observed timescales.
Methods.
We use the JetSeT framework to model the radiative and acceleration processes numerically, reproducing the temporal evolution of a single blob from the initial flaring activity through the subsequent expansion. We follow the spectral evolution and the corresponding light curves, investigating the relations among the observed parameters (rise time, delay, and decay time), and we identify their link with the physical parameters.
Results.
We find that, when adiabatic expansion is active, lags due to the shift of the synchrotron frequency occur. The corresponding time lags have an offset equal to the separation in time between the flaring onset and the beginning of the expansion, whilst the rising and decaying timescales depend on the velocity of the expansion and on the time required for the source to exhibit a synchrotron self-absorption frequency below the relevant radio spectral window. We derive an inter-band response function embedding the aforementioned parameters, and we investigate the effects of the competition between radiative and adiabatic cooling timescales on the response. We apply the response function to long-term radio and γ-ray light curves of Mrk 421, Mrk 501, and 3C 273, finding satisfactory agreement with the long-term behaviour, and we use a Markov chain Monte Carlo approach to estimate some relevant physical parameters. We discuss applications of the presented analysis to polarization measurements and to jet collimation-profile kinematics. The collimation profiles observed in radio images are in agreement with the predictions of our model.
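The kind of inter-band mapping such a response function performs can be sketched numerically: convolve a γ-ray light curve with a delayed kernel to obtain a lagged, smoothed radio light curve. The delayed-exponential kernel below and its parameter names (t0 for the onset-to-expansion offset, tau_decay for the decay scale) are illustrative assumptions, not the paper's exact response function.

```python
import numpy as np

def response_kernel(t, t0=30.0, tau_decay=100.0):
    # Delayed exponential response: zero before the lag t0,
    # exponential decay afterwards (illustrative form only)
    k = np.where(t >= t0, np.exp(-(t - t0) / tau_decay), 0.0)
    return k / k.sum()  # normalise so total fluence is conserved

t = np.arange(0.0, 1000.0, 1.0)                   # days
gamma_lc = np.exp(-0.5 * ((t - 200) / 20) ** 2)   # a single gamma-ray flare

kernel = response_kernel(t)
radio_lc = np.convolve(gamma_lc, kernel)[: t.size]

# The radio peak lags the gamma-ray peak by roughly t0
# plus a smoothing-dependent shift set by tau_decay
lag = t[np.argmax(radio_lc)] - t[np.argmax(gamma_lc)]
```

The convolution both delays and broadens the flare, which is why the radio peak is lower and later than the γ-ray one.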
Context.
The origin of the γ-ray emission of the blazar Mrk 421 is still a matter of debate.
Aims.
We used 5.5 years of unbiased observing campaign data, obtained using the FACT telescope and the Fermi-LAT detector at TeV and GeV energies (the longest and densest such dataset so far), together with contemporaneous multi-wavelength observations, to characterise the variability of Mrk 421 and to constrain the underlying physical mechanisms.
Methods.
We studied and correlated light curves obtained by ten different instruments and found two significant results.
Results.
The TeV and X-ray light curves are very well correlated, with a lag of < 0.6 days. The GeV and radio (15 GHz band) light curves are widely and strongly correlated. Variations of the GeV light curve lead those in the radio.
Conclusions.
Lepto-hadronic and purely hadronic models in the frame of shock acceleration predict proton acceleration or cooling timescales that are ruled out by the short variability timescales and delays observed in Mrk 421. Instead, the observations match the predictions of leptonic models.
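Lags such as the < 0.6 day one quoted above are typically measured with a discrete correlation function suited to unevenly sampled light curves. A bare-bones sketch on toy data (a simplified Edelson & Krolik style estimator without error weighting, not the analysis pipeline of the paper):

```python
import numpy as np

def discrete_correlation(t1, f1, t2, f2, lags, width):
    """Simplified discrete correlation function for two unevenly
    sampled light curves (no measurement-error weighting)."""
    a = (f1 - f1.mean()) / f1.std()
    b = (f2 - f2.mean()) / f2.std()
    dt = t2[None, :] - t1[:, None]     # all pairwise time differences
    prod = a[:, None] * b[None, :]     # all pairwise normalised products
    dcf = []
    for lag in lags:
        mask = np.abs(dt - lag) < width / 2
        dcf.append(prod[mask].mean() if mask.any() else 0.0)
    return np.array(dcf)

# Synthetic example: the second light curve lags the first by 5 days
rng = np.random.default_rng(0)
signal = lambda t: np.sin(2 * np.pi * t / 50.0)
t1 = np.sort(rng.uniform(0, 200, 500))
t2 = np.sort(rng.uniform(0, 200, 500))
f1 = signal(t1) + 0.1 * rng.normal(size=t1.size)
f2 = signal(t2 - 5.0) + 0.1 * rng.normal(size=t2.size)

lags = np.arange(-20.0, 21.0, 1.0)
dcf = discrete_correlation(t1, f1, t2, f2, lags, width=2.0)
best_lag = lags[np.argmax(dcf)]    # recovers a lag near +5 days
```

Binning all pairwise products by time difference is what makes the estimator usable without interpolating either light curve onto a regular grid.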
We compared convolutional neural networks to classical boosted decision trees for the separation of atmospheric particle showers generated by gamma rays from the particle-induced background. We conducted the comparison of the two techniques on simulated observation data from the Cherenkov Telescope Array. We then examined the receiver operating characteristic (ROC) curves produced by the two approaches and discussed the similarities and differences between them. We found that neural networks outperformed the classical techniques under specific conditions.
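A ROC comparison of this kind can be illustrated with a small numpy-only sketch; the two "classifiers" below are synthetic score distributions standing in for the CNN and BDT outputs, not the study's actual models.

```python
import numpy as np

def roc_points(scores, labels):
    """Build the ROC curve by sweeping the decision threshold
    from the highest score downwards (assumes no tied scores)."""
    order = np.argsort(-scores)
    sorted_labels = labels[order]
    tpr = np.cumsum(sorted_labels) / sorted_labels.sum()
    fpr = np.cumsum(1 - sorted_labels) / (1 - sorted_labels).sum()
    return np.concatenate([[0.0], fpr]), np.concatenate([[0.0], tpr])

def roc_auc(fpr, tpr):
    # Trapezoidal area under the (fpr, tpr) staircase
    return float(np.sum(np.diff(fpr) * (tpr[1:] + tpr[:-1]) / 2))

rng = np.random.default_rng(1)
labels = rng.integers(0, 2, 5000)
# Two hypothetical classifiers: one well separated, one noisier
good_scores = labels + 0.5 * rng.normal(size=labels.size)
weak_scores = labels + 2.0 * rng.normal(size=labels.size)

auc_good = roc_auc(*roc_points(good_scores, labels))
auc_weak = roc_auc(*roc_points(weak_scores, labels))
# the better-separated classifier yields the larger area under its curve
```

The area under the curve condenses each ROC curve into a single number, which is what makes head-to-head comparisons between techniques straightforward.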
ABSTRACT
By studying the variability of blazars across the electromagnetic spectrum, it is possible to resolve the underlying processes responsible for rapid flux increases, so-called flares. We report on an extremely bright X-ray flare in the high-peaked BL Lacertae object Markarian 421 (Mrk 421) that occurred simultaneously with enhanced γ-ray activity detected at very high energies by the First G-APD Cherenkov Telescope on 2019 June 9. We triggered an observation with XMM-Newton, which observed the source quasi-continuously for 25 h. We find that the source was in the brightest state ever observed with XMM-Newton, reaching a flux of 2.8 × 10⁻⁹ erg cm⁻² s⁻¹ over the 0.3–10 keV energy range. We perform a spectral and timing analysis to reveal the mechanisms of particle acceleration and to search for the shortest source-intrinsic time-scales. Mrk 421 exhibits the typical harder-when-brighter behaviour throughout the observation and shows a clockwise hysteresis pattern, which indicates that cooling dominates over the acceleration process. While the X-ray emission in different sub-bands is highly correlated, we can exclude large time lags, as the computed z-transformed discrete correlation functions are consistent with zero lag. We find rapid variability on time-scales of 1 ks in the 0.3–10 keV band and down to 300 s in the hard X-ray band (4–10 keV). Taking these time-scales into account, we discuss different models to explain the observed X-ray flare, and find that a plasmoid-dominated magnetic reconnection process describes our observation best.
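Variability time-scales like the 300 s quoted above are commonly characterised by a flux doubling time between successive measurements. A minimal sketch, under the usual assumption of exponential flux variation between consecutive samples:

```python
import numpy as np

def doubling_times(t, flux):
    """Pairwise flux-doubling time-scales assuming exponential variation:
    tau = dt * ln(2) / ln(F2 / F1) for consecutive samples."""
    dt = np.diff(t)
    ratio = flux[1:] / flux[:-1]
    with np.errstate(divide="ignore"):
        tau = dt * np.log(2.0) / np.log(ratio)
    return tau[ratio > 1.0]   # keep only rising (brightening) intervals

# Synthetic example: flux doubling every 300 s, sampled every 100 s
t = np.arange(0.0, 1500.0, 100.0)   # seconds
flux = 2.0 ** (t / 300.0)
tau = doubling_times(t, flux)
# every interval recovers the 300 s doubling time-scale
```

The shortest such tau over a light curve is a standard upper bound on the size of the emitting region via light-crossing-time arguments.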
Extended dark matter (DM) substructures may play the role of microlenses in the Milky Way and in extragalactic gravitational lens systems (GLSs). We compare microlensing effects caused by point masses (Schwarzschild lenses) and extended clumps of matter using a simple model for the lens mapping. A superposition of the point mass and the extended clump is also considered. For special choices of the parameters, this model may represent a cusped clump of cold DM, a cored clump of self-interacting dark matter (SIDM) or an ultra-compact minihalo of DM surrounding a massive point-like object. We build the resulting micro-amplification curves for various parameters of a single clump moving with respect to the source in order to estimate the differences between the light curves caused by clumps and by point lenses. The results show that it may be difficult to distinguish between these models. However, some regions of the clump parameter space can be restricted by considering high-amplification events at the present level of photometric accuracy. We then estimate the statistical properties of the amplification curves in extragalactic GLSs. For this purpose, an ensemble of amplification curves is generated, yielding the autocorrelation functions (ACFs) of the curves for different choices of the system parameters. We find that there can be a significant difference between these ACFs if the clump size is comparable with typical Einstein radii; as a rule, the contribution of clumps makes the ACFs less steep.
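The point-mass (Schwarzschild) baseline against which the extended clumps are compared has the standard magnification A(u) = (u² + 2) / (u √(u² + 4)), with u the source offset in Einstein radii. The sketch below builds one amplification curve and its normalised autocorrelation; the trajectory parameters are illustrative, not taken from the paper.

```python
import numpy as np

def point_lens_amplification(u):
    """Standard point-mass microlensing magnification A(u)."""
    return (u**2 + 2) / (u * np.sqrt(u**2 + 4))

def amplification_curve(t, t0, tE, u0):
    # Straight-line source trajectory: impact parameter u0,
    # Einstein-radius crossing time tE, closest approach at t0
    u = np.sqrt(u0**2 + ((t - t0) / tE) ** 2)
    return point_lens_amplification(u)

def autocorrelation(x):
    """ACF of a curve, normalised to 1 at zero lag."""
    x = x - x.mean()
    acf = np.correlate(x, x, mode="full")[x.size - 1 :]
    return acf / acf[0]

t = np.linspace(-100, 100, 2001)   # days
A = amplification_curve(t, t0=0.0, tE=20.0, u0=0.1)
acf = autocorrelation(A)
# peak magnification for u0 = 0.1 is close to 1/u0 = 10
```

Comparing such ACFs for point lenses and for clump models is the statistical test the abstract describes for extragalactic systems, where individual caustic crossings cannot be resolved.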
The single-mirror small-size telescope (SST-1M) is one of the three proposed designs for the small-size telescopes (SSTs) of the Cherenkov Telescope Array (CTA) project. The SST-1M will be equipped with a 4 m-diameter segmented reflector dish and an innovative fully digital camera based on silicon photo-multipliers. Since the SST sub-array will consist of up to 70 telescopes, the challenge is not only to build telescopes with excellent performance, but also to design them so that their components can be commissioned, assembled and tested by industry. In this paper we review the basic steps that led to the design concepts for the SST-1M camera and the ongoing realization of the first prototype, with focus on the innovative solutions adopted for the photodetector plane and the readout and trigger parts of the camera. In addition, we report on results of laboratory measurements on real scale elements that validate the camera design and show that it is capable of matching the CTA requirements of operating up to high moonlight background conditions.
General relativity (GR) has a solid experimental base. However, the emergence of new experimental capabilities and independent observational information stimulates continuing tests of general relativity. The purpose of this work is to evaluate the potential of gravitational microlensing of distant sources by the stars of our Galaxy to verify Einstein's formula of gravitational refraction. This effect has been repeatedly tested in the Solar System in high-accuracy experiments with the propagation of radio waves, where the measurements are most effective for distances from the signal trajectory to the Sun of the order of several solar radii. In the case of galactic microlensing, a quite different type of observational data and other characteristic distances are used, determined in high-magnification events by the Einstein ring radii, which are typically of the order of 1 AU. Although the gravitational deflections of light by stars are very small and currently practically inaccessible to direct measurement, the radiation flux from the source in strong microlensing events can nonetheless increase several times, owing to the large distances to the microlenses. To verify Einstein's formula, a more general dependence of the beam deflection angle on its impact distance p relative to the deflector is considered and, accordingly, the equations of gravitational lensing are modified. The challenge is to constrain the parameter ε based on observational data. The Early Warning System data obtained in 2018 within the Optical Gravitational Lensing Experiment (OGLE) (http://ogle.astrouw.edu.pl/ogle4/ews/2019/ews.html) were used. A sample of 100 light curves from the data obtained by the OGLE group in 2018 was formed. Each light curve was fitted as part of a modified model of gravitational lensing with parameter ε. As a result, 100 values of ε and estimates of their variances were obtained. It was found that the mean value of ε does not contradict GR within the limits of a one percent standard deviation. In the future, using a larger number of light curves offers hope for a significant decrease in the error of ε due to statistical averaging.
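The fitting step can be illustrated on the standard (ε = 0) Paczyński light curve: the grid search below over the Einstein-radius crossing time is a toy stand-in for the modified-model fits performed in the paper, whose generalised ε-dependent deflection law is not reproduced here.

```python
import numpy as np

def paczynski(t, t0, tE, u0):
    """Standard (GR, eps = 0) point-lens magnification light curve."""
    u = np.sqrt(u0**2 + ((t - t0) / tE) ** 2)
    return (u**2 + 2) / (u * np.sqrt(u**2 + 4))

# Synthetic noisy event with known parameters
rng = np.random.default_rng(2)
t = np.linspace(-50, 50, 400)   # days
data = paczynski(t, t0=0.0, tE=25.0, u0=0.3) + 0.01 * rng.normal(size=t.size)

# Brute-force chi-square grid over the Einstein-radius crossing time tE,
# holding t0 and u0 fixed at their true values for simplicity
tE_grid = np.arange(5.0, 60.0, 0.5)
chi2 = [np.sum((data - paczynski(t, 0.0, tE, 0.3)) ** 2) for tE in tE_grid]
tE_best = tE_grid[np.argmin(chi2)]   # should recover tE close to 25 days
```

A real analysis would fit all parameters (including ε) jointly and propagate the per-event variances into the sample mean, as the abstract describes.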
The SST-1M is a 4-m diameter mirror Davies-Cotton gamma-ray telescope. It has been designed to cover the energy range above ∼500 GeV and to be part of an array of telescopes separated by ∼150−200 m. Its innovative camera features large-area hexagonal silicon photo-multipliers as photon detectors and a fully digital trigger and readout system. Here, the strategy and methods for its calibration are presented, together with the results obtained. In particular, the off-site and on-site calibration strategies are demonstrated on the first camera prototype. The performance of the camera in terms of charge and time resolution is described.
Advancements in communication-mobile technologies have transformed the digital landscape, creating new opportunities while also exposing disparities in access and usage. This phenomenon of unequal digital participation, often termed the “digital divide”, can exacerbate inequalities. Bridging this divide through innovative technological solutions and policy interventions is critical for empowerment. This study investigates the role communication-mobile technologies have played in promoting digital inclusion over the past decade through a systematic review of academic literature. Fourteen studies published between 2012 and 2023 were analyzed following a rigorous selection process. A conceptual framework was developed to examine the layers of the digital divide, variety of divides, intervention types, and tools used. The analysis reveals the multifaceted nature of the divide across geographical, socioeconomic, and demographic dimensions. Communication and access emerge as pivotal elements, with studies emphasizing approaches like leveraging libraries as community hubs. The importance of multi-tiered interventions, from grassroots to policy-level, is pronounced. Arts, libraries, training, and mobile platforms are identified as key tools. While the findings largely align with the topics highlighted in the preliminary sections, gaps exist concerning insufficiently addressed divides and groups. Expanding the discourse to incorporate these areas can enrich the conceptualization of communication technologies’ role in digital inclusion. This timely systematic review provides a foundation for continued interrogation of digital participation challenges facing diverse global populations.
Background. Brain perfusion ROI detection is a preliminary step designed to exclude non-brain tissues from analyzed DSC perfusion MR images. Its accuracy is considered the key factor for delivering correct results of perfusion data analysis. Despite the large variety of algorithms developed for brain tissue segmentation, none works reliably and robustly on T2-weighted MR images of a human head with abnormal brain anatomy. Therefore, thresholding is still the state-of-the-art technique widely used to manage the pixels involved in the brain perfusion ROI in modern software applications for perfusion data analysis. Objective. This paper presents an analysis of the effectiveness of thresholding techniques for brain perfusion ROI detection on T2-weighted MR images of a human head with abnormal brain anatomy. Methods. Four threshold-based algorithm implementations are considered: the Otsu method as global thresholding, the Niblack method as local thresholding, thresholding in the approximate anatomical brain location, and brute-force thresholding. All algorithms output images in which pixel values are set to zero for background regions (air pixels and pixels that represent non-brain tissues) and keep their original values for foreground regions (brain perfusion ROIs). The analysis is done by comparing qualitative perfusion maps produced from the thresholded images with those produced from reference images (manual brain tissue delineation by experienced radiologists). The same DSC perfusion MR datasets of a human head with abnormal brain anatomy, from 12 patients with cerebrovascular disease, are used for the comparison. Results. Pearson correlation analysis showed strong positive (r ranging from 0.7123 to 0.8518, p < 0.01) and weak positive (r < 0.35, p < 0.01) relationships for the CBF, CBV, MTT, and Tmax perfusion maps, respectively. Linear regression analysis showed, at the 95% confidence level, that perfusion maps produced from thresholded images were subject to scale and offset errors in all conducted experiments. Conclusions. The experimental results showed that the widely used thresholding methods are an ineffective way of managing the pixels involved in the brain perfusion ROI. Thresholding as a brain segmentation tool can lead to poor placement of the perfusion ROI and, as a result, the produced perfusion maps will be subject to artifacts and can cause falsely high or falsely low perfusion parameter estimates.
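For reference, global Otsu thresholding (the first of the four methods) selects the threshold that maximises the between-class variance of the intensity histogram. A minimal numpy sketch on synthetic two-class data; the intensity values are made up for illustration, not drawn from MR images:

```python
import numpy as np

def otsu_threshold(image, nbins=256):
    """Global Otsu threshold: pick the bin centre that maximises
    the between-class variance w0 * w1 * (mu0 - mu1)^2."""
    hist, edges = np.histogram(image, bins=nbins)
    p = hist / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2
    w0 = np.cumsum(p)                  # background class weight
    w1 = 1.0 - w0                      # foreground class weight
    mu = np.cumsum(p * centers)        # cumulative (unnormalised) mean
    mu_total = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        between = (mu_total * w0 - mu) ** 2 / (w0 * w1)
    between[~np.isfinite(between)] = 0.0
    return centers[np.argmax(between)]

# Synthetic "image": dark background pixels and a brighter tissue class
rng = np.random.default_rng(3)
img = np.concatenate([rng.normal(50, 10, 5000), rng.normal(150, 20, 5000)])
thr = otsu_threshold(img)
mask = img > thr                  # foreground (brain ROI) pixels
segmented = np.where(mask, img, 0.0)   # background zeroed, as in the paper
```

A single global threshold like this is exactly what fails on abnormal anatomy, where lesions overlap the background intensity range, which motivates the comparison against manual delineation above.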