The fourth catalog of active galactic nuclei (AGNs) detected by the Fermi Gamma-ray Space Telescope Large Area Telescope (4LAC) between 2008 August 4 and 2016 August 2 contains objects located at high Galactic latitudes. It includes 85% more sources than the previous 3LAC catalog, which was based on 4 yr of data. AGNs represent at least 79% of the high-latitude sources in the fourth Fermi-Large Area Telescope Source Catalog (4FGL), which covers the energy range from 50 MeV to 1 TeV. In addition, gamma-ray AGNs are found at low Galactic latitudes. Most of the 4LAC AGNs are blazars (98%), while the remainder are other types of AGNs. The blazar population consists of 24% Flat Spectrum Radio Quasars (FSRQs), 38% BL Lac-type objects, and 38% blazar candidates of unknown types (BCUs). On average, FSRQs display softer spectra and stronger variability in the gamma-ray band than BL Lacs do, confirming previous findings. All AGNs detected by ground-based atmospheric Cherenkov telescopes are also found in the 4LAC.
This article argues that through the EU's technology regulation, technological concepts permeate legal language. Such concepts may function as transplants, even irritants, causing tensions and uncertainties. As technology regulation is increasingly horizontal, i.e. obligating private and public actors alike, these newfound legal concepts remain disconnected from established public law vocabulary and the power constellations it represents and embeds. We approach this evolution of legal language from a public law perspective and concentrate on the concepts of 'user' and 'deployer' in the EU's upcoming Artificial Intelligence Act. We discuss these emerging legal concepts in relation to the rich theorizing on the concepts in human–computer interaction research. Our analysis demonstrates a discrepancy between legal and technology-oriented conceptualizations of the 'user-deployer'. We draw three conclusions. First, the digital revolution is taking place in conceptual-linguistic practices of law, and not only when translating law into code. Second, when external concepts are appropriated into law, they are uprooted from their established habitat, which may result in unpredictability in future legal interpretation. Third, in public law, adopting the 'user-deployer' may pose some additional challenges, as it introduces a new agent into the relationship between public authority and private entities. Simultaneously, citizens seem to be mainly excluded from the legal conceptualizing, which risks blurring traditional power constellations.
As health care systems worldwide struggle with rising costs, a consensus is emerging to refocus reform efforts on value, as determined by the evaluation of patient outcomes relative to costs. One method of using outcome data to improve health care value is the disease registry. An international study of thirteen registries in five countries (Australia, Denmark, Sweden, the United Kingdom, and the United States) suggests that by making outcome data transparent to both practitioners and the public, well-managed registries enable medical professionals to engage in continuous learning and to identify and share best clinical practices. The apparent result: improved health outcomes, often at lower cost. For example, we calculate that if the United States had a registry for hip replacement surgery comparable to one in Sweden that enabled reductions in the rates at which these surgeries are performed a second time to replace or repair hip prostheses, the United States would avoid $2 billion of an expected $24 billion in total costs for these surgeries in 2015.
This study tested a new experimental apparatus to estimate thermal preferences of fish. The apparatus was designed to minimise the effect of the thermal history of the fish and allow for easy feeding. The set-up consisted of two connected sections of an aquarium, both receiving an excess of food, with slightly different water temperatures. Initially, the fish spent most of its time in one of the sections, but when the temperatures were slowly increased (or decreased), the fish spent increasingly more time in the other. The temperature at which the fish spent equal time in both sections was defined as the preferred temperature. Brown trout, Salmo trutta, preferred the reported optimal temperature for growth of the species. However, Arctic charr, Salvelinus alpinus, selected a significantly lower temperature than its optimal temperature for growth and thus contradicted the general view of a good correlation between the optimal temperature for growth and the preferred temperature of fish. The reason may be that charr optimise their growth efficiency instead of their growth rate. Individuals that utilise a limited resource in an optimal way, by selecting a temperature where growth efficiency is maximised, will possibly be favoured. Several factors affect the distribution of fish in lakes, but the difference in thermal preference between charr and trout might partly explain the frequently observed niche segregation of these two species in Scandinavian lakes.
•Laboratory dissolver with disk is not effective in dispersing microfine cement.
•Laboratory dissolver with rotor-stator technique is much more effective than disk.
•Ultrasound is more effective than the rotor-stator technique.
Grout based on microfine cement is mainly used for sealing rock fractures in underground construction with high sealing requirements. This grout is known to be harder to disperse than grouts based on ordinary Portland cement. This study investigates the possibility of using ultrasound to improve the dispersion of microfine cement grout. The dispersion efficiency of ultrasound is compared with that of an ordinary laboratory mixer, both when equipped with a disk and when using the rotor-stator technique. Dispersion efficiency was measured with a filter pump. The grout dispersed with the laboratory mixer and disk could not pass through a 154 µm filter. The laboratory mixer using the rotor-stator technique showed much better efficiency, with a measured dispersion that varied between 77 and 104 µm. Dispersion with ultrasound yielded more reliable results, with lower variation; the grout passed through a 77 µm filter. These results showed that a laboratory mixer with a disk is not an effective method for dispersing microfine cement. A laboratory mixer using the rotor-stator method is much more effective. Ultrasound is not only an effective method but is even better than a mixer using the rotor-stator technique.
Despite advancements in computational resources, the discrete element method (DEM) still requires considerable computational time to solve detailed problems, especially for large-scale models. In addition to the geometry scale of the problem, the particle shape has a dramatic effect on the computational cost of DEM. Therefore, many studies have been performed with simplified spherical particles or clumps. Particle scaling is an approach that increases the particle size to reduce the number of particles in the DEM. Although several particle scaling methods have been introduced, there are still some disagreements regarding their applicability to certain aspects of problems. In this study, the effect of particle scalping on the shear behavior of granular material is explored. Real granular particles were scanned and imported as polygonal particles in the direct shear test. The effects of particle size distribution, particle angularity, and the amount of scalping were investigated. The results show that particle scalping can simulate the correct shear behavior of the model with significant improvement in computational time. Also, the accuracy of the scalping method depends on the particle angularity and particle size range.
•The scalping method can simulate the correct shear behavior of granular material.
•The scalping method improves computational time dramatically despite some loss of accuracy in predicted values.
•Particle size distribution has a significant effect on the effectiveness of particle scalping.
•Particle angularity and rotation affect the accuracy of the scalping method.
The social services are responsible for providing care and support to people with alcohol- and drug-related problems. Even though risk assessment of a client's condition is qualified work that requires difficult considerations, training in risk assessment and professional support for social services caseworkers have often been lacking. In this article we present research results from the testing and evaluation of a web-based support function for risk assessment called risk indication (RI). RI is based on data from more than 55,000 social services clients with risky or severe substance use and gives social services caseworkers knowledge of the client's risk level in the areas of alcohol, drugs, family, mental health, physical health, and criminality, compared with all other clients among the 55,000 clients in the database.
Swedish social services are responsible for assessing and providing treatment and other support to clients with risky or severe substance use. Even though assessment and care planning is a professional activity involving difficult decisions, few social workers receive professional support in this area. In this article we present research results from the testing and evaluation of a web-based support function to help assess client risks. This support function, Risk Indication (RI), was developed from Addiction Severity Index (ASI) assessment data from more than 55,000 social services clients with risky or severe substance use. At the time of the assessment interview, RI provides specific information about the client's risk levels in the areas of alcohol use, drug use, family and social relationships, mental health, physical health, and criminality, in comparison to all the other clients among the 55,000 individuals in the ASI database.
The region around the Galactic Center (GC) is now well established to be brighter at energies of a few GeV than what is expected from conventional models of diffuse gamma-ray emission and catalogs of known gamma-ray sources. We study the GeV excess using 6.5 yr of data from the Fermi Large Area Telescope. We characterize the uncertainty of the GC excess spectrum and morphology due to uncertainties in cosmic-ray source distributions and propagation, uncertainties in the distribution of interstellar gas in the Milky Way, and uncertainties due to a potential contribution from the Fermi bubbles. We also evaluate uncertainties in the excess properties due to resolved point sources of gamma rays. The GC is of particular interest, as it would be expected to have the brightest signal from annihilation of weakly interacting massive dark matter (DM) particles. However, control regions along the Galactic plane, where a DM signal is not expected, show excesses of similar amplitude relative to the local background. Based on the magnitude of the systematic uncertainties, we conservatively report upper limits on the annihilation cross-section as a function of particle mass and annihilation channel.
We present a catalog of sources detected above 10 GeV by the Fermi Large Area Telescope (LAT) in the first 7 years of data using the Pass 8 event-level analysis. This is the Third Catalog of Hard Fermi-LAT Sources (3FHL), containing 1556 objects characterized in the 10 GeV–2 TeV energy range. The sensitivity and angular resolution are improved by factors of 3 and 2 relative to the previous LAT catalog at the same energies (1FHL). The vast majority of detected sources (79%) are associated with extragalactic counterparts at other wavelengths, including 16 sources located at very high redshift (z > 2). Of the sources, 8% have Galactic counterparts and 13% are unassociated (or associated with a source of unknown nature). The high-latitude sky and the Galactic plane are observed with flux sensitivities of 4.4 × 10⁻¹¹ and 9.5 × 10⁻¹¹ ph cm⁻² s⁻¹, respectively (approximately 0.5% and 1% of the Crab Nebula flux above 10 GeV). The catalog includes 214 new γ-ray sources. The substantial increase in the number of photons (more than 4 times relative to 1FHL and 10 times relative to 2FHL) also allows us to measure significant spectral curvature for 32 sources and find flux variability for 163 of them. Furthermore, we estimate that for the same flux limit of 10⁻¹² erg cm⁻² s⁻¹, the energy range above 10 GeV has twice as many sources as the range above 50 GeV, highlighting the importance, for future Cherenkov telescopes, of lowering the energy threshold as much as possible.