Chris Marsden argues that co-regulation is the defining feature of the Internet in Europe. Co-regulation offers the state a route back into questions of legitimacy, governance and human rights, thereby opening up more interesting conversations than a static no-regulation versus state-regulation binary choice. The argument rests on empirical investigation, based on a multi-year, European Commission-funded study, and is further reinforced by the direction of travel in European and English law and policy, including the Digital Economy Act 2010. He places Internet regulation within the regulatory mainstream, as an advanced technocratic form of self- and co-regulation that requires governance reform to address a growing constitutional legitimacy gap. The literature review, case studies and analysis shed a welcome light on policymaking at the centre of Internet regulation in Brussels, London and Washington, revealing the extent to which states, firms and, increasingly, citizens are developing a new type of regulatory bargain.
This book explains the beginnings of net neutrality regulation in the United States and Europe, and some of the current debate over access to Specialised Services: fast lanes with higher Quality of Service (QoS). It examines the new European law of 2015 and the interaction between that law and interception/privacy. The book takes a deep dive into UK self- and co-regulation of net neutrality. In each of the national case studies, initial confusion over the lack of clarity in net neutrality laws gave way to significant cases, particularly since 2014, which have given regulators the opportunity to clarify their legislation or regulation. The majority of such cases relate to mobile net neutrality, and in particular to so-called 'zero rating' practices. The book compares results and proposes a regulatory toolkit for jurisdictions that intend an effective partial or complete implementation of net neutrality in practice. It sets out a future research agenda for exploring the implementation of regulation. The book outlines competition policy's purpose, referring to Maniadaki's exceptionally rigorous recent analysis of competition law's suitability to regulate net neutrality. Having analysed regulatory tools with little chance of success, it then examines what communications regulators actually do: regulating telecoms access, based on the UK case study. The book considers whether zero rating poses a serious challenge to Open Internet use, and explores some of the wider international problems of regulating this newest manifestation of discrimination. It also considers the various means by which government can regulate net neutrality.
It is well known that, architecturally, the brain is a neural network, i.e. a collection of many relatively simple units coupled flexibly. However, it has been unclear how the possession of this architecture enables higher-level cognitive functions, which are unique to the brain. Here, we consider the brain from the viewpoint of dynamical systems theory and hypothesize that the unique feature of the brain, the self-organized plasticity of its architecture, could represent the means of enabling the self-organized plasticity of its velocity vector field. We propose that, conceptually, the principle of cognition could amount to the existence of appropriate rules governing the self-organization of the velocity field of a dynamical system, with an appropriate account of stimuli. To support this hypothesis, we propose a simple non-neuromorphic mathematical model with a plastic self-organized velocity field, which has no prototype in the physical world. This system is shown to be capable of basic cognition, which is illustrated numerically and with musical data. Our conceptual model could provide additional insight into the working principles of the brain. Moreover, hardware implementations of plastic velocity fields that self-organize according to various rules could pave the way to creating artificial intelligence of a novel type.
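The idea of a velocity field that is reshaped by stimuli can be illustrated with a toy sketch. This is our own minimal illustration, not the authors' model: each incoming stimulus carves a Gaussian potential well into a one-dimensional energy landscape, the velocity field is the negative gradient of that landscape, and the system's state then relaxes toward the nearest stored stimulus (a crude form of pattern completion).

```python
import numpy as np

# Toy illustration (assumed, not the paper's model): a 1-D dynamical
# system dx/dt = f(x) whose velocity field f is plastic. Each stimulus
# adds a Gaussian well to an energy landscape E; f = -dE/dx, so stored
# stimuli become attractors of the dynamics.

SIGMA = 0.5  # width of each well

def velocity_field(x, stored):
    """f(x) = -dE/dx for E(x) = -sum_i exp(-(x - s_i)^2 / (2 sigma^2))."""
    d = x - np.asarray(stored)
    return float(np.sum(-(d / SIGMA**2) * np.exp(-d**2 / (2 * SIGMA**2))))

def learn(stored, stimulus):
    """Plasticity rule: a new stimulus reshapes the field by adding a well."""
    return stored + [stimulus]

def relax(x, stored, dt=0.05, steps=2000):
    """Forward-Euler integration of dx/dt = f(x) until the state settles."""
    for _ in range(steps):
        x += dt * velocity_field(x, stored)
    return x

stored = []
for s in (1.0, 4.0):  # two stimuli self-organize the velocity field
    stored = learn(stored, s)

print(round(relax(1.3, stored), 2))  # settles near 1.0, the closer stimulus
print(round(relax(3.6, stored), 2))  # settles near 4.0
```

A perturbed input state is pulled back to the attractor created by the most similar past stimulus, which is the basic cognitive behaviour the abstract describes, albeit in the simplest possible setting.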
Strong scaling relations between host galaxy properties (such as stellar mass, bulge mass, luminosity and effective radius) and the mass of the nuclear supermassive black hole point toward a close co-evolution. In this work, we first review previous efforts supporting the fundamental importance of the relation between supermassive black hole mass and stellar velocity dispersion (MBH-σe). We then present further original work supporting this claim via an analysis of residuals and a principal component analysis applied to some of the latest compilations of local galaxy samples with dynamically measured supermassive black hole masses. We conclude with a review of the main physical scenarios in favor of the existence of an MBH-σe relation, with a focus on momentum-driven outflows.
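For reference, the MBH-σe relation discussed above is conventionally parameterized as a power law in log space (the exact slope and normalization vary between compilations; the slope range quoted here is indicative of recent determinations):

log10(MBH / Msun) = α + β log10(σe / 200 km s⁻¹),   with β ≈ 4-5,

so that a doubling of the stellar velocity dispersion corresponds to roughly a factor of 16-32 increase in black hole mass.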
Abstract This article examines artificial intelligence (AI) co-regulation in the EU AI Act and the critical role of standards under this regulatory strategy. It engages with the foundation of democratic legitimacy in EU standardization, emphasizing the need for reform to keep pace with the rapid evolution of AI capabilities, as recently suggested by the European Parliament. The article highlights the challenges posed by interdisciplinarity and the lack of civil society expertise in standard-setting. It critiques the inadequate representation of societal stakeholders in the development of AI standards, raising pressing questions about the risks this entails for the protection of fundamental rights, given the lack of democratic oversight and the global composition of standard-developing organizations. The article scrutinizes how, under the AI Act, technical standards will define AI risks and mitigation measures, and questions whether technical experts are adequately equipped to standardize thresholds of acceptable residual risk in different high-risk contexts. More specifically, the article examines the complexities of regulating AI, drawing attention to the multi-dimensional nature of identifying risks in AI systems and the value-laden nature of the task. It questions the potential creation of a typology of AI risks and highlights the need for a nuanced, inclusive, and context-specific approach to risk identification and mitigation. Consequently, we underscore the imperative for continuous stakeholder involvement in developing, monitoring, and refining the technical rules and standards for high-risk AI applications. We also emphasize the need for rigorous training, certification, and surveillance measures to ensure the enforcement of fundamental rights in the face of AI developments.
Finally, we recommend greater transparency and inclusivity in risk-identification methodologies, urging approaches that involve stakeholders and require a diverse skill set for risk assessment. At the same time, we draw attention to the diversity within the European Union and the consequent need for localized risk assessments that consider national contexts, languages, institutions, and cultures. In conclusion, the article argues that co-regulation under the AI Act necessitates a thorough re-examination and reform of standard-setting processes, to ensure a democratically legitimate, interdisciplinary, stakeholder-inclusive, and responsive approach to AI regulation that can safeguard fundamental rights and anticipate, identify, and mitigate a broad spectrum of AI risks.
In this work we present “Astera”, a cosmological visualization tool that renders a mock universe in real time using Unreal Engine 4. The large scale structure of the cosmic web is hard to visualize in two dimensions, and a real-time 3D projection of this distribution allows for an unprecedented view of the large scale universe, with visually accurate galaxies placed in a dynamic 3D world. The underlying data are based on empirical relations assigned using results from N-body dark matter simulations, and are matched to galaxies with similar morphologies and sizes, images of which are extracted from the Sloan Digital Sky Survey. Within Unreal Engine 4, galaxy images are transformed into textures and dynamic materials (with appropriate transparency) that are applied to static mesh objects with appropriate sizes and locations. To ensure excellent performance, these static meshes are “instanced” to utilize the full capabilities of a graphics processing unit. Additional components include a dynamic system for representing accelerated-time active galactic nuclei. The end result is a visually realistic large scale universe that can be explored by a user in real time, with accurate large scale structure. Astera is not yet ready for public release, but we are exploring options to make different versions of the code available for both research and outreach applications.
This study explains the concept of network neutrality and its history as an extension of the rights and duties of common carriers, as well as its policy history as examined in US and European regulatory proceedings from 1999. The book compares national and regional legislation and regulation of net neutrality from an interdisciplinary and international perspective. It also examines the future of net neutrality battles in Europe, the United States and in developing countries such as India and Brazil, and explores the case studies of Specialized Services and Content Delivery Networks for video over the Internet, and zero rating or sponsored data plans. Finally, Network Neutrality offers co-regulatory solutions based on FRAND and non-exclusivity. This is a must-read for researchers and advocates in the net neutrality debate, and for those interested in the context of communications regulation, law and economic regulation, human rights discourse and policy, and the impact of science and engineering on policy and governance.