In 1979, Valiant showed that the complexity class VPe of families with polynomially bounded formula size is contained in the class VPs of families that have algebraic branching programs (ABPs) of polynomially bounded size. Motivated by the problem of separating these classes, we study the topological closure of VPe, i.e., the class of polynomials that can be approximated arbitrarily closely by polynomials in VPe. We describe this closure using the well-known continuant polynomial (in characteristic different from 2). Further understanding this polynomial seems to be a promising route to new formula-size lower bounds. Our methods are rooted in the study of ABPs of small constant width. In 1992, Ben-Or and Cleve showed that formula size is polynomially equivalent to width-3 ABP size. We extend their result (in characteristic different from 2) by showing that approximate formula size is polynomially equivalent to approximate width-2 ABP size. This is surprising because in 2011 Allender and Wang gave explicit polynomials that cannot be computed by width-2 ABPs at all! The details of our construction lead to the aforementioned characterization of the closure of VPe. As a natural continuation of this work, we prove that the class VNP can be described as the class of families that admit a hypercube summation of polynomially bounded dimension over a product of polynomially many affine linear forms. This gives the first separations of algebraic complexity classes from their nondeterministic analogs.
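The continuant mentioned above can be evaluated both by its classical three-term recurrence and as an iterated product of 2x2 matrices, which is exactly the width-2 ABP view; a minimal sketch of this well-known identity (function names are ours):

```python
def continuant(xs):
    """Continuant via its recurrence K_n = x_n*K_{n-1} + K_{n-2},
    with K_{-1} = 0 and K_0 = 1 (empty sequence)."""
    k_prev, k = 0, 1
    for x in xs:
        k_prev, k = k, x * k + k_prev
    return k

def continuant_by_matrices(xs):
    """Same value as the (1,1) entry of prod_i [[x_i, 1], [1, 0]]:
    a width-2 iterated matrix product, i.e., a width-2 ABP."""
    a, b, c, d = 1, 0, 0, 1  # start from the 2x2 identity
    for x in xs:
        # multiply on the right by [[x, 1], [1, 0]]
        a, b, c, d = a * x + b, a, c * x + d, c
    return a
```

Evaluating either function on the point (2, 3, 5, 7) gives 266, and the two agree on every input by the matrix identity.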
The next-generation Versatile Video Coding (VVC) standard introduces a new Multi-Type Tree (MTT) block partitioning structure that supports Binary-Tree (BT) and Ternary-Tree (TT) splits in both vertical and horizontal directions. This new approach leads to five possible splits at each block depth. It thereby improves the coding efficiency of VVC over that of the preceding High Efficiency Video Coding (HEVC) standard, which only supports Quad-Tree (QT) partitioning with a single split per block depth. However, MTT has also brought a considerable increase in encoder computational complexity. This paper proposes a two-stage learning-based technique to tackle the complexity overhead of MTT in VVC intra encoders. In our scheme, the input block is first processed by a Convolutional Neural Network (CNN) to predict its spatial features through a vector of probabilities describing the partition at each 4×4 edge. Subsequently, a Decision Tree (DT) model leverages this vector of spatial features to predict the most likely splits of each block. Finally, based on this prediction, only the N most likely splits are processed by the Rate-Distortion (RD) search of the encoder. In order to train our CNN and DT models on a wide range of image contents, we also propose a public VVC frame partitioning dataset built from an existing image dataset encoded with the VVC reference software encoder. Our solution with the top-3 configuration reaches 47.4% complexity reduction for a negligible bitrate increase of 0.79%. A top-2 configuration enables a higher complexity reduction of 70.4% for a 2.49% bitrate loss. These results demonstrate a better trade-off between VTM intra-coding efficiency and complexity reduction compared to state-of-the-art solutions. The source code of the proposed method and the training dataset are made publicly available on GitHub.
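As a hedged illustration of the final filtering stage (the function name and the exact split labels are our assumptions, not the paper's code), the top-N selection over VVC's five split modes plus no-split can be sketched as:

```python
def top_n_splits(split_probs, n):
    """Keep only the n most probable partition modes; only these would then
    be evaluated by the encoder's full Rate-Distortion (RD) search.
    Candidate labels here: NS (no split), QT, BTH, BTV, TTH, TTV."""
    return sorted(split_probs, key=split_probs.get, reverse=True)[:n]

# Illustrative probability vector as a DT model might output it
probs = {"NS": 0.05, "QT": 0.40, "BTH": 0.25, "BTV": 0.15, "TTH": 0.10, "TTV": 0.05}
```

With `n = 3`, only QT, BTH, and BTV would reach the RD search here, which is the source of the complexity reduction: the remaining candidates are skipped entirely.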
Descriptive complexity provides intrinsic, that is, machine-independent, characterizations of the major complexity classes. On the other hand, logic can be useful for designing programs in a natural, declarative way. This is particularly important for parallel computation models such as cellular automata, because designing parallel programs is considered a difficult task. This paper establishes logical characterizations of the three classical complexity classes that model minimal time, called real-time, of one-dimensional cellular automata according to their canonical variants: unidirectional or bidirectional communication, and input word given in a parallel or sequential way. Our three logics are natural restrictions of existential second-order Horn logic with built-in successor and predecessor functions. These logics correspond exactly to the three ways of deciding a language on a square grid circuit of side n according to one of the three canonical locations of an input word of length n: along a side of the grid, on the diagonal that contains the output cell, or on the diagonal opposite to the output cell. The key ingredient of our results is a normalization method that transforms a formula from one of our three logics into an equivalent normalized formula that faithfully mimics a grid circuit. Then, we extend our logics by allowing a limited use of negation on hypotheses, as in Stratified Datalog. By revisiting in detail a number of representative classical problems (recognition of the set of primes by Fischer's algorithm, Dyck language recognition, the Firing Squad Synchronization problem, etc.), we show that this extension makes programming easier, and we prove that it does not change the real-time complexity of our logics. Finally, drawing on our experience expressing these representative problems in logic, we argue that our logics are high-level programming languages: they allow one to express, in a natural, precise, and synthetic way, the signal-based algorithms of the literature, and to translate them automatically into cellular automata of the same complexity.
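For readers less familiar with the model, a one-dimensional cellular automaton with bidirectional communication updates every cell synchronously from its own state and those of its left and right neighbors; a minimal sketch (the XOR rule shown is purely illustrative, not one of the paper's constructions):

```python
def ca_step(cells, rule, boundary=0):
    """One synchronous step of a one-dimensional cellular automaton with a
    radius-1 (bidirectional) neighborhood; 'rule' maps
    (left, self, right) -> new state, with a fixed boundary state."""
    n = len(cells)
    get = lambda i: cells[i] if 0 <= i < n else boundary
    return [rule(get(i - 1), cells[i], get(i + 1)) for i in range(n)]

# Illustrative rule: each cell becomes left XOR right (elementary rule 90).
xor_rule = lambda l, c, r: l ^ r
```

Iterating `ca_step` for n steps on an input of length n is the real-time regime the three logics above characterize, with the variants differing in communication direction and in how the input word is supplied.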
Previously referred to as 'miraculous' in the scientific literature because of its powerful properties and its wide application as an optimal solution to the problem of induction/inference, (approximations to) Algorithmic Probability (AP) and the associated Universal Distribution are (or should be) of the greatest importance in science. Here we investigate the emergence, the rates of emergence and convergence, and the Coding-theorem-like behaviour of AP in Turing-subuniversal models of computation. We investigate empirical distributions of computing models in the Chomsky hierarchy. We introduce measures of algorithmic probability and algorithmic complexity based upon resource-bounded computation, in contrast to the output distributions of Turing machines investigated thoroughly in previous work. This approach allows for numerical approximations to algorithmic (Kolmogorov-Chaitin) complexity-based estimations at each level of a computational hierarchy. We demonstrate that all these estimations are correlated in rank and that they converge both in rank and in values as a function of computational power, despite fundamental differences between computational models. In the context of natural processes that operate below the Turing universal level because of finite resources and physical degradation, the investigation of natural biases stemming from algorithmic rules may shed light on the distribution of outcomes. We show that up to 60% of the simplicity/complexity bias in distributions produced even by the weakest of the computational models can be accounted for by Algorithmic Probability in its approximation to the Universal Distribution.
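In the spirit of the above (and only as a toy illustration, using elementary cellular automata rather than the paper's models), an empirical output distribution for a sub-universal model can be built by enumerating every program in a bounded class and tallying the outputs; configurations produced by many programs receive high empirical algorithmic probability:

```python
from collections import Counter

def eca_step(cells, rule):
    """One step of an elementary cellular automaton (Wolfram rule 0-255)
    on a cyclic array of bits."""
    n = len(cells)
    out = []
    for i in range(n):
        l, c, r = cells[(i - 1) % n], cells[i], cells[(i + 1) % n]
        out.append((rule >> (l * 4 + c * 2 + r)) & 1)
    return out

def empirical_distribution(width=5, steps=3):
    """Run all 256 rules from a single-1 initial configuration and count the
    final configurations: a crude, resource-bounded empirical analogue of
    algorithmic probability for this sub-universal model."""
    counts = Counter()
    for rule in range(256):
        cells = [0] * width
        cells[width // 2] = 1
        for _ in range(steps):
            cells = eca_step(cells, rule)
        counts[tuple(cells)] += 1
    return counts
```

The all-zero configuration is produced by many rules (every rule that immediately extinguishes the lone 1 keeps it extinct), so it ranks among the most probable outputs; simple outputs dominating the distribution is exactly the Coding-theorem-like behaviour discussed above.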
S. luridus, S. rivulatus, P. miles, and E. golanii are four Lessepsians among the approximately 90 alien fish that have already demonstrated detectable negative impacts on local fish stocks and ecosystems in the Mediterranean Sea. The purpose of this review is to evaluate the feasibility and sustainability of their eradication through depletion fishing, based on available knowledge and practices, because the altered state of a marine ecosystem, driven by changes in biogeochemical cycling and by fishing, and the establishment of aliens mutually reinforce each other. Mediterranean artisanal and recreational fisheries already use efficient fishing gear, namely gill and trammel nets for S. luridus and S. rivulatus, spearguns for P. miles, and pelagic trawls for E. golanii. Removal of the aliens of concern restores local fish populations in field and modeling studies, but the effects appear not to be long-lasting due to hydrodynamic connectivity with source populations and insufficient local natural mortality and predation either at the early (egg, larva) or later (juvenile, adult) stages of the aliens' growth. To shift the goal of management from local control toward resolution of these four Lessepsians' entry and establishment in the Mediterranean Sea, the re-active intervention of their depletion fishing needs to be aligned with pro-active interventions at their sources of entry and means of transport upstream, and with inter-active interventions of their monitoring by the multiple active users of marine resources.
Sustainability of the resolution goal will be ensured provided it is designed within the large Mediterranean marine ecosystem jurisdiction, which includes both sources and sinks of aliens; planned locally, to cater for the differential effects of aliens on local ecologies, societies, and economies; and takes into account mediators and moderators of the whole context of multiple sources, multiple sectors, the nascent alien trophic web, and the altered state of local ecosystems.
•Most efficient gear: gill and trammel nets (S. luridus and S. rivulatus), speargun (P. miles), pelagic trawl (E. golanii).
•Restoration of local fish stocks depends on fishing effort, population densities of aliens, and ecosystem state.
•Depletion of aliens requires system-level design and a regional implementation jurisdiction.
Social complexity has been one of the recent emerging topics in the study of animal and human societies, but the concept remains both poorly defined and poorly understood. In this paper, I critically review definitions and studies of social complexity in invertebrate and vertebrate societies, arguing that the concept is being used inconsistently in studies of vertebrate sociality. Group size and cohesion define one cornerstone of social complexity, but the nature and patterning of social interactions contribute more to interspecific variation in social complexity in species with individual recognition and repeated interactions. Humans provide the only example where many other unique criteria are used, and they are the only species for which intraspecific variation in social complexity has been studied in detail. While there is agreement that complex patterns emerge at the group level both as a result of simple interactions and as a result of cognitive abilities, there is consensus neither on their relative importance nor on the role of specific cognitive abilities in different lineages. Moreover, aspects of reproduction and parental care have also been invoked to characterize levels of social complexity, so that no single comprehensive measure is readily available. Because inconsistent definitions and operationalization of key social traits make even fundamental components of social complexity difficult to compare across studies and species, I define and characterize social organization, social structure, mating system, and care system as distinct components of a social system. Based on this framework, I outline how different aspects of the evolution of social complexity are being studied and suggest questions for future research.
Mode division multiplexing (MDM) using orbital angular momentum (OAM) is a recently developed physical-layer transmission technique that has attracted intense interest in the optics, millimeter-wave, and radio-frequency communities due to its capability to enhance communication capacity while retaining an ultra-low receiver complexity. In this paper, the system model based on OAM-MDM is mathematically analyzed, and it is theoretically concluded that such a system architecture can bring a vast reduction in receiver complexity without capacity penalty compared with conventional line-of-sight multiple-input multiple-output (MIMO) systems under the same physical constraint. Furthermore, a 4×4 OAM-MDM communication experiment adopting a pair of easily realized Cassegrain reflector antennas, capable of multiplexing/demultiplexing four orthogonal OAM modes of l = -3, -2, +2, and +3, is carried out at a microwave frequency of 10 GHz. The experimental results show high spectral efficiency as well as low receiver complexity.
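The orthogonality that makes low-complexity per-mode demultiplexing possible can be checked numerically: distinct integer OAM orders l give azimuthal phase profiles exp(i*l*phi) whose inner product over the full circle vanishes. A small sketch (mode set matching the experiment, l in {-3, -2, +2, +3}):

```python
import numpy as np

def mode_inner_product(l1, l2, samples=360):
    """Normalized inner product of two azimuthal OAM phase profiles
    exp(i*l*phi), sampled uniformly over [0, 2*pi). Returns ~1 when
    l1 == l2 and exactly 0 (up to rounding) for distinct integer orders."""
    phi = np.linspace(0, 2 * np.pi, samples, endpoint=False)
    a = np.exp(1j * l1 * phi)
    b = np.exp(1j * l2 * phi)
    return np.vdot(a, b) / samples  # vdot conjugates the first argument
```

For uniform sampling the discrete sum is an exact geometric series, so distinct integer orders are orthogonal even at finite sample counts; this is the property that lets each of the four modes be separated without joint MIMO processing at the receiver.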
The increasing capacity of storage devices and the growing frequency of correlated failures demand better fault tolerance by using maximum distance separable (MDS) array codes with triple parity. Although many constructions of triple-parity MDS array codes have been proposed, they either have large update complexity, have large encoding/decoding complexity, or support only specific parameters. In this paper, we propose two classes of triple-parity MDS array codes, called extended EVENODD+ and STAR+, both of which have asymptotically optimal update complexity and lower encoding/decoding complexity. We show that the existing extended EVENODD and STAR codes are special cases of our extended EVENODD+ and STAR+, respectively. Moreover, we show that our extended EVENODD+ and STAR+ have strictly lower encoding/decoding/update complexity than extended EVENODD and STAR for most parameters.
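For context (a sketch of the classic double-parity EVENODD encoding that the proposed codes extend, not of extended EVENODD+/STAR+ themselves), the code below illustrates why update complexity matters: a data bit on EVENODD's special diagonal forces an update of every diagonal-parity bit, whereas an off-diagonal bit touches only two parity bits. STAR adds a third, anti-diagonal parity column to reach triple parity.

```python
def evenodd_parity(data, p):
    """Classic EVENODD encoding (Blaum-Brady-Bruck-Menon): data is a
    (p-1) x p bit array, p prime. Returns the row-parity column and the
    diagonal-parity column, each of length p-1."""
    rows = [list(r) for r in data] + [[0] * p]  # append the imaginary zero row
    row_par = [0] * (p - 1)
    for i in range(p - 1):
        for j in range(p):
            row_par[i] ^= rows[i][j]
    # Adjuster S: parity of the special diagonal through the imaginary row
    s = 0
    for t in range(1, p):
        s ^= rows[p - 1 - t][t]
    diag_par = [0] * (p - 1)
    for i in range(p - 1):
        q = s
        for j in range(p):
            q ^= rows[(i - j) % p][j]
        diag_par[i] = q
    return row_par, diag_par
```

Starting from an all-zero 4x5 array (p = 5), flipping data bit (0, 0) changes two parity bits (one row parity, one diagonal parity), while flipping bit (3, 1), which lies on the special diagonal, changes five (one row parity plus all four diagonal parities via the adjuster S). This non-uniform update cost is the kind of overhead that update-optimal constructions aim to eliminate.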