Job Shop Scheduling is a combinatorial optimization problem of particular importance for production environments, where the goal is to complete a production task in the shortest possible time given limitations in the available resources. Due to its computational complexity, it quickly becomes intractable for problems of interesting size. The emerging technology of Quantum Annealing provides an alternative computational architecture that promises improved scalability and solution quality. However, several limitations as well as open research questions exist in this relatively new and rapidly developing technology. This paper studies the application of quantum annealing to the job shop scheduling problem, describing each step required, from the problem formulation to the fine-tuning of the quantum annealer, and compares the solution quality with that of various classical solvers. Particular attention is devoted to aspects that are often overlooked, such as the computational cost of representing the problem in the formulation required by the quantum annealer, the associated qubit requirements, and how to mitigate chain breaks. Furthermore, the impact of advanced tools such as reverse annealing is presented and their effectiveness discussed. The results indicate several challenges emerging at various stages of the experimental pipeline, which bring forward important research questions and directions for improvement.
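The formulation step mentioned above can be illustrated with a minimal sketch. The example below encodes a toy single-machine instance as a QUBO with time-indexed binary variables (job j starts at time t), penalty terms for "each job starts exactly once" and "no two jobs overlap", and a small early-start bias; the instance, penalty weight A, and the bias are illustrative assumptions, and exhaustive search stands in for the quantum annealer on this tiny problem.

```python
from itertools import product

# Toy single-machine instance: two jobs with durations 2 and 1, horizon T = 3.
durations = [2, 1]
T = 3
# Binary variable (j, t): job j starts at time step t.
variables = [(j, t) for j, d in enumerate(durations) for t in range(T - d + 1)]

A = 10.0  # penalty weight (assumed large enough to enforce the constraints)
Q = {}    # QUBO as a dict mapping (var_u, var_v) -> coefficient

def add(u, v, w):
    key = (u, v) if u <= v else (v, u)
    Q[key] = Q.get(key, 0.0) + w

# Constraint: each job starts exactly once -> A * (sum_t x_{j,t} - 1)^2.
for j, d in enumerate(durations):
    starts = [(j, t) for t in range(T - d + 1)]
    for u in starts:
        add(u, u, -A)          # linear part: A*(x^2 - 2x), with x^2 = x
    for i, u in enumerate(starts):
        for v in starts[i + 1:]:
            add(u, v, 2 * A)   # quadratic part of the squared penalty

# Constraint: no two jobs may occupy the machine at the same time step.
for (j1, t1) in variables:
    for (j2, t2) in variables:
        if j1 < j2 and t1 < t2 + durations[j2] and t2 < t1 + durations[j1]:
            add((j1, t1), (j2, t2), A)

# Small objective term: prefer early starts (a crude makespan proxy).
for (j, t) in variables:
    add((j, t), (j, t), 0.1 * t)

def energy(assign):
    return sum(w * assign[u] * assign[v] for (u, v), w in Q.items())

# Exhaustive search stands in for the annealer on this toy instance.
best = min(product([0, 1], repeat=len(variables)),
           key=lambda bits: energy(dict(zip(variables, bits))))
schedule = {j: t for (j, t), b in zip(variables, best) if b}
print(schedule)
```

Even this toy instance shows the overhead the paper highlights: a 2-job, 3-step problem already needs 5 binary variables, and the count grows with both the number of jobs and the time horizon.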
The design of algorithms that generate personalized ranked item lists is a central topic of research in the field of recommender systems. In the past few years, in particular, approaches based on deep learning (neural) techniques have become dominant in the literature. For all of them, substantial progress over the state-of-the-art is claimed. However, indications exist of certain problems in today’s research practice, e.g., with respect to the choice and optimization of the baselines used for comparison, raising questions about the published claims. To obtain a better understanding of the actual progress, we have compared recent results in the area of neural recommendation approaches based on collaborative filtering against a consistent set of existing simple baselines. The worrying outcome of the analysis of these recent works—all were published at prestigious scientific conferences between 2015 and 2018—is that 11 of the 12 reproducible neural approaches can be outperformed by conceptually simple methods, e.g., based on the nearest-neighbor heuristic or linear models. None of the computationally complex neural methods was actually consistently better than already existing learning-based techniques, e.g., using matrix factorization or linear models. In our analysis, we discuss common issues in today’s research practice, which, despite the many papers that are published on the topic, have apparently led the field to a certain level of stagnation.
The promise of quantum computing to open new unexplored possibilities in several scientific fields has been long discussed, but until recently the lack of a functional quantum computer has confined this discussion mostly to theoretical algorithmic papers. It was only in the last few years that small but functional quantum computers have become available to the broader research community. One paradigm in particular, quantum annealing, can be used to sample optimal solutions for a number of NP-hard optimization problems represented with classical operations research tools, providing easy access to the potential of this emerging technology. One of the tasks that most naturally fits in this mathematical formulation is feature selection. In this paper, we investigate how to design a hybrid feature selection algorithm for recommender systems that leverages the domain knowledge and behavior hidden in the user interaction data. We represent the feature selection as an optimization problem and solve it on a real quantum computer, provided by D-Wave. The results indicate that the proposed approach is effective in selecting a limited set of important features and that quantum computers are becoming powerful enough to enter the wider realm of applied science.
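Feature selection maps naturally onto a QUBO: linear terms reward each feature's relevance, quadratic terms penalize redundancy between features, and a squared penalty fixes the number of selected features. The sketch below uses made-up relevance and redundancy scores and a brute-force minimizer in place of the D-Wave sampler; the actual paper derives such scores from user-interaction data, and the penalty weight A is an assumption.

```python
from itertools import product

# Hypothetical per-feature relevance scores and pairwise redundancy values
# (illustrative stand-ins for statistics derived from interaction data).
relevance = [0.9, 0.8, 0.75, 0.1]
redundancy = {(0, 1): 0.85, (0, 2): 0.10, (1, 2): 0.15,
              (0, 3): 0.05, (1, 3): 0.05, (2, 3): 0.05}

k = 2        # number of features to keep
A = 2.0      # penalty weight for the cardinality constraint (assumed)
n = len(relevance)

def qubo_energy(bits):
    # Maximize relevance and minimize redundancy -> minimize the negation.
    e = -sum(relevance[i] * bits[i] for i in range(n))
    e += sum(w * bits[i] * bits[j] for (i, j), w in redundancy.items())
    e += A * (sum(bits) - k) ** 2   # enforce exactly k selected features
    return e

# Exhaustive search stands in for the quantum annealer on this toy instance.
best = min(product([0, 1], repeat=n), key=qubo_energy)
selected = [i for i, b in enumerate(best) if b]
print(selected)
```

Note that the two most relevant features (0 and 1) are not selected together: their high mutual redundancy makes the pair {0, 2} cheaper, which is exactly the trade-off the quadratic terms are meant to capture.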
Deep learning techniques have become the method of choice for researchers working on algorithmic aspects of recommender systems. With the strongly increased interest in machine learning in general, it has, as a result, become difficult to keep track of what represents the state-of-the-art at the moment, e.g., for top-n recommendation tasks. At the same time, several recent publications point out problems in today's research practice in applied machine learning, e.g., in terms of the reproducibility of the results or the choice of the baselines when proposing new models.
In this work, we report the results of a systematic analysis of algorithmic proposals for top-n recommendation tasks. Specifically, we considered 18 algorithms that were presented at top-level research conferences in recent years. Only 7 of them could be reproduced with reasonable effort. For these methods, however, it turned out that 6 of them can often be outperformed with comparably simple heuristic methods, e.g., based on nearest-neighbor or graph-based techniques. The remaining one clearly outperformed the baselines but did not consistently outperform a well-tuned non-neural linear ranking method. Overall, our work sheds light on a number of potential problems in today's machine learning scholarship and calls for improved scientific practices in this area.
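The nearest-neighbor baselines mentioned above are conceptually simple. The sketch below implements a bare-bones item-based KNN recommender over a tiny implicit-feedback matrix: item-item cosine similarities are precomputed, and unseen items are scored by their similarity to the user's interacted items. The matrix values are illustrative, not from any paper's dataset, and real baselines add tuned details (neighborhood size, shrinkage, normalization) omitted here.

```python
import math

# Tiny implicit-feedback matrix: rows = users, columns = items (1 = interaction).
R = [
    [1, 1, 0, 0, 1],
    [1, 1, 1, 0, 0],
    [0, 1, 1, 1, 0],
    [1, 0, 0, 1, 1],
]
n_items = len(R[0])

def item_vector(i):
    return [row[i] for row in R]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

# Precompute the item-item cosine similarity matrix (zero on the diagonal).
sim = [[cosine(item_vector(i), item_vector(j)) if i != j else 0.0
        for j in range(n_items)] for i in range(n_items)]

def recommend(user, n=2):
    """Score unseen items by total similarity to the user's seen items."""
    seen = [i for i in range(n_items) if R[user][i]]
    scores = {i: sum(sim[i][j] for j in seen)
              for i in range(n_items) if i not in seen}
    return sorted(scores, key=scores.get, reverse=True)[:n]

print(recommend(0))
```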
The development of information technology and, very recently, of new application scenarios like the Internet of Things (IoT) and Industry 4.0 has pushed research towards new technological platforms. In this context, the spatial freedom permitted by flexible short-range connections and flexible devices has become as important as “classical” parameters such as low weight, low power consumption, and electromagnetic immunity. Accordingly, first flexible electronics and later flexible photonics have produced innovative components and devices. The present paper aims at presenting a brief overview of this broad area, underlining the achievements and the remaining challenges in the different routes to the manufacturing of flexible photonic devices. Material platforms remain at the core of such developments, and it is interesting to note that glassy materials still constitute a fundamental piece in the present and future scenario.
• A short summary of the early steps of flexible electronics is presented.
• Early developments of flexible photonics are discussed.
• Various material platforms of flexible photonics are reviewed:
∙ Semiconductor-on-polymer flexible structures
∙ All-polymeric structures
∙ Glass-on-polymer structures
∙ Flexible glass substrates and components
• Further potential developments are outlined.
The investigation of genetic forms of juvenile neurodegeneration could shed light on the causative mechanisms of neuronal loss. Schinzel-Giedion syndrome (SGS) is a fatal developmental syndrome caused by mutations in the SETBP1 gene, inducing the accumulation of its protein product. SGS features multi-organ involvement with severe intellectual and physical deficits due, at least in part, to early neurodegeneration. Here we introduce a human SGS model that displays disease-relevant phenotypes. We show that SGS neural progenitors exhibit aberrant proliferation, deregulation of oncogenes and suppressors, unresolved DNA damage, and resistance to apoptosis. Mechanistically, we demonstrate that high SETBP1 levels inhibit P53 function through the stabilization of SET, which in turn hinders P53 acetylation. We find that the inheritance of unresolved DNA damage in SGS neurons triggers the neurodegenerative process, which can be alleviated either by PARP-1 inhibition or by NAD+ supplementation. These results indicate that neuronal death in SGS originates from developmental alterations, mainly in safeguarding cell identity and homeostasis.
As of today, most movie recommendation services base their recommendations on collaborative filtering (CF) and/or content-based filtering (CBF) models that use metadata (e.g., genre or cast). In most video-on-demand and streaming services, however, new movies and TV series are continuously added. CF models are unable to make predictions in such a scenario, since the newly added videos lack interactions—a problem technically known as new item cold start (CS). Currently, the most common approach to this problem is to switch to a purely CBF method, usually by exploiting textual metadata. This approach is known to have lower accuracy than CF because it ignores useful collaborative information and relies on human-generated textual metadata, which are expensive to collect and often prone to errors. User-generated content, such as tags, can also be rare or absent in CS situations. In this paper, we introduce a new movie recommender system that addresses the new item problem in the movie domain by (i) integrating state-of-the-art audio and visual descriptors, which can be automatically extracted from video content and constitute what we call the movie genome; (ii) exploiting an effective data fusion method named canonical correlation analysis, which was successfully tested in our previous works (Deldjoo et al., in: International Conference on Electronic Commerce and Web Technologies, Springer, Berlin, pp 34–45, 2016b; Proceedings of the Twelfth ACM Conference on Recommender Systems, ACM, 2018b), to better exploit complementary information between different modalities; (iii) proposing a two-step hybrid approach which trains a CF model on warm items (items with interactions) and leverages the learned model on the movie genome to recommend cold items (items without interactions). Experimental validation is carried out using a system-centric study on a large-scale, real-world movie recommendation dataset, both in an absolute cold-start setting and in a cold-to-warm transition, and a user-centric online experiment measuring different subjective aspects, such as satisfaction and diversity. Results show the benefits of this approach compared to existing approaches.
The unique properties of the Eu3+ ion make it a powerful spectroscopic tool to investigate structure or follow processes and mechanisms in several high-tech application areas such as biology and health, structural engineering, environment monitoring systems and quantum technology, mainly concerning photonics. The traditional method is to exploit the unique photoluminescent properties of Eu3+ ions to understand complex dynamical processes and obtain information useful to develop materials with specific characteristics. The objective of this review is to focus on the use of Eu3+ optical spectroscopy in some condensed matter issues. After a short presentation of the more significant properties of the Eu3+ ion, some examples regarding its use as a probe of the local structure in sol–gel systems are presented. Another section is devoted to dynamical processes, such as the important technological role of nanocrystals as rare-earth sensitizers. The appealing site-selection memory effect, in which the 5D0 → 7F0 emission band reflects the distribution of sites when different sites are excited into the 5D1 state, is also mentioned. Finally, a section is devoted to the use of Eu3+ in the development of a rare-earth-based platform for quantum technologies.
ADPKD is erroneously perceived as a common condition, mainly because of the repeated citation of a mistaken interpretation of old epidemiological data reported in Dalgaard's work (1957). Although ADPKD is not a common condition, its true prevalence in the general population is uncertain, with a wide range of estimates reported by different authors. In this work, we have performed a meta-analysis of the available epidemiological data in the European literature. Furthermore, we collected diagnosis and clinical data of ADPKD in a province in the north of Italy (Modena). We describe the point and predicted prevalence of ADPKD, as well as its main clinical characteristics, in this region.
We searched the PubMed, CINAHL, Scopus and Web of Science databases for epidemiological data according to specific parameters and criteria. Data were summarized using linear regression analysis. We collected patients' diagnoses in the Province of Modena according to accepted clinical criteria and/or molecular analysis. Predicted prevalence was calculated through a logistic regression prediction applied to the at-risk population.
The average prevalence of ADPKD, as obtained from 8 epidemiological studies of sufficient quality, is 2.7 per 10,000 (95% CI 0.73-4.67). The point prevalence of ADPKD in the Province of Modena is 3.63 per 10,000 (95% CI 3.010-3.758). On the basis of the collected pedigrees and the identification of at-risk subjects, the predicted prevalence in the Province of Modena is 4.76 per 10,000 (95% CI 4.109-4.918).
As identified in our study, the point prevalence is comparable with that of the majority of studies in the literature, while the predicted prevalence (4.76 per 10,000) generally appears higher than previous estimates, with a few exceptions. This suggests that the number of undiagnosed ADPKD subjects, as predicted by our approach, could be substantial, and that these patients will most likely require more clinical attention. Nevertheless, our estimate, like the average derived from the literature, does not exceed the limit of 5 per 10,000 inhabitants and is therefore compatible with the definition of rare disease adopted by the European Medicines Agency and the Food and Drug Administration.
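The per-10,000 prevalence figures above follow from a simple proportion with a binomial confidence interval. The sketch below illustrates the calculation with made-up case and population counts (not the actual Modena figures), using the normal approximation for the 95% CI; the study's own intervals may come from a different method.

```python
import math

# Hypothetical counts: diagnosed cases and resident population
# (illustrative numbers, not the actual Modena data).
cases, population = 255, 700_000

# Point prevalence expressed per 10,000 inhabitants.
p = cases / population
prevalence_per_10k = p * 10_000

# 95% CI via the normal approximation to the binomial proportion.
se = math.sqrt(p * (1 - p) / population)
lo = (p - 1.96 * se) * 10_000
hi = (p + 1.96 * se) * 10_000
print(f"{prevalence_per_10k:.2f} per 10,000 (95% CI {lo:.2f}-{hi:.2f})")
```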