In this paper, an implementation of a smart predictive monitoring and adaptive control system for public lighting has been carried out. The vehicular traffic flow acquired using a smart camera has been analyzed, and several predictive methods have been studied. Then, a control strategy based on the resulting traffic forecasts and on the dynamic street-class downgrade allowed by law has been implemented. Experimental results from a real-life testbed showed that the proposed strategy offers high potential energy savings without affecting safety.
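The core of such a control strategy, mapping a traffic forecast to a temporarily downgraded lighting class and dimming level, can be illustrated with a toy rule. The thresholds (vehicles per hour) and power fractions below are invented for illustration and do not come from the paper or from any specific road-lighting norm.

```python
# Hypothetical sketch of forecast-driven adaptive dimming.
# Thresholds (vehicles/hour) and power levels are illustrative only;
# a real system would take them from the applicable road-lighting standard.
CLASS_THRESHOLDS = [(1000, "M2"), (500, "M3"), (100, "M4"), (0, "M5")]
POWER_LEVEL = {"M2": 1.00, "M3": 0.75, "M4": 0.50, "M5": 0.35}

def dimming_level(forecast_vph):
    """Pick a lighting class from the predicted traffic flow and
    return (class, fraction of full lamp power)."""
    for threshold, cls in CLASS_THRESHOLDS:  # thresholds in descending order
        if forecast_vph >= threshold:
            return cls, POWER_LEVEL[cls]
```

At each control interval the predictor would supply `forecast_vph` for the next period, and the luminaires would be driven at the returned power fraction.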
Objective
Dosimetric comparison of HIPO (hybrid inverse planning optimisation) and IPSA (inverse planning simulated annealing) inverse optimisation and forward optimisation (FO) methods in brachytherapy (BT) of breast, cervical and prostate cancer.
Methods
At our institute, 38 breast, 47 cervical and 50 prostate cancer patients treated with image-guided interstitial high-dose-rate BT were selected. Treatment plans were created using the HIPO and IPSA inverse optimisation methods as well as FO. The dose–volume parameters of the different treatment plans were compared with Friedman ANOVA and the LSD post-hoc test.
Results
IPSA provides lower dose coverage to the target volume than HIPO or FO: V100 was 91.7%, 91% and 91.9% for HIPO, IPSA and FO plans (p = 0.1784) in breast BT; 90.4%, 89.2% and 91% (p = 0.0045) in cervical BT; and 97.1%, 96.2% and 97.7% (p = 0.0005) in prostate BT, respectively. HIPO results in more conformal plans: COIN was 0.72, 0.71 and 0.69 (p = 0.0306) in breast BT; 0.6, 0.47 and 0.58 (p < 0.001) in cervical BT; and 0.8, 0.7 and 0.7 (p < 0.001) in prostate BT, respectively. In breast BT, the dose to the skin and lung was smaller with HIPO and FO than with IPSA. In cervical BT, the dose to the rectum, sigmoid and bowel was larger with IPSA than with HIPO or FO. In prostate BT, the dose to the urethra was higher and the rectal dose smaller with FO than with the inverse methods.
Conclusion
In interstitial breast and prostate BT, HIPO results in dose–volume parameters comparable to FO, but HIPO plans are more conformal. In cervical BT, HIPO produces dosimetrically acceptable plans only when more needles are used. The dosimetric quality of IPSA plans is suboptimal and results in unnecessarily large active lengths.
Stylometry is one of the fastest-developing research areas within the Digital Humanities. However, until recently few studies have worked with texts in Spanish, and even fewer from Spanish-speaking countries. The aim of this paper is to present in Spanish, without assuming prior statistical knowledge on the part of the reader, one of the main methods used in stylometry: the textual distance measure Burrows' Delta. The paper explains this measure using a very small corpus of proverbs and then verifies the results on a corpus of Spanish novels. Both the data and the Python scripts are available to the community on GitHub, commented step by step so that each step can be reproduced and visualized.
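The Delta measure described in this abstract can be sketched in a few lines of Python. This is an illustrative implementation of the standard definition (z-score the relative frequencies of the most frequent words across the corpus, then average the absolute z-score differences between two texts), not the paper's published scripts; the corpus and word counts are invented examples.

```python
# Illustrative sketch of Burrows' Delta (not the paper's GitHub scripts).
from collections import Counter
import statistics

def relative_freqs(text, vocab):
    """Relative frequency of each vocabulary word in one text."""
    words = text.lower().split()
    counts = Counter(words)
    return [counts[w] / len(words) for w in vocab]

def burrows_delta(corpus, target, n_mfw=3):
    """Delta distance from `target` to every other text in `corpus`
    (a dict name -> text), using the n_mfw most frequent words."""
    all_words = Counter()
    for text in corpus.values():
        all_words.update(text.lower().split())
    vocab = [w for w, _ in all_words.most_common(n_mfw)]
    freqs = {name: relative_freqs(t, vocab) for name, t in corpus.items()}
    # z-score each word's frequency across the corpus
    means = [statistics.mean(f[i] for f in freqs.values()) for i in range(len(vocab))]
    stdevs = [statistics.pstdev(f[i] for f in freqs.values()) or 1.0
              for i in range(len(vocab))]
    z = {name: [(f[i] - means[i]) / stdevs[i] for i in range(len(vocab))]
         for name, f in freqs.items()}
    # Delta = mean absolute difference of z-scores
    return {name: sum(abs(a - b) for a, b in zip(z[target], zs)) / len(vocab)
            for name, zs in z.items() if name != target}
```

With a real corpus, `n_mfw` is usually in the hundreds; identical texts have Delta 0, and stylistically distant texts score higher.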
The paper deals with the Fleet Composition Problem (FCP) for a 2-echelon fuel distribution system, composed of a regional fuel warehouse (depot) and 100 gas stations (customers). The customers' orders include random quantities of 4 types of fuel, which are distributed by a specialized fleet of road tankers. The fleet includes 4- and 8-chamber tankers characterized by different load capacities and fixed and variable costs. Since different types of fuel cannot be mixed during transportation, a specific assignment of orders (transportation tasks) to the vehicles (their chambers) is required. The decision problem consists in composing an optimal fleet of tankers, i.e. in defining the optimal types of tankers to be used and the optimal number of vehicles of each type. It is considered as a vehicle assignment problem in which different types of vehicles are assigned to customers' orders, and it is formulated as a single-objective mathematical programming problem whose optimization criterion is the total daily distribution cost. Two alternative formulations of the decision problem are proposed, based on different definitions of the vehicles to be assigned to transportation tasks. To solve the problem in both formulations, specialized heuristic procedures are constructed, based on local search (LS), evolutionary algorithms (EA) and a hybrid algorithm (LS + EA). Computational experiments are carried out, and their results are presented and compared in the paper.
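The chamber-assignment constraint at the heart of the problem (no mixing of fuel types, so each order occupies whole chambers) can be illustrated with a deliberately simplified cost comparison. The tanker parameters below are invented, and this single-type bound ignores routing and variable costs; it is not the paper's LS/EA heuristics, only a sketch of why fleet composition is a trade-off between chamber count and fixed cost.

```python
# Hypothetical sketch of the fleet-composition trade-off (parameters invented).
import math

TANKERS = {  # chambers per tanker, chamber capacity [l], fixed daily cost
    "4-chamber": {"chambers": 4, "capacity": 5000, "fixed_cost": 300},
    "8-chamber": {"chambers": 8, "capacity": 5000, "fixed_cost": 450},
}

def fleet_cost(orders, tanker):
    """Minimum tankers of one type to carry all orders, and the fixed cost.
    Each fuel order needs its own chamber(s): fuels cannot be mixed."""
    spec = TANKERS[tanker]
    chambers_needed = sum(math.ceil(q / spec["capacity"]) for q in orders)
    n_tankers = math.ceil(chambers_needed / spec["chambers"])
    return n_tankers, n_tankers * spec["fixed_cost"]

def best_fleet(orders):
    """Cheapest single-type fleet: (type, number of tankers, cost)."""
    return min(((t, *fleet_cost(orders, t)) for t in TANKERS),
               key=lambda x: x[2])
```

The actual problem is harder: mixed fleets, per-kilometre variable costs, and random daily demand, which is why the paper resorts to LS, EA and hybrid heuristics.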
To face the advent of multicore processors and the ever-increasing complexity of hardware architectures, programming models based on DAG parallelism regained popularity in the high-performance scientific computing community. Modern runtime systems offer a programming interface that complies with this paradigm and powerful engines for scheduling the tasks into which the application is decomposed. These tools have already proved their effectiveness on a number of dense linear algebra applications. In this study we investigate the design, on top of runtime systems, of task-based sparse direct solvers, which constitute extremely irregular workloads with tasks of different granularities and characteristics and variable memory consumption. In the context of the qr mumps solver, we demonstrate the usability and effectiveness of our approach with the implementation of a sparse matrix multifrontal factorization based on a Sequential Task Flow parallel programming model. Using this programming model, we developed features such as the integration of dense 2D Communication Avoiding algorithms in the multifrontal method, allowing for better scalability compared to the original approach used in qr mumps. In addition, we introduced a memory-aware algorithm to control the memory behaviour of our solver and show, in the context of multicore architectures, a significant reduction of the memory footprint for the multifrontal QR factorization with a small impact on performance. Following this approach, we move to heterogeneous architectures, where task granularity and scheduling strategies are critical to achieve performance. We present, for the multifrontal method, a hierarchical strategy for data partitioning and a scheduling algorithm capable of handling the heterogeneity of resources. Finally, we present a study on the reproducibility of executions and the use of alternative programming models for the implementation of the multifrontal method.
All the experimental results presented in this study are evaluated with a detailed performance analysis measuring the impact of several identified effects on the performance and scalability. Thanks to this original analysis, presented in the first part of this study, we are capable of fully understanding the results obtained with our solver.
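The Sequential Task Flow model mentioned above can be illustrated with a toy runtime: tasks are submitted in program order, and the runtime infers dependencies from the data each task declares it reads or writes. This is an invented sketch in Python, not qr mumps code (which sits on top of a real runtime system such as StarPU), and for brevity it tracks only read-after-write dependencies.

```python
# Toy illustration of the Sequential Task Flow (STF) idea: sequential task
# submission, dependencies inferred from declared data accesses.
# Simplification: only read-after-write dependencies are tracked
# (a real runtime also handles write-after-read and write-after-write).
from concurrent.futures import ThreadPoolExecutor

class STFRuntime:
    def __init__(self):
        self.pool = ThreadPoolExecutor(max_workers=2)
        self.last_writer = {}  # data handle -> future of the task that wrote it

    def submit(self, fn, reads=(), writes=()):
        # A task depends on the last writer of every handle it touches.
        deps = [self.last_writer[h] for h in (*reads, *writes)
                if h in self.last_writer]
        def run():
            for d in deps:       # wait for every inferred dependency
                d.result()
            return fn()
        fut = self.pool.submit(run)
        for h in writes:         # later tasks touching h must wait for us
            self.last_writer[h] = fut
        return fut
```

Because dependencies only ever point at earlier-submitted tasks, the execution order respects the sequential semantics of the submission loop while independent tasks run concurrently, which is the essence of STF.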
Our paper presents the automatic generation of high-resolution urban digital elevation models (DEMs) based on a highly redundant correlation process. We discuss the difficulties of such a task by commenting on the state of the art, and we propose an approach in three main steps. In the first step, specifying image acquisition as image sequences yields pairs with various base-to-height ratios, in order to obtain good precision and few errors due to hidden parts. In the second step, we apply various stereovision methods and merge the results, attributing to each pixel the most probable and precise elevation. In the third step, we automatically extract a terrain DEM and a building DEM from the computed DEM in order to post-process each class specifically. Finally, we combine these two DEMs to generate a final DEM that presents the best continuity for the ground surface while respecting sharp building discontinuities. The results obtained with an operational example (including image size and scene difficulty) demonstrate the feasibility of generating metric-resolution urban databases with automated digital stereo methods.
Severe pelvic injuries are often complicated by abdominal lesions. The main problem is excessive bleeding with threatening exsanguination. Bleeding originates mostly from the presacral venous plexus. By external fixation and reduction of the pelvic volume, the blood loss can be diminished or stopped. Laparotomy before external fixation will increase fracture dislocation and provoke further severe bleeding through loss of the tension-band effect that the intact abdominal wall exerts on the pelvis, protecting the latter against further symphyseal diastasis.
Background
The “Minimalistic Hybrid Approach” (MHA) has been proposed to reduce the invasiveness of chronic total occlusion (CTO) percutaneous coronary intervention (PCI).
Aims
This study aims to assess whether MHA may also reduce the utilization of PCI resources (devices, radiation, and contrast) by comparing it with other conventional algorithms.
Methods
We aimed to assess the impact of MHA on device, radiation, and contrast usage during CTO‐PCI analyzing data from the Belgian Working Group on CTO (BWG‐CTO) registry. Patients were divided, depending on the algorithm used, into two groups: Conventional versus Minimalistic. Primary objectives were procedure performance measures such as device usage (microcatheters and guidewires), radiological parameters, and contrast use. At 1‐year follow‐up, patients were evaluated for target vessel failure (TVF), defined as a composite of cardiac death, new myocardial infarction, and target vessel revascularization.
Results
Overall, we analyzed 821 CTO‐PCIs (Conventional n = 650, Minimalistic n = 171). The Minimalistic group demonstrated higher complexity of CTO lesions. After adjusting for propensity score, the Minimalistic group had a significantly lower number of microcatheters used (1.49 ± 0.85 vs. 1.24 ± 0.64, p = 0.026), while the number of guidewires was comparable (4.80 ± 3.29 vs. 4.35 ± 2.94, p = 0.30). Both groups had similar rates of success and procedural complications, as well as comparable procedural and fluoroscopic times and contrast volume used. At the 1‐year follow‐up, both groups showed comparable rates of TVF (hazard ratio: 0.57; 95% confidence interval: 0.24–1.34, p = 0.195).
Conclusion
The MHA may slightly reduce the number of dedicated devices used during CTO‐PCI, without adversely affecting the procedural success or long‐term outcome.