Urban liveability is a key concept in the New Urban Agenda (NUA) adopted by the United Nations (UN) in 2016. The UN has recognized that effective benchmarks and monitoring mechanisms are essential for the successful implementation of the NUA. However, the timely and cost-effective collection of objective, internationally comparable urban quality-of-life data remains a significant challenge. Urban liveability indexes are often complex, resource-intensive, and time-consuming to collect, and as a result costly. At the same time, competing methodologies and agendas may result in subjective or non-comparable data. Historically, transit has been a central organizing factor around which communities have been built. This paper explores the use of Uber data as a simple real-time indicator of urban liveability. Using data from the Uber Ride Request (URR) API for the Brazilian city of Natal, our preliminary findings suggest that Uber Estimated Time to Arrive (ETA) data is strongly correlated with selected quality-of-life indicators at the neighborhood and region level. Furthermore, unlike other urban liveability indicators, our findings suggest that Uber ETA data is context-sensitive, reflecting daily and seasonal factors and thereby providing more granular insights. This preliminary study finds strong evidence that Uber data can provide a simple, comparable, low-cost, international urban liveability indicator at both the city and neighborhood level for urban policy setting and planning.
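The correlation analysis described above can be sketched as follows. This is a minimal, stdlib-only illustration of relating per-neighborhood ETA to a quality-of-life score via a Pearson correlation coefficient; the values below are synthetic and hypothetical, not the paper's Natal data.

```python
from statistics import mean, pstdev

def pearson(xs, ys):
    # Pearson correlation coefficient between two equal-length series.
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)
    return cov / (pstdev(xs) * pstdev(ys))

# Hypothetical per-neighborhood values: mean Uber ETA (minutes) and a
# composite quality-of-life score (higher = better); synthetic, for illustration.
eta_minutes = [3.2, 4.1, 5.0, 6.8, 8.5, 10.1]
qol_score   = [0.91, 0.85, 0.80, 0.66, 0.55, 0.48]

r = pearson(eta_minutes, qol_score)
print(f"Pearson r = {r:.3f}")  # strongly negative: longer ETAs, lower liveability
```

A strongly negative coefficient here would indicate that neighborhoods with longer pickup times tend to score lower on liveability, which is the kind of relationship the abstract reports.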
Mobile network traffic prediction is an important input into network capacity planning and optimization. Existing approaches may be too slow or computationally complex to account for bursting, non-linear patterns or other important correlations in time series mobile network data. We compare the performance of two deep learning architectures, Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU), for predicting mobile Internet traffic using two months of Telecom Italia data for the metropolitan area of Milan. K-Means clustering was used a priori to group cells based on Internet activity, and grid search was used to identify the best configuration for each model. The predictive quality of the models was evaluated using root mean squared error. Both deep learning models were effective in modeling Internet activity and seasonality, both within days and across the two months. We find variations in performance across clusters within the city. Overall, the LSTM outperformed the GRU in our experiments.
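The a priori clustering step can be illustrated with a small, stdlib-only sketch of K-Means (Lloyd's algorithm) on scalar per-cell activity levels. The activity values are synthetic and hypothetical, and the deterministic quantile initialization is a simplification for reproducibility, not the paper's exact setup.

```python
def kmeans_1d(values, k, iters=20):
    # Lloyd's algorithm on scalar activity levels. Deterministic quantile
    # initialization; each cell joins the nearest centroid, then centroids
    # are recomputed as cluster means.
    svals = sorted(values)
    centroids = [svals[(len(svals) * (2 * j + 1)) // (2 * k)] for j in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            nearest = min(range(k), key=lambda j: abs(v - centroids[j]))
            clusters[nearest].append(v)
        # Keep the old centroid if a cluster ends up empty.
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids, clusters

# Hypothetical mean Internet activity per cell (arbitrary units): three
# clearly separated groups of low-, mid-, and high-activity cells.
activity = [5, 6, 7, 40, 45, 48, 120, 125, 130]
centroids, clusters = kmeans_1d(activity, k=3)
print([round(c, 2) for c in sorted(centroids)])  # → [6.0, 44.33, 125.0]
```

Grouping cells this way before training lets each cluster get its own tuned LSTM or GRU configuration, which is consistent with the per-cluster performance variation the abstract reports.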
To meet service level agreement (SLA) requirements, the majority of enterprise IT infrastructure is typically overprovisioned, underutilized, non-compliant, and lacking in required agility, resulting in significant inefficiencies. As enterprises introduce and migrate to next-generation applications designed to be horizontally scalable, they require infrastructure that can manage the duality of legacy and next-generation application requirements. To address this, composable data center infrastructure disaggregates and refactors compute, storage, network, and other infrastructure resources into shared resource pools that can be "composed" and allocated on-demand. In this paper, we model a theoretical problem of resource allocation in a composable data center infrastructure as a bounded multidimensional knapsack and then apply two multi-objective optimization algorithms, the Non-dominated Sorting Genetic Algorithm (NSGA-II) and Generalized Differential Evolution (GDE3), to allocate resources efficiently. The main goal is to maximize resource availability for the application owner, while meeting minimum requirements (in terms of CPU, memory, network, and storage) within budget constraints. We consider two different scenarios to analyze heterogeneity and variability aspects when allocating resources on composable data center infrastructure.
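The knapsack formulation above can be made concrete with a toy instance. For illustration this sketch replaces NSGA-II/GDE3 with exhaustive search over a bounded number of copies of each composable module, and collapses the multiple objectives into one normalized score; all module names, capacities, costs, and requirements are hypothetical.

```python
from itertools import product

# Hypothetical composable resource modules an orchestrator may "compose".
# Each entry: (name, cpu_cores, memory_gb, network_gbps, storage_tb, cost)
modules = [
    ("cpu-sled",  32,   0,  0,  0, 400),
    ("mem-sled",   0, 256,  0,  0, 300),
    ("nic",        0,   0, 25,  0, 150),
    ("jbod",       0,   0,  0, 50, 250),
]

MIN_REQ = (32, 256, 25, 50)   # minimum CPU, memory, network, storage
BUDGET = 2000
MAX_COPIES = 2                # bounded knapsack: at most 2 of each module

def best_allocation():
    # Exhaustive search over bounded copies of each module: an allocation is
    # feasible if the composed pool meets every minimum requirement within
    # budget; the objective maximizes total resources (an availability proxy).
    best, best_score = None, -1.0
    for counts in product(range(MAX_COPIES + 1), repeat=len(modules)):
        totals = [sum(n * m[d + 1] for n, m in zip(counts, modules))
                  for d in range(4)]
        cost = sum(n * m[5] for n, m in zip(counts, modules))
        if cost > BUDGET or any(t < r for t, r in zip(totals, MIN_REQ)):
            continue
        score = sum(t / r for t, r in zip(totals, MIN_REQ))  # normalized sum
        if score > best_score:
            best_score, best = score, counts
    return best, best_score

alloc, score = best_allocation()
print(alloc, round(score, 2))
```

At realistic scale the search space makes enumeration infeasible, which is why population-based multi-objective methods such as NSGA-II and GDE3 are applied instead; they also return a Pareto front rather than a single scalarized optimum.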