The remarkable prevalence of cloud computing has enabled smart cars to provide infotainment services. However, retrieving infotainment content from distant data centers incurs significant delay, thus hindering the delivery of latency-sensitive infotainment services. Multi-access edge computing is a promising option to meet strict latency requirements; however, it imposes severe resource constraints with respect to caching and computation. Similarly, the communication resources used to fetch infotainment content are scarce. In this paper, we jointly consider communication, caching, and computation (3C) to reduce the infotainment content retrieval delay for smart cars. We formulate the delay-minimization problem as a mixed-integer, nonlinear, nonconvex optimization. Furthermore, we relax the formulated NP-hard problem into a linear program. We then propose a joint 3C solution based on the alternating direction method of multipliers (ADMM), which operates in a distributed manner. We compare the proposed 3C solution with greedy, random, and centralized approaches. Simulation results reveal that the proposed solution reduces delay by up to 9% and 28% compared to the greedy and random approaches, respectively.
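The distributed solver above relies on the alternating direction method of multipliers. As a minimal illustration of the technique only (a toy consensus problem, not the paper's 3C model), the sketch below splits a two-term objective between two agents that reach agreement through a shared dual variable:

```python
# Toy consensus ADMM: minimize (x - a)^2 + (z - b)^2 subject to x = z.
# Each update has a closed form; u is the scaled dual variable.
def admm_consensus(a, b, rho=1.0, iters=100):
    x = z = u = 0.0
    for _ in range(iters):
        x = (2 * a + rho * (z - u)) / (2 + rho)  # x-minimization
        z = (2 * b + rho * (x + u)) / (2 + rho)  # z-minimization
        u = u + x - z                            # dual ascent
    return x, z

x, z = admm_consensus(4.0, 2.0)  # both converge to (a + b) / 2 = 3.0
```

Each agent solves only its own closed-form subproblem, which is what makes the method amenable to the distributed operation the abstract describes.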
In recent years, the cloud computing paradigm has shifted toward the edge in order to provide better quality of service (QoS) to Internet of Things (IoT) devices. However, resource capacity (e.g., bandwidth) in fog networks is limited, so IoT applications with stringent QoS requirements must be bound efficiently to the available network infrastructure. In this paper, we formulate a joint user association and resource allocation problem in the downlink of a fog network, considering the ever-growing QoS demands imposed by ultra-reliable low-latency communications and enhanced mobile broadband services. First, we determine the priority of the different QoS requirements of heterogeneous IoT applications at the fog network using an analytic hierarchy process (AHP). Building on the AHP, we then formulate a two-sided matching game to establish a stable association between the fog network infrastructure (i.e., fog devices) and the IoT devices. Subsequently, we account for the externalities in the matching game that arise from job delay and solve the network resource allocation problem by applying a "best-fit" resource allocation strategy during matching. Simulation results illustrate the stability of the user association and the efficiency of the resource allocation, with higher utility gain.
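The two-sided matching step can be illustrated with a deferred-acceptance (Gale-Shapley) sketch. The preference lists and capacities below are hypothetical stand-ins; in the paper, preferences are derived from the AHP priorities:

```python
# Deferred acceptance: IoT devices propose to fog nodes in preference
# order; each fog node keeps its most-preferred proposers up to capacity.
def deferred_acceptance(dev_prefs, fog_prefs, capacity):
    rank = {f: {d: i for i, d in enumerate(p)} for f, p in fog_prefs.items()}
    matched = {f: [] for f in fog_prefs}
    next_choice = {d: 0 for d in dev_prefs}
    free = list(dev_prefs)
    while free:
        d = free.pop()
        if next_choice[d] >= len(dev_prefs[d]):
            continue                       # device exhausted its list
        f = dev_prefs[d][next_choice[d]]
        next_choice[d] += 1
        matched[f].append(d)
        matched[f].sort(key=lambda x: rank[f][x])
        if len(matched[f]) > capacity[f]:
            free.append(matched[f].pop())  # reject least-preferred device
    return matched

# Hypothetical two-device, two-fog-node instance.
devices = {"d1": ["f1", "f2"], "d2": ["f1", "f2"]}
fogs = {"f1": ["d2", "d1"], "f2": ["d1", "d2"]}
pairing = deferred_acceptance(devices, fogs, {"f1": 1, "f2": 1})
```

The result is stable in the standard matching-theory sense: no device and fog node would both prefer each other over their assigned partners.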
An unprecedented proliferation of autonomous driving technologies has been observed in recent years, resulting in the emergence of reliable and safe transportation services. In the foreseeable future, millions of autonomous cars will communicate with each other and become prevalent in smart cities. Thus, scalable, robust, secure, fault-tolerant, and interoperable technologies are required to support such a plethora of autonomous cars. In this article, we investigate, highlight, and report premier research advances in autonomous driving by devising a taxonomy. A few indispensable requirements for the successful deployment of autonomous cars are enumerated and discussed. Furthermore, we present recent synergies and prominent case studies on autonomous driving. Finally, several imperative open research challenges are identified and discussed as future research directions.
Caching content at base stations has proven effective at reducing transmission delays. This paper investigates the caching problem in a network of highly dynamic cache-enabled Unmanned Aerial Vehicles (UAVs), which serve ground users as aerial base stations. In this scenario, UAVs share their caches to minimize the total transmission delay for requested content while simultaneously adjusting their locations. To address this challenge, we formulate a non-convex optimization problem that jointly controls UAV mobility, user association, and content caching to minimize transmission delay. Because traditional optimization approaches fall short in such a highly dynamic environment, we propose a deep reinforcement learning (RL)-based algorithm. Specifically, we employ the actor-critic-based Deep Deterministic Policy Gradient (DDPG) algorithm to solve the optimization problem effectively. We conduct extensive simulations across different cache sizes and different numbers of users associated with their home UAVs, and compare the proposed algorithm with two baselines. The proposed solution demonstrates noteworthy improvements over both baselines across all of these scenarios.
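Two standard ingredients of the DDPG algorithm referenced above are an experience replay buffer and the soft (Polyak) target-network update. The sketch below shows both on plain parameter lists rather than real neural networks; the buffer capacity and tau are common defaults, not values from the paper:

```python
import random
from collections import deque

# Experience replay: stores transitions and samples random minibatches,
# decorrelating the data the actor and critic are trained on.
class ReplayBuffer:
    def __init__(self, capacity=10000):
        self.buf = deque(maxlen=capacity)

    def push(self, state, action, reward, next_state):
        self.buf.append((state, action, reward, next_state))

    def sample(self, batch_size):
        return random.sample(self.buf, batch_size)

def soft_update(target_params, source_params, tau=0.005):
    # target <- tau * source + (1 - tau) * target, applied per parameter;
    # this slow tracking stabilizes the critic's bootstrap targets.
    return [tau * s + (1 - tau) * t
            for t, s in zip(target_params, source_params)]
```

In a full DDPG agent these pieces sit inside the training loop, between the environment step and the actor/critic gradient updates.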
Recently, the coexistence of ultra-reliable and low-latency communication (URLLC) and enhanced mobile broadband (eMBB) services on the same licensed spectrum has gained considerable attention from both academia and industry. However, the coexistence of these services is not trivial, owing to their diverse multiple access protocols, the contrasting frame distributions in the existing network, and the distinct quality-of-service requirements they pose. Such coexistence therefore leads to a challenging resource scheduling problem. To address this problem, we first investigate the possibilities for scheduling URLLC packets within incumbent eMBB traffic. In this regard, we formulate an optimization problem for coexistence that dynamically adopts a superposition or puncturing scheme; the aim is to provide spectrum access to URLLC users while minimizing disruption to incumbent eMBB users. Next, we apply a one-to-one matching game to find stable URLLC-eMBB pairs that can coexist on the same spectrum. We then apply the contract theory framework to design contracts that incentivize URLLC users to adopt the superposition scheme. Simulation results reveal that the proposed contract-based scheduling scheme achieves up to 63% of the eMBB rate of the "No URLLC" case, compared to the "Puncturing" scheme.
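The superposition-versus-puncturing trade-off can be illustrated with a toy Shannon-rate model for a single URLLC-eMBB pair. All parameters below are illustrative assumptions; the paper's contract design is considerably richer:

```python
import math

# Under superposition, the eMBB user sees the URLLC power as added
# interference; under puncturing, it keeps full SINR but only on the
# fraction of resources that survive the puncture.
def embb_rate(bandwidth, power, noise):
    return bandwidth * math.log2(1 + power / noise)

def choose_scheme(bw, p_embb, p_urllc, noise, punctured_fraction):
    r_sup = embb_rate(bw, p_embb, noise + p_urllc)
    r_pun = (1 - punctured_fraction) * embb_rate(bw, p_embb, noise)
    return ("superposition", r_sup) if r_sup >= r_pun else ("puncturing", r_pun)
```

In this toy model, light URLLC power favors superposition, while heavy URLLC power with a small punctured fraction favors puncturing, which matches the intuition behind making the choice dynamically.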
We study the problem of user clustering and power assignment for a network comprising cellular users and underlay device-to-device (D2D) users operating under a non-orthogonal multiple access (NOMA) scheme. Our goal is to maximize the sum-rate of the network by jointly optimizing user clustering and power assignment, while also providing interference protection for the cellular users. The formulated optimization problem is mixed-integer and non-convex; thus, the original problem is decomposed into two subproblems. The first subproblem, user clustering, is formulated as a matching game with externalities and solved sequentially, while the second subproblem, power assignment, is solved using complementary geometric programming. Finally, an efficient joint iterative algorithm is proposed that achieves a suboptimal solution to the mixed-integer non-convex NP-hard problem. Simulation results show that the proposed algorithm achieves performance gains of up to 70% and 92% in average sum-rate compared with the general NOMA and traditional orthogonal frequency-division multiple access (OFDMA) schemes, respectively. Moreover, our results show that the proposed scheme significantly enhances network connectivity, in terms of the number of admitted users, compared with the traditional OFDMA, NOMA, and D2D schemes.
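For intuition on the power-assignment side, the sketch below computes a two-user downlink NOMA sum-rate for a given power-split factor, assuming the strong user performs successive interference cancellation (SIC). The channel gains and powers are illustrative, not the paper's setup:

```python
import math

# Two-user downlink NOMA: the weak user decodes its signal while
# treating the strong user's signal as noise; the strong user cancels
# the weak user's signal via SIC and then decodes interference-free.
def noma_sum_rate(p_total, alpha, g_strong, g_weak, noise=1.0):
    p_weak, p_strong = alpha * p_total, (1 - alpha) * p_total
    r_weak = math.log2(1 + p_weak * g_weak / (p_strong * g_weak + noise))
    r_strong = math.log2(1 + p_strong * g_strong / noise)  # after SIC
    return r_weak + r_strong
```

Sweeping `alpha` in such a model shows the core NOMA tension the joint algorithm navigates: more power for the weak user raises its rate but shrinks the strong user's post-SIC rate.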
5G is poised to support new emerging service types that help realize futuristic applications. These services include enhanced Mobile BroadBand (eMBB), ultra-Reliable Low Latency Communication (uRLLC), and massive Machine-Type Communication (mMTC). Even though these new services enable a variety of new use cases, guaranteeing the Quality of Service (QoS) they demand remains a challenge. Moreover, as a considerable amount of computational resources is introduced into the evolved Radio Access Network (RAN) following the Mobile Edge Computing (MEC) concept, optimizing computational resource allocation alongside radio allocation becomes essential. In this paper, we examine the characteristics of the new 5G services and propose a joint computational and radio resource allocation framework that analyzes the QoS performance of each 5G service individually. The framework is built on per-service load characterization: a computational load distribution algorithm balances the workloads subject to user association constraints, while radio resource allocation performs load-based eMBB-mMTC slicing and uRLLC puncturing. The simulation results show that the proposed solution reduces the packet loss ratio by up to 15% and increases the user data rate by up to 7% for 4G-like services. Furthermore, resource granularity in radio allocation is identified as a crucial factor for effectively allocating services with small data loads; the problem of small granularity is solved by adapting the allocation interval.
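A load distribution step of the kind described can be sketched as a greedy least-loaded placement that respects each user's association constraint. The users, demands, and server names below are hypothetical, and the paper's algorithm is not necessarily this greedy rule:

```python
# Each user's job is placed on the feasible server (those permitted by
# the user association constraint) with the smallest current load.
def distribute_load(jobs, allowed, servers):
    load = {s: 0.0 for s in servers}
    placement = {}
    for user, demand in jobs.items():
        target = min(allowed[user], key=lambda s: load[s])
        placement[user] = target
        load[target] += demand
    return placement, load

# Hypothetical instance: u2 is constrained to server s1.
jobs = {"u1": 2.0, "u2": 3.0, "u3": 1.0}
allowed = {"u1": ["s1", "s2"], "u2": ["s1"], "u3": ["s1", "s2"]}
placement, load = distribute_load(jobs, allowed, ["s1", "s2"])
```

Note how the association constraint forces u2 onto s1 even when s2 is lighter, which is exactly the coupling between user association and load balancing that the framework must handle.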
Recent years have witnessed a remarkable proliferation of compute-intensive applications in smart cities. Such applications continuously generate enormous amounts of data that demand strict latency-aware computational processing capabilities. Although edge computing is an appealing technology for addressing stringent latency requirements, its deployment engenders new challenges. In this article, we highlight the role of edge computing in realizing the vision of smart cities. First, we analyze the evolution of edge computing paradigms. Subsequently, we critically review the state-of-the-art literature on edge computing applications in smart cities. We then categorize and classify the literature by devising a comprehensive and meticulous taxonomy. Furthermore, we identify and discuss key requirements and enumerate recently reported synergies of edge computing-enabled smart cities. Finally, several indispensable open challenges, along with their causes and guidelines, are discussed as future research directions.
Wireless network virtualization has been introduced to satisfy ever-increasing user requirements through resource sharing, and it can reduce network operating costs. The virtualized resources of an infrastructure provider can be allocated as slices to mobile virtual network operators to satisfy their users' demands; thus, an efficient resource allocation method is needed. Existing works have mostly considered resource allocation with a single infrastructure provider in the system model. In realistic and practical environments, however, multiple infrastructure providers should be considered, so that a mobile virtual network operator can choose the appropriate infrastructure provider to maximize its revenue. Therefore, in this paper, a new approach based on matching theory and auctions is proposed for slice allocation in a system with multiple infrastructure providers. A matching algorithm and an auction serve as distributed methods for solving the user association problem and the slice allocation problem, respectively. To connect the two problems, the user association result is used as input to the auction model, so that the mobile virtual network operator can decide on the appropriate infrastructure provider when submitting its bid. Simulation results show that the developed solutions achieve stable matching and maximize the social welfare of all bidders.
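As a minimal illustration of the auction component, the sketch below runs a sealed-bid second-price (Vickrey) auction for a single slice; the paper's auction model is richer, and the bidder names and values are hypothetical:

```python
# Second-price auction: the highest bidder wins the slice but pays the
# second-highest bid, which makes truthful bidding a dominant strategy.
def second_price_auction(bids):
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1] if len(ranked) > 1 else ranked[0][1]
    return winner, price

# Hypothetical bids from three mobile virtual network operators.
winner, price = second_price_auction({"mvno1": 5.0, "mvno2": 3.0, "mvno3": 4.0})
```

The truthfulness property is what lets an auction-based slice allocator reason about social welfare directly from the submitted bids.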
The dense and pervasive deployment of wireless small cells can boost the performance of existing macrocellular networks; however, it poses significant challenges for cross-tier interference management. In this letter, the downlink resource allocation problem for an underlay small cell network is studied. In this network, the macrocell tier is protected by imposing cross-tier interference constraints in the resource allocation problem. To solve the underlying mixed-integer resource allocation problem, we propose two algorithms. The first is developed by applying a duality-based optimization approach to the relaxed problem, which enables distributed implementation. The second distributed algorithm, which enables coordination, is devised based on matching theory. Simulation results show that the proposed duality-based algorithm outperforms the greedy approach by 4% in terms of sum-rate, whereas the matching-based algorithm with tier coordination yields performance gains of up to 17% over the duality-based approach.
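A duality-based approach of this flavor can be sketched as an interference-price subgradient loop: each small cell maximizes its own rate minus a price on the cross-tier interference it causes, and the shared price rises until the interference cap is met. All gains, power limits, and the cap below are illustrative numbers, and the one-variable rate model is a simplification of the letter's mixed-integer problem:

```python
import math

# Closed-form maximizer of log2(1 + p*g) - lam * p * g_cross on [0, p_max].
def best_power(lam, g, g_cross, p_max):
    if lam * g_cross == 0:
        return p_max
    p = 1.0 / (math.log(2) * lam * g_cross) - 1.0 / g
    return min(max(p, 0.0), p_max)

def dual_allocate(gains, cross_gains, p_max, i_max, step=0.05, iters=500):
    lam = 1.0  # dual variable: the price on cross-tier interference
    for _ in range(iters):
        powers = [best_power(lam, g, gc, p_max)
                  for g, gc in zip(gains, cross_gains)]
        interference = sum(p * gc for p, gc in zip(powers, cross_gains))
        lam = max(1e-9, lam + step * (interference - i_max))  # subgradient
    return powers, interference

powers, interference = dual_allocate([2.0, 1.5], [0.5, 0.3],
                                     p_max=5.0, i_max=1.0)
```

Because each cell's power update depends only on its own channels and the broadcast price, the loop admits the distributed implementation the letter highlights.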