Cooperation between the fog and the cloud in mobile cloud computing environments can offer improved offloading services to smart mobile user equipment (UE) with computation-intensive tasks. In this paper, we tackle the computation offloading problem in a mixed fog/cloud system by jointly optimizing the offloading decisions and the allocation of computation resources, transmit power, and radio bandwidth while guaranteeing user fairness and a maximum tolerable delay. The optimization problem is formulated to minimize the maximum weighted cost of delay and energy consumption (EC) among all UEs, which yields a mixed-integer non-linear program. Due to the NP-hardness of the problem, we propose a low-complexity suboptimal algorithm to solve it, in which the offloading decisions are obtained via semidefinite relaxation and randomization, and the resource allocation is obtained using fractional programming theory and Lagrangian dual decomposition. Simulation results verify the convergence of the proposed algorithm, the fairness it achieves among UEs, and its gains over existing algorithms in terms of delay, EC, and the number of beneficial UEs.
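The semidefinite relaxation step above produces a generally non-binary solution, so the binary offloading decisions must be recovered by randomization. The sketch below illustrates only that recovery step on an invented cost model: the identity covariance stands in for an actual SDR solution, and the channel gains and per-UE costs are hypothetical placeholders, not the paper's formulation.

```python
import numpy as np

def randomization_round(X_relaxed, cost_fn, num_samples=200, seed=0):
    """Recover binary offloading decisions from a relaxed PSD solution
    X_relaxed (n x n) by Gaussian randomization: draw candidate vectors
    from N(0, X_relaxed), quantize to {0, 1}, and keep the lowest-cost one."""
    rng = np.random.default_rng(seed)
    n = X_relaxed.shape[0]
    best_x, best_cost = None, np.inf
    for _ in range(num_samples):
        xi = rng.multivariate_normal(np.zeros(n), X_relaxed)
        x = (xi >= 0).astype(int)          # 1 = offload, 0 = compute locally
        c = cost_fn(x)
        if c < best_cost:
            best_x, best_cost = x, c
    return best_x, best_cost

# Toy weighted delay+energy cost: offloading pays off for UEs with good channels.
gains = np.array([0.2, 1.5, 0.9, 0.1])     # hypothetical channel gains
cost = lambda x: np.sum(x * (1.0 / gains) + (1 - x) * 2.0)

X = np.eye(4)                              # stand-in for a real SDR solution
x_opt, c_opt = randomization_round(X, cost)
```

With enough samples, the rounded decision is at least as good as the trivial all-local or all-offload choices on the same cost function.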
The proliferation of smart vehicular terminals (VTs) and their resource-hungry applications imposes serious challenges on the processing capabilities of VTs and the delivery of vehicular services. Mobile Edge Computing (MEC) offers a promising paradigm to solve this problem by offloading VT applications to proximal MEC servers, while TV white space (TVWS) bands can supplement the bandwidth for computation offloading. In this paper, we consider a cognitive vehicular network that uses the TVWS band and formulate a dual-side optimization problem to minimize the costs of the VTs and the MEC server simultaneously. Specifically, the dual-side cost minimization is achieved by jointly optimizing the offloading decision and local CPU frequency on the VT side, and the radio resource allocation and server provisioning on the server side, while guaranteeing network stability. Based on Lyapunov optimization, we design an algorithm called DDORV to tackle the joint optimization problem, which requires only current system states, such as channel states and traffic arrivals. The closed-form solution to the VT-side problem is obtained by simple derivation and comparison of two values. For the MEC server side, we first obtain the server provisioning independently, and then devise an iterative algorithm based on continuous relaxation and Lagrangian dual decomposition for joint radio resource and power allocation. Simulation results demonstrate that DDORV converges quickly, balances the cost-delay tradeoff flexibly, and achieves larger cost reductions than existing schemes.
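The Lyapunov-based design reduces the long-term problem to a per-slot "drift-plus-penalty" decision, and the control parameter V trades cost against queue backlog (delay). A minimal sketch of that mechanism, with invented two-action costs and service rates rather than DDORV's actual decision variables:

```python
def drift_plus_penalty_action(Q, V, cost, service):
    """Pick the action minimizing the drift-plus-penalty term
    V * cost(a) - Q * service(a): larger V weighs cost more heavily,
    at the price of a longer queue (higher delay)."""
    return min(cost, key=lambda a: V * cost[a] - Q * service[a])

def run(V, slots=500, arrival=1.0):
    """Simulate a single queue with constant arrivals and report the
    time-averaged backlog for a given tradeoff parameter V."""
    cost = {"idle": 0.0, "offload": 1.0}      # illustrative per-slot costs
    service = {"idle": 0.0, "offload": 3.0}   # illustrative service rates
    Q, total_Q = 0.0, 0.0
    for _ in range(slots):
        a = drift_plus_penalty_action(Q, V, cost, service)
        Q = max(Q - service[a], 0.0) + arrival
        total_Q += Q
    return total_Q / slots
```

Running `run(1.0)` versus `run(50.0)` shows the cost-delay tradeoff the abstract mentions: the queue stays bounded in both cases, but the average backlog grows roughly like V/3 here, since the controller only serves once V * cost no longer dominates Q * service.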
Femtocells are considered a promising technique for improving the capacity and coverage of indoor wireless users. However, cross-tier interference in spectrum-sharing deployments of femtocells can seriously degrade system performance. We investigate the resource allocation problem in both the uplink and the downlink of two-tier networks comprising spectrum-sharing femtocells and macrocells. A resource allocation scheme for co-channel femtocells is proposed that maximizes the capacity of both delay-sensitive and delay-tolerant users, subject to the delay-sensitive users' quality-of-service constraint and an interference constraint imposed by the macrocell. The subchannel and power allocation problem is modeled as a mixed-integer program, transformed into a convex optimization problem by relaxing subchannel sharing, and solved by the dual decomposition method. An iterative subchannel and power allocation algorithm that accounts for heterogeneous services and cross-tier interference is then proposed, using subgradient updates. A practical low-complexity distributed subchannel and power allocation algorithm is also developed to reduce the computational cost. The complexity of the proposed algorithms is analyzed, and their effectiveness is verified by simulations.
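The core of the dual decomposition with subgradient updates can be sketched on a stripped-down version of the problem: femto users maximize sum rate under a single macrocell interference cap, with no subchannel assignment, heterogeneous services, or QoS constraints. The KKT conditions give a water-filling-style power, and a subgradient iteration drives the multiplier to the constraint boundary. All numbers below are illustrative.

```python
import numpy as np

def subgradient_power_allocation(g, h, i_th, iters=2000):
    """Dual decomposition for: maximize sum_k log2(1 + g_k * p_k)
    subject to sum_k h_k * p_k <= i_th (cross-tier interference cap).
    KKT gives p_k = [1/(lam * h_k * ln 2) - 1/g_k]^+, and a diminishing-step
    subgradient update drives the multiplier lam to the constraint boundary."""
    lam = 1.0
    for t in range(1, iters + 1):
        p = np.maximum(1.0 / (lam * h * np.log(2)) - 1.0 / g, 0.0)
        lam = max(lam + (0.5 / np.sqrt(t)) * (h @ p - i_th), 1e-9)
    return p

g = np.array([2.0, 1.0, 0.5])   # illustrative femto-user channel gains
h = np.array([0.5, 0.2, 1.0])   # interference gains toward the macrocell
p = subgradient_power_allocation(g, h, i_th=1.0)
```

At convergence the weakest user (smallest gain) may be shut off entirely, and the interference constraint is met with near equality, which is the behavior the dual method is chosen for.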
For the past decades, satellites have largely been designed as application-specific and isolated. Although this design has certain benefits, it can lead to resource underutilization and limit satellite applications. As an emerging networking technology, software-defined networking has recently been introduced into satellite networks. In this letter, we propose a software-defined satellite networking (SDSN) architecture, which simplifies networking among versatile satellites and enables new protocols to be easily tested and deployed. In particular, we propose a seamless handover mechanism based on SDSN and conduct physical-layer simulations, which show significant improvements over the existing hard handover and hybrid handover mechanisms in terms of handover latency, throughput, and users' quality of experience.
Ambient backscatter communication (AmBackCom) has been recognized as a spectrum- and energy-efficient technology for the Internet of Things, as it allows passive backscatter devices (BDs) to modulate their information onto legacy signals, e.g., cellular signals, and reflect them to their associated receivers, while harvesting energy from the legacy signals to power their circuit operation. However, the co-channel interference between the backscatter link and the legacy link, and the nonlinear behavior of the energy harvesters at the BDs, have largely been ignored in performance analyses of AmBackCom. Taking these two aspects into account, this article provides a comprehensive outage performance analysis for an AmBackCom system with multiple backscatter links, where one of the backscatter links is opportunistically selected to leverage the legacy signals transmitted in a given resource block. For any selected backscatter link, we propose an adaptive reflection coefficient (RC), adapted to the nonlinear energy harvesting (EH) model and the location of the selected link, to minimize the outage probability of the backscatter link. To study the impact of co-channel interference on both links, we derive the outage probabilities of the legacy link and the backscatter link for a selected backscatter link. Furthermore, we study the best and worst outage performances of the backscatter system, where the selected backscatter link maximizes or minimizes the signal-to-interference-plus-noise ratio (SINR) at the backscatter receiver, and of the legacy link, where the selected backscatter link causes the lowest or highest co-channel interference at the legacy receiver, respectively. Computer simulations validate our analytical results and reveal the impacts of the co-channel interference and the EH model on AmBackCom performance. In particular, the co-channel interference leads to an outage saturation phenomenon, and the conventional linear EH model overestimates the outage performance of the backscatter link.
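The claim that a linear EH model overestimates performance can be illustrated with a small Monte Carlo experiment. The model below is a toy stand-in, not the article's system model: single backscatter link, Rayleigh fading, a fixed RC splitting incident power between harvesting and reflection, and a sigmoid (saturating) harvester with invented parameters.

```python
import numpy as np

def backscatter_outage(rc, eta, p_sat, p_circ, sinr_th, noise=0.01,
                       p_legacy=1.0, n=200_000, nonlinear=True, seed=1):
    """Monte Carlo outage for one backscatter link in a toy AmBackCom model.
    A fraction (1 - rc) of the incident legacy power feeds the harvester and
    the rest is reflected; outage occurs if the harvested power cannot run
    the circuit or the receiver SINR misses sinr_th. All values illustrative."""
    rng = np.random.default_rng(seed)
    g_sb = rng.exponential(1.0, n)    # legacy transmitter -> backscatter device
    g_bd = rng.exponential(1.0, n)    # backscatter device -> its receiver
    g_sd = rng.exponential(0.3, n)    # direct legacy interference at receiver
    x = (1.0 - rc) * p_legacy * g_sb  # power entering the harvester
    if nonlinear:                     # sigmoid (saturating) harvester model
        sig = lambda u: 1.0 / (1.0 + np.exp(-10.0 * (u - 0.2)))
        harvested = p_sat * (sig(x) - sig(0.0)) / (1.0 - sig(0.0))
    else:
        harvested = eta * x           # idealized linear model
    sinr = rc * p_legacy * g_sb * g_bd / (p_legacy * g_sd + noise)
    return float(((harvested < p_circ) | (sinr < sinr_th)).mean())

p_lin = backscatter_outage(0.5, 0.5, 0.1, 0.02, 1.0, nonlinear=False)
p_non = backscatter_outage(0.5, 0.5, 0.1, 0.02, 1.0, nonlinear=True)
```

With these parameters the linear model never harvests less than the saturating one, so it reports a lower outage probability, i.e., an optimistic (overestimated) performance.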
Cognitive small cell networks have been envisioned as a promising technique for meeting the exponentially increasing mobile traffic demand. Many technological issues pertaining to cognitive small cell networks have recently been studied, including resource allocation and interference mitigation, but most studies assume non-cooperative schemes or perfect channel state information (CSI). Different from existing works, we investigate the joint uplink subchannel and power allocation problem in cognitive small cells using cooperative Nash bargaining game theory, considering cross-tier interference mitigation, a minimum outage probability requirement, imperfect CSI, and fairness in terms of a minimum rate requirement. A unified analytical framework is proposed for the optimization problem, in which the near-optimal cooperative bargaining resource allocation strategy is derived via Lagrangian dual decomposition by introducing time-sharing variables and invoking the Lambert-W function. The existence, uniqueness, and fairness of the solution to this game model are proved. A cooperative Nash bargaining resource allocation algorithm is developed and shown to converge to a Pareto-optimal equilibrium for the cooperative game. Simulation results verify the effectiveness of the proposed cooperative game algorithm for efficient and fair resource allocation in cognitive small cell networks.
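Closed-form strategies of this kind rely on evaluating the Lambert-W function, which solves w * e^w = z. Where a library routine is unavailable, the principal branch can be computed for z >= 0 with a few Newton iterations; this is a generic numerical sketch, not the paper's derivation.

```python
import math

def lambert_w(z, tol=1e-12):
    """Principal branch W0 of the Lambert-W function for z >= 0:
    returns w such that w * exp(w) = z, via Newton's method."""
    w = math.log1p(z)                    # reasonable initial guess for z >= 0
    for _ in range(100):
        ew = math.exp(w)
        w_next = w - (w * ew - z) / (ew * (w + 1.0))
        if abs(w_next - w) < tol:
            return w_next
        w = w_next
    return w
```

For example, `lambert_w(math.e)` returns 1, since 1 * e^1 = e; the defining identity w * e^w = z holds to machine precision across the nonnegative axis.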
Mobile-edge computing (MEC) is a promising paradigm for improving the computation experience of mobile devices, since it allows them to offload computing tasks to MEC servers and thus benefit from the servers' powerful computing resources. However, existing computation-offloading works leave several open issues: 1) security and privacy; 2) cooperative computation offloading; and 3) dynamic optimization. To address the security and privacy issues, we employ blockchain technology, which ensures the reliability and irreversibility of data in MEC systems, and we jointly design and optimize the performance of the blockchain and the MEC system. In this article, we develop a cooperative computation offloading and resource allocation framework for blockchain-enabled MEC systems. Within this framework, we design a multiobjective function that maximizes the computation rate of the MEC system and the transaction throughput of the blockchain system by jointly optimizing the offloading decision, power allocation, block size, and block interval. Due to the dynamics of the wireless fading channel and the processing queues at the MEC servers, the joint optimization is formulated as a Markov decision process (MDP). To tackle the dynamics and complexity of the blockchain-enabled MEC system, we develop an asynchronous advantage actor-critic-based cooperative computation offloading and resource allocation algorithm to solve the MDP. In the algorithm, deep neural networks are optimized using asynchronous gradient descent while the correlation of training data is reduced. Simulation results show that the proposed algorithm converges quickly and achieves significant improvements in total reward over existing schemes.
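The multiobjective coupling of block size and block interval can be made concrete with a toy static version of the objective: transaction throughput grows with block size and shrinks with block interval, but the block must also propagate within the interval. The weighted reward, the link-capacity feasibility rule, and all numbers below are illustrative assumptions, not the article's MDP.

```python
import itertools

def best_block_params(sizes, intervals, comp_rate, w1=0.5, w2=0.5, link_cap=4.0):
    """Grid search over block size and block interval for a toy multiobjective
    reward: w1 * computation rate + w2 * transaction throughput, where
    throughput = size / interval and a configuration is infeasible if the
    block cannot propagate within its interval (throughput > link capacity)."""
    best, best_r = None, float("-inf")
    for s, t in itertools.product(sizes, intervals):
        tput = s / t
        if tput > link_cap:
            continue
        r = w1 * comp_rate + w2 * tput
        if r > best_r:
            best, best_r = (s, t), r
    return best, best_r

best, reward = best_block_params([1, 2, 4, 8], [1, 2], comp_rate=10.0)
```

The search lands on the largest block size whose throughput still fits the link capacity, which is why block size and interval must be optimized jointly rather than maximized independently.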
Energy efficiency in wireless networks has recently become an important objective. Aside from the growing proliferation of smartphones and other high-end devices in conventional human-to-human (H2H) communication, the introduction of machine-to-machine (M2M) communication, or machine-type communication, into cellular networks is another contributing factor. In this paper, we investigate a quality-of-service (QoS)-driven energy-efficient design for the uplink of Long Term Evolution (LTE) networks in M2M/H2H coexistence scenarios. We formulate the resource allocation problem as a maximization of effective-capacity-based bits-per-joule capacity under statistical QoS provisioning. The specific power and resource block allocation constraints of single-carrier frequency-division multiple access, the uplink air interface of LTE networks, not only complicate the resource allocation problem but also render standard Lagrangian duality techniques inapplicable. We overcome the analytical and computational intractability by first transforming the original problem into a mixed-integer programming (MIP) problem and then formulating its dual problem using canonical duality theory. The proposed energy-efficient design is compared with a spectral-efficient design as well as round-robin (RR) and best channel quality indicator (BCQI) algorithms. Numerical results, obtained using the invasive weed optimization (IWO) algorithm, show that the proposed energy-efficient uplink design not only outperforms the other algorithms in terms of energy efficiency while satisfying the QoS requirements, but also performs close to the optimal design.
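The effective capacity underpinning this formulation is the standard log-moment-generating-function quantity: for QoS exponent theta, EC(theta) = -(1/theta) * ln E[exp(-theta * R)], where R is the per-block service rate. A minimal estimator from rate samples (the sample values below are illustrative):

```python
import math

def effective_capacity(theta, rates):
    """Effective capacity for QoS exponent theta, estimated from per-block
    service-rate samples: EC = -(1/theta) * ln( mean of exp(-theta * R) ).
    As theta -> 0 it approaches the mean rate (loose QoS); as theta grows
    it approaches the worst-case sample (stringent QoS)."""
    m = sum(math.exp(-theta * r) for r in rates) / len(rates)
    return -math.log(m) / theta

rates = [1.0, 2.0, 3.0]                      # illustrative per-block rates
ec_loose = effective_capacity(1e-6, rates)   # ~ mean rate
ec_tight = effective_capacity(50.0, rates)   # ~ minimum rate
```

The monotone decrease of EC in theta is exactly the price of statistical QoS provisioning: the stricter the delay-violation exponent, the less capacity can be guaranteed, which is what the bits-per-joule objective trades against transmit energy.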
With the introduction of femtocells, cellular networks are moving from the conventional centralized network architecture to a distributed one, in which each network cell makes its own radio resource allocation decisions while providing inter-cell interference mitigation. However, realizing such a distributed network architecture is not a trivial task. In this paper, we first introduce a simple self-organization rule, based on minimizing cell transmit power, under which a distributed cellular network converges to an efficient resource reuse pattern. Based on this self-organization rule and taking realistic resource allocation constraints into account, we propose two novel resource allocation algorithms, one autonomous and one coordinated. The performance of the proposed self-organization rule and resource allocation algorithms is evaluated using system-level simulations, which show that power efficiency is not necessarily in conflict with capacity improvements at the network level. The proposed algorithms provide significant improvements in user outage and network capacity over state-of-the-art resource allocation algorithms in the literature.
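The flavor of the minimize-transmit-power self-organization rule can be sketched in a few lines: each cell repeatedly switches to the channel on which it needs the least power to hit an SNR target, given the interference it currently sees. The two-cell topology, gains, and targets below are invented for illustration; this is not the paper's algorithm or constraint set.

```python
def required_power(i, ch, assign, powers, gain, cross, noise, snr_t):
    """Transmit power cell i needs on channel ch to reach snr_t, given the
    co-channel interference it receives from the other cells on ch."""
    interf = sum(powers[j] * cross[j][i]
                 for j in range(len(assign)) if j != i and assign[j] == ch)
    return snr_t * (noise + interf) / gain[i]

def self_organize(gain, cross, n_ch=2, noise=1e-3, snr_t=10.0, rounds=20):
    """Each cell greedily moves to its minimum-power channel until no cell
    wants to move: the network settles into a low-power reuse pattern."""
    n = len(gain)
    assign = [0] * n
    powers = [snr_t * noise / gain[i] for i in range(n)]
    for _ in range(rounds):
        changed = False
        for i in range(n):
            costs = [required_power(i, c, assign, powers, gain, cross, noise, snr_t)
                     for c in range(n_ch)]
            best = min(range(n_ch), key=costs.__getitem__)
            changed |= (best != assign[i])
            assign[i], powers[i] = best, costs[best]
        if not changed:
            break
    return assign, powers

# Two strongly interfering cells and two channels: they should split.
assign, powers = self_organize(gain=[1.0, 1.0], cross=[[0.0, 0.5], [0.5, 0.0]])
```

The fixed point here is the orthogonal reuse pattern, and each cell ends at its interference-free minimum power, illustrating how the power-minimizing rule also serves capacity.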
In this letter, we investigate the relay selection (RS) problem in cooperative non-orthogonal multiple access networks, where a base station communicates with two paired users of different priorities through multiple relays. A novel two-stage RS scheme is proposed to satisfy the quality-of-service requirements of both users by exploiting instantaneous channel state information in the user ordering. We derive a closed-form expression for the outage probability and obtain the diversity order of the proposed scheme. Simulation results show that the proposed RS scheme achieves a lower outage probability than existing RS schemes.
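The two-stage structure (first shortlist the relays meeting the priority user's QoS, then pick the best of those for the second user) can be checked by simulation on a simplified stand-in that ignores the NOMA superposition and SIC rate expressions and treats the two links as independent Rayleigh channels with invented parameters:

```python
import numpy as np

def outage_two_stage(num_trials, num_relays, snr, r1, r2, seed=3):
    """Monte Carlo outage of a toy two-stage relay selection versus a
    random-relay baseline. Stage 1 shortlists relays meeting user 1's
    target rate r1; stage 2 picks the shortlisted relay with the best
    channel to user 2. Outage: no relay qualifies, or user 2 misses r2."""
    rng = np.random.default_rng(seed)
    g1 = rng.exponential(1.0, (num_trials, num_relays))  # relay -> user 1
    g2 = rng.exponential(1.0, (num_trials, num_relays))  # relay -> user 2
    rate1 = np.log2(1 + snr * g1)
    rate2 = np.log2(1 + snr * g2)
    qual = rate1 >= r1                                   # stage 1 shortlist
    best = np.where(qual, g2, -np.inf).argmax(axis=1)    # stage 2 pick
    rows = np.arange(num_trials)
    outage = ~qual.any(axis=1) | (rate2[rows, best] < r2)
    # Baseline: pick one relay uniformly at random.
    rand = rng.integers(0, num_relays, num_trials)
    out_rand = (rate1[rows, rand] < r1) | (rate2[rows, rand] < r2)
    return outage.mean(), out_rand.mean()

p_two, p_rand = outage_two_stage(100_000, 4, snr=5.0, r1=1.0, r2=1.0)
```

In this toy model the two-stage rule succeeds whenever any single relay could serve both users, so its outage probability can never exceed the random baseline's, mirroring the gains reported in the letter.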