To support massive access for future wireless networks, we propose a novel reconfigurable intelligent surface (RIS)-enhanced downlink multi-user multi-input single-output (MU-MISO) symbiotic radio (SR) system. In the proposed system, each RIS not only enhances the primary transmission from the primary transmitter (PT) to the associated primary receiver (PR) nearby, but also acts as an Internet-of-Things (IoT) device to enable IoT transmissions to the same PR. Therefore, each PR needs to jointly decode the information from both the PT and its corresponding RIS. We are interested in maximizing the weighted sum-rate of both primary and IoT transmissions by jointly designing the active transmit beamforming at the PT and the passive beamforming at each RIS under the maximum transmit power constraint at the PT and various constraints on the reflection coefficients (RCs), covering the ideal, continuous-phase, and discrete-phase cases. The formulated problem is non-convex and cannot be solved directly. Thus, the fractional programming (FP) method and the alternating optimization (AO) technique are adopted to tackle it. In particular, three low-complexity algorithms are proposed to trade off computational complexity against convergence rate. Simulation results demonstrate that, compared with various benchmark schemes, with the aid of the RISs the PRs can benefit from the enhanced primary transmission from the PT while also receiving information from the associated RISs via IoT transmission.
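As a rough illustration of the AO principle invoked above, the following sketch alternates between a matched-filter update of the active beamformer and a phase-alignment update of the unit-modulus reflection coefficients for a toy single-link objective |θᴴHw|²; the channel `H` and both update rules are illustrative stand-ins, not the paper's FP-based updates:

```python
import numpy as np

rng = np.random.default_rng(0)
M, N = 4, 8                                   # PT antennas, RIS elements (hypothetical)
H = rng.standard_normal((N, M)) + 1j * rng.standard_normal((N, M))

w = np.ones(M, dtype=complex) / np.sqrt(M)    # active beamformer, ||w|| <= 1
theta = np.ones(N, dtype=complex)             # passive RCs, |theta_i| = 1

def objective(w, theta):
    # toy surrogate for the sum-rate objective
    return np.abs(theta.conj() @ (H @ w)) ** 2

prev = -np.inf
for _ in range(50):
    # Step 1: fix theta; the optimal w is the matched filter to H^H theta
    g = H.conj().T @ theta
    w = g / np.linalg.norm(g)
    # Step 2: fix w; the optimal unit-modulus theta aligns all element phases
    v = H @ w
    theta = v / np.abs(v)
    cur = objective(w, theta)
    if cur - prev < 1e-9:                     # AO is monotone, so stop at stagnation
        break
    prev = cur
```

Each step solves its subproblem exactly, so the objective is non-decreasing across iterations, which is the convergence argument AO-based designs typically rely on.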
A cognitive radio network (CRN) is formed either by allowing the secondary users (SUs) in a secondary communication network (SCN) to opportunistically operate in the frequency bands originally allocated to a primary communication network (PCN), or by allowing the SCN to coexist with the primary users (PUs) in the PCN as long as the interference caused by the SCN to each PU is properly regulated. In this paper, we consider the latter case, known as spectrum sharing, and study the optimal power allocation strategies to achieve the ergodic capacity and the outage capacity of the SU fading channel under different types of power constraints and fading channel models. In particular, besides the interference power constraint at the PU, the transmit power constraint of the SU is also considered. Since the transmit power and the interference power can each be limited by either a peak or an average constraint, various combinations of power constraints are studied. It is shown that there is a capacity gain for the SU under an average transmit/interference power constraint relative to its peak counterpart. It is also shown that fading of the channel between the SU transmitter and the PU receiver is usually a beneficial factor for enhancing the SU channel capacities.
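The kind of power allocation studied here can be illustrated with a minimal sketch, assuming Rayleigh-fading gains, an average transmit power constraint, and a peak interference power constraint; the truncated water-filling form and the bisection on the water level are standard, but all parameter values below are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(1)
K = 10000
g = rng.exponential(1.0, K)          # SU-Tx -> SU-Rx fading power gains (Rayleigh)
h = rng.exponential(1.0, K)          # SU-Tx -> PU-Rx fading power gains
P_avg, Q_peak, N0 = 1.0, 0.5, 1.0    # average power budget, peak interference cap, noise

def alloc(lam):
    # water-filling level 1/lam, truncated by the peak interference cap Q/h
    return np.minimum(np.maximum(1.0 / lam - N0 / g, 0.0), Q_peak / h)

# bisection on the water level lam so that E[p] meets the average power budget
lo, hi = 1e-6, 1e3
for _ in range(100):
    lam = 0.5 * (lo + hi)
    if alloc(lam).mean() > P_avg:    # average power decreases as lam grows
        lo = lam
    else:
        hi = lam
p = alloc(lam)
ergodic_rate = np.mean(np.log2(1.0 + g * p / N0))
```

By construction every realization satisfies p·h ≤ Q_peak, while the average power constraint is met through the bisection, mirroring the peak/average constraint combinations the paper studies.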
In this paper, a novel reconfigurable intelligent surface (RIS)-assisted multiple-input multiple-output (MIMO) symbiotic radio (SR) system is proposed, in which an RIS, operating as a secondary transmitter (STx), sends messages to a multi-antenna secondary receiver (SRx) using cognitive backscattering communication and simultaneously enhances the primary transmission from a multi-antenna primary transmitter (PTx) to a multi-antenna primary receiver (PRx) by intelligently reconfiguring the wireless environment. We are interested in the joint design of the active transmit beamformer at the PTx and the passive reflecting beamformer at the STx to minimize the total transmit power at the PTx, subject to the signal-to-noise-ratio (SNR) constraint for the secondary transmission and the rate constraint for the primary transmission. Due to the non-convexity of the formulated problem, we decouple the original problem into a series of subproblems using the alternating optimization method and then iteratively solve them. The convergence behavior and computational complexity of the proposed algorithm are analyzed. Furthermore, we develop a low-complexity algorithm that designs the reflecting beamformer by solving a backscatter link enhancement problem through the semi-definite relaxation (SDR) technique. Then, theoretical analysis is performed to reveal insights into the proposed system. Finally, simulation results are presented to validate the effectiveness of the proposed algorithms and the superiority of the proposed system.
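The SDR step mentioned above typically ends with extracting a unit-modulus reflecting beamformer from the relaxed matrix solution; the sketch below shows the standard Gaussian randomization procedure on a synthetic PSD matrix `V`, a stand-in for an actual SDR solver output:

```python
import numpy as np

rng = np.random.default_rng(5)
N = 8                                          # RIS elements (hypothetical)
# stand-in for an SDR solution: V close to a rank-one theta theta^H plus a perturbation
t_true = np.exp(1j * rng.uniform(0, 2 * np.pi, N))
V = np.outer(t_true, t_true.conj()) + 0.05 * np.eye(N)

def extract_rc(V, trials=100):
    # Gaussian randomization: draw candidates from CN(0, V), project each onto
    # the unit-modulus set, and keep the one maximizing the quadratic form v^H V v
    eigval, eigvec = np.linalg.eigh(V)
    L = eigvec @ np.diag(np.sqrt(np.maximum(eigval, 0.0)))
    best, best_val = None, -np.inf
    for _ in range(trials):
        z = rng.standard_normal(N) + 1j * rng.standard_normal(N)
        r = L @ z / np.sqrt(2)
        v = np.exp(1j * np.angle(r))           # enforce |v_i| = 1
        val = np.real(v.conj() @ V @ v)
        if val > best_val:
            best, best_val = v, val
    return best

theta = extract_rc(V)
```

When the relaxed solution happens to be (near) rank-one, this projection recovers an essentially optimal reflecting beamformer; otherwise randomization gives a good feasible approximation.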
It is widely acknowledged that network slicing can tackle the diverse use cases and connectivity services of the forthcoming next-generation mobile networks (5G). Resource scheduling is of vital importance for improving the resource-multiplexing gain among slices while meeting specific service requirements for radio access network (RAN) slicing. Unfortunately, due to performance isolation, diversified service requirements, and network dynamics (including user mobility and channel states), resource scheduling in RAN slicing is very challenging. In this paper, we propose an intelligent resource scheduling strategy (iRSS) for 5G RAN slicing. The main idea of iRSS is to exploit a collaborative learning framework that combines deep learning (DL) with reinforcement learning (RL). Specifically, DL is used to perform large-timescale resource allocation, whereas RL is used to perform online resource scheduling to tackle small-timescale network dynamics, including inaccurate predictions and unexpected network states. Depending on the amount of available historical traffic data, iRSS can flexibly adjust the relative weight of the prediction and online decision modules when assisting the RAN in making resource scheduling decisions. Numerical results show that iRSS converges fast enough for online resource scheduling and can significantly improve resource utilization while guaranteeing performance isolation between slices, compared with other benchmark algorithms.
Non-Terrestrial Networks (NTNs) composed of space-borne (e.g., satellites) and airborne vehicles (e.g., drones and blimps) have recently been proposed by 3GPP as a new paradigm of infrastructure to enhance the capacity and coverage of existing terrestrial wireless networks. The mobility of non-terrestrial base stations (NT-BSs), however, leads to a dynamic environment, which imposes unique challenges for handover and throughput optimization, particularly in multi-user access control for NTNs. To achieve performance optimization, each terrestrial user equipment (UE) should autonomously estimate the dynamics of the moving NT-BSs, which differs from existing user access control schemes in terrestrial wireless networks. Consequently, new learning schemes for optimal multi-user access control are desired. In this article, we therefore propose a UE-driven deep reinforcement learning (DRL) based scheme, in which a centralized agent deployed at the backhaul side of the NT-BSs is responsible for training the parameters of a deep Q-network (DQN), and each UE independently makes its own access decisions based on the parameters of the trained DQN. With the proposed scheme, each UE is able to access a proper NT-BS intelligently, enhancing the long-term system throughput and avoiding frequent handovers among NT-BSs. Through comprehensive simulation studies, we evaluate the performance of the proposed scheme and show its effectiveness in addressing the fundamental issues in NTN deployment.
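The UE-side decision rule described above can be sketched as a forward pass through a small Q-network followed by an epsilon-greedy choice; the network size, state features, and random parameters below are all hypothetical stand-ins for those broadcast by the trained central agent:

```python
import numpy as np

rng = np.random.default_rng(2)

# hypothetical per-UE state: a few features per candidate NT-BS (e.g., RSRP,
# elevation angle, load), flattened into one vector
N_BS, STATE_DIM, HIDDEN = 3, 9, 16

# DQN parameters received from the centrally trained agent (random stand-ins here)
W1 = rng.standard_normal((HIDDEN, STATE_DIM)) * 0.1
b1 = np.zeros(HIDDEN)
W2 = rng.standard_normal((N_BS, HIDDEN)) * 0.1
b2 = np.zeros(N_BS)

def q_values(state):
    # two-layer Q-network forward pass: one Q-value per candidate NT-BS
    hdn = np.maximum(W1 @ state + b1, 0.0)    # ReLU hidden layer
    return W2 @ hdn + b2

def choose_bs(state, eps=0.1):
    # epsilon-greedy access decision made locally at the UE
    if rng.random() < eps:
        return int(rng.integers(N_BS))        # explore a random NT-BS
    return int(np.argmax(q_values(state)))    # exploit the learned Q-values

state = rng.standard_normal(STATE_DIM)
action = choose_bs(state, eps=0.0)            # greedy pick for one sample state
```

The split matches the abstract's architecture: training happens centrally, while inference (the access decision) runs independently at each UE from the shared parameters.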
Tag signal detection is one of the key tasks in ambient backscatter communication (AmBC) systems. However, obtaining perfect channel state information (CSI) is challenging and costly, which makes AmBC systems suffer from a high bit error rate (BER). To eliminate the requirement of channel estimation and to improve system performance, in this paper we adopt a deep transfer learning (DTL) approach that implicitly extracts channel features and directly recovers tag symbols. To this end, we develop a DTL detection framework consisting of offline learning, transfer learning, and online detection. Specifically, a DTL-based likelihood ratio test (DTL-LRT) is derived based on the minimum error probability (MEP) criterion. As a realization of the developed framework, we then apply convolutional neural networks (CNNs) to intelligently explore the features of the sample covariance matrix, which facilitates the design of a CNN-based algorithm for tag signal detection. Exploiting the powerful capability of CNNs in extracting features from data in matrix form, the proposed method is able to further improve system performance. In addition, an asymptotic explicit expression is derived to characterize the properties of the proposed CNN-based method when the number of samples is sufficiently large. Finally, extensive simulation results demonstrate that the BER performance of the proposed method is comparable to that of the optimal detection method with perfect CSI.
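The sample covariance matrix that serves as the CNN input can be formed as follows; the frame dimensions are arbitrary, and stacking the real and imaginary parts into channels is one common convention rather than necessarily the paper's exact preprocessing:

```python
import numpy as np

rng = np.random.default_rng(3)
M, N = 4, 200                    # receive antennas, samples per detection frame

# hypothetical received frame: complex baseband samples (tag signal plus noise)
Y = rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N))

# sample covariance matrix of the frame; Hermitian, so it summarizes the
# second-order statistics the detector needs without explicit CSI
R = (Y @ Y.conj().T) / N

# feed real and imaginary parts to the CNN as two input channels
cnn_input = np.stack([R.real, R.imag])       # shape (2, M, M)
```

Because the covariance is invariant to the unknown carrier phase and aggregates N samples, it is a natural fixed-size matrix input from which a CNN can learn channel features implicitly.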
Reconfigurable intelligent surface (RIS) is a revolutionary technology for achieving spectrum-, energy-, and cost-efficient wireless networks. This paper considers an RIS-assisted downlink non-orthogonal multiple access (NOMA) system. To optimize the rate performance and ensure user fairness, we maximize the minimum decoding signal-to-interference-plus-noise ratio (equivalently, the rate) of all users by jointly optimizing the (active) transmit beamforming at the base station (BS) and the phase shifts (i.e., passive beamforming) at the RIS. A combined-channel-strength based user-ordering scheme for NOMA decoding is first proposed to decouple the user-ordering design from the joint beamforming design. Efficient algorithms are further proposed to solve the non-convex problem by leveraging the block coordinate descent and semidefinite relaxation (SDR) techniques. For the single-antenna BS setup, the optimal power allocation at the BS and the asymptotically optimal phase shifts at the RIS are obtained in closed form. For the multi-antenna BS setup, it is shown that the rank of the SDR solution of the transmit beamforming design is upper bounded by two. The proposed algorithms are also analyzed in terms of convergence and complexity. Simulation results show that the RIS-assisted NOMA system can enhance the rate performance significantly compared with traditional NOMA without RIS and traditional orthogonal multiple access with or without RIS.
In this paper, we are interested in symbiotic radio networks (SRNs), in which an Internet-of-Things (IoT) network parasitizes a primary cellular network to achieve spectrum-, energy-, and infrastructure-efficient communications. Each IoT device transmits its own information by backscattering the signals from the primary network without using an active radio-frequency (RF) transmit chain. We consider the symbiosis between the cellular network and the IoT network and focus on the user association problem in SRNs. Specifically, the base station (BS) in the primary network serves multiple cellular users using time division multiple access (TDMA), and each IoT device is associated with one cellular user for information transmission. The objective of user association is to link each IoT device to an appropriate cellular user so as to maximize the sum rate of all IoT devices. However, the difficulty of obtaining full real-time channel information makes it hard to design an optimal policy for this problem. To overcome this issue, we propose two deep reinforcement learning (DRL) algorithms, both of which use historical information to infer the current state and make appropriate decisions. The first, referred to as centralized DRL, makes decisions for all IoT devices at once using globally available information. The second, referred to as distributed DRL, makes a decision for one IoT device at a time using only locally available information. Finally, simulation results show that the two proposed DRL algorithms achieve performance comparable to the optimal user association policy, which requires perfect real-time information, and that the distributed DRL algorithm has the advantage of scalability.
Internet-of-Things (IoT) is a promising technology to connect massive numbers of machines and devices in future communication networks. In this paper, we study a wireless-powered IoT network (WPIN) with short packet communication (SPC), in which a hybrid access point (HAP) first transmits power to the IoT devices wirelessly, and the devices in turn transmit their short data packets, encoded with finite-blocklength codes, to the HAP using the harvested energy. Unlike long packet communication in conventional wireless networks, SPC suffers from transmission rate degradation and a significant packet error rate. Thus, conventional resource allocation in the existing literature, based on the Shannon capacity achieved by infinite-blocklength codes, is no longer optimal. In this paper, to enhance transmission efficiency and reliability, we first define effective throughput and effective amount of information as performance metrics to balance the transmission rate and the packet error rate, and then jointly optimize the transmission time and packet error rate of each user to maximize the total effective throughput or minimize the total transmission time subject to each user's individual effective-amount-of-information requirement. To overcome the non-convexity of the formulated problems, we develop efficient algorithms to find high-quality suboptimal solutions. Simulation results show that the proposed algorithms achieve performance close to that of the optimal solution obtained via exhaustive search, and outperform the benchmark schemes.
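The finite-blocklength rate degradation mentioned above is commonly quantified by the normal approximation R ≈ C − √(V/n)·Q⁻¹(ε), where C is the Shannon capacity, V the channel dispersion, n the blocklength, and ε the packet error probability; the sketch below evaluates it, along with an effective throughput of the form (1 − ε)·R, for sample parameters (the exact metric definitions in the paper may differ):

```python
import math
from statistics import NormalDist

def achievable_rate(snr, n, eps):
    """Normal approximation to the maximal rate (bits/channel use)
    of an AWGN channel at blocklength n and error probability eps."""
    cap = math.log2(1.0 + snr)                           # Shannon capacity
    disp = (1.0 - 1.0 / (1.0 + snr) ** 2) * math.log2(math.e) ** 2  # dispersion
    q_inv = NormalDist().inv_cdf(1.0 - eps)              # Q^{-1}(eps)
    return cap - math.sqrt(disp / n) * q_inv             # back-off from capacity

# effective throughput of one device: successfully delivered rate
snr, n, eps = 10.0, 200, 1e-3
rate = achievable_rate(snr, n, eps)
eff_tp = (1.0 - eps) * rate
```

The √(V/n) back-off makes the rate strictly smaller than the Shannon capacity at any finite n, which is exactly why infinite-blocklength resource allocation stops being optimal for SPC.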
The rising popularity of wireless services and the resulting spectrum shortage have motivated dynamic spectrum sharing to facilitate efficient usage of the underutilized spectrum. Wideband spectrum sensing is a critical functionality for enabling dynamic spectrum access, as it enhances the opportunities for exploiting spectral holes, but it entails a major implementation challenge in compact commodity radios that have only limited energy and computation capabilities. In contrast to traditional sub-Nyquist approaches, in which a wideband signal or its power spectrum is first reconstructed from compressed samples, this paper proposes a sub-Nyquist wideband spectrum sensing scheme that locates occupied channels blindly by recovering the signal support, based on the jointly sparse nature of multiband signals. Exploiting the common signal support shared among multiple secondary users (SUs), an efficient cooperative spectrum sensing scheme is developed, in which the energy consumed on wideband signal acquisition, processing, and transmission is reduced with a detection performance guarantee. Based on subspace decomposition, a low-dimensional measurement matrix, computed at each SU from local sub-Nyquist samples, is used to reduce the transmission and computation overhead while improving noise robustness. Theoretical analysis of the proposed sub-Nyquist wideband sensing algorithm is derived, verified by numerical analysis, and further tested on real-world TV white space signals. The results show that the proposed scheme achieves good detection performance while reducing computation and implementation complexity, in comparison with conventional cooperative wideband spectrum sensing schemes.
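The cooperative support-recovery idea can be caricatured with a simple energy-fusion sketch: each SU reports only a low-dimensional per-channel statistic, and a fusion center thresholds the average to locate occupied channels. Note that this deliberately replaces the paper's subspace-decomposition machinery with plain energy detection and uses synthetic signals:

```python
import numpy as np

rng = np.random.default_rng(4)
N_CH, N_SU, N_SAMP = 16, 5, 64        # channels, cooperating SUs, samples per SU

true_support = {2, 7, 11}             # hypothetical occupied channels
X = rng.standard_normal((N_SU, N_CH, N_SAMP)) * 0.1       # noise floor everywhere
for ch in true_support:
    X[:, ch, :] += rng.standard_normal((N_SU, N_SAMP))    # active signals on top

# each SU reports one energy value per channel (a low-dimensional statistic),
# so only N_CH numbers per SU cross the network instead of raw samples
energy = (X ** 2).mean(axis=2)        # shape (N_SU, N_CH)

# the fusion center averages the reports and applies a blind threshold
fused = energy.mean(axis=0)
threshold = 3.0 * np.median(fused)    # median tracks the noise floor when sparse
support = set(int(ch) for ch in np.flatnonzero(fused > threshold))
```

The joint sparsity assumption shows up in the fusion step: because all SUs see the same occupied channels, averaging their reports sharpens the support estimate without any SU reconstructing the wideband signal.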