Workshops and Tutorials

Tutorials

Session TUT01

Tutorial: Federated Learning at the Network Edge: Fundamentals, Key Technologies, and Future Trends

Conference: 9:00 AM — 12:15 PM KST · Local: May 24 Sun, 5:00 PM — 8:15 PM PDT

Howard Yang (Singapore University of Technology and Design, Singapore), Zhongyuan Zhao (Beijing University of Posts and Telecommunications, China), Tony Q. S. Quek (Singapore University of Technology and Design, Singapore)

The burgeoning advances in machine learning and wireless technologies are forging a new paradigm for future networks, which are expected to possess higher degrees of intelligence by drawing inference from vast data sets and to respond to local events promptly. Driven by the sheer volume of data generated by end devices, as well as growing concerns about sharing private information, a new branch of machine learning, namely federated learning, has emerged at the intersection of artificial intelligence and edge computing. In contrast to conventional machine learning methods, federated learning brings the model directly to the device for training, and only the resultant parameters are sent to the edge servers. Keeping local copies of the model on the devices brings the great advantages of reduced network latency and preserved data privacy. Nevertheless, to make federated learning possible, one needs to tackle new challenges that require a fundamental departure from the standard methods designed for distributed optimization. In this tutorial, we deliver a comprehensive introduction to federated learning. Specifically, we first survey the basics of federated learning, including its distinctions from conventional machine learning models, the fundamental theory that ensures its successful operation, and the algorithms that enable its effective adoption. We then enumerate several critical issues associated with deploying federated learning in a wireless network, and show how technologies from different perspectives, ranging from algorithmic design and on-device training to communication resource management, must be jointly integrated to facilitate a full implementation. Finally, we conclude by shedding light on future work.
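
As a minimal illustration of the training loop described above (a sketch, not code from the tutorial), the following implements federated averaging for a linear least-squares model in plain NumPy; the function names, the model, and the synthetic device data are all illustrative assumptions.

```python
import numpy as np

def local_train(w, X, y, lr=0.1, epochs=5):
    """Train a local copy of the global model on on-device data only."""
    w = w.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)   # full-batch least-squares gradient
        w -= lr * grad
    return w                                 # only parameters leave the device

def fed_avg(w_global, device_data, rounds=20):
    """Server aggregates device updates, weighted by local sample count."""
    for _ in range(rounds):
        updates, sizes = [], []
        for X, y in device_data:
            updates.append(local_train(w_global, X, y))
            sizes.append(len(y))
        sizes = np.array(sizes, dtype=float)
        w_global = sum(s * u for s, u in zip(sizes / sizes.sum(), updates))
    return w_global

# Synthetic on-device datasets drawn from a common linear model.
rng = np.random.default_rng(0)
w_true = np.array([2.0, -1.0])
devices = []
for _ in range(5):
    X = rng.normal(size=(40, 2))
    devices.append((X, X @ w_true + 0.01 * rng.normal(size=40)))

w = fed_avg(np.zeros(2), devices)
print(np.round(w, 2))
```

The aggregated model converges close to the ground-truth weights even though raw data never leaves any device, which is the core privacy argument of the abstract.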

Session Chair

Howard Yang (SUTD, Singapore)

Session TUT02

Tutorial: Towards Smart and Reconfigurable Environment: Intelligent Reflecting Surface Aided Wireless Networks

Conference: 9:00 AM — 12:15 PM KST · Local: May 24 Sun, 5:00 PM — 8:15 PM PDT

Rui Zhang (National University of Singapore, Singapore)

In this tutorial, we introduce a new wireless communication paradigm that employs a massive number of low-cost passive reflecting elements with independently controllable amplitudes and phases, named the intelligent reflecting surface (IRS), which can smartly reconfigure wireless signal propagation and realize 3D reflect beamforming for signal enhancement and/or interference suppression. We first present the signal and channel models of the IRS, taking into account its practical hardware implementation constraints. We then illustrate the main applications of the IRS in achieving spectrum- and energy-efficient as well as secure and sustainable wireless networks, and highlight its cost and performance advantages compared to existing technologies such as small-cell networks, massive MIMO, and active relaying. We also report state-of-the-art results from recently conducted experiments in both industry and academia. Next, we discuss the main challenges in analyzing, designing, and implementing IRS-aided wireless networks, including capacity characterization, joint active and passive beamforming optimization, channel acquisition, IRS deployment, and hardware imperfections. This is followed by several selected case studies of IRS-aided wireless system design that show its practical performance gains and draw useful insights. Finally, we discuss other extensions and point out promising directions for future research.
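
The reflect-beamforming idea can be sketched numerically. Under idealized assumptions (perfect channel knowledge, continuous phase shifts, unit reflection amplitude; these are simplifications, not the tutorial's models), each IRS element's phase is set so its reflected path adds coherently with the direct path:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 64                                              # number of reflecting elements
h = rng.normal(size=N) + 1j * rng.normal(size=N)    # BS -> IRS channel
g = rng.normal(size=N) + 1j * rng.normal(size=N)    # IRS -> user channel
h_d = rng.normal() + 1j * rng.normal()              # direct BS -> user path

# Co-phase each reflected path with the direct path: the n-th phase shift
# cancels the cascaded channel phase and adds the direct path's phase.
theta = np.angle(h_d) - np.angle(h * g)
effective = h_d + np.sum(h * g * np.exp(1j * theta))

# All paths now add coherently: |effective| = |h_d| + sum_n |h_n g_n|.
print(np.isclose(abs(effective), abs(h_d) + np.sum(np.abs(h * g))))
```

Since the coherent gain grows with the number of elements while each element is passive, this is the origin of the cost advantage over active relaying highlighted in the abstract.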

Session Chair

Rui Zhang (National University of Singapore, Singapore)

Session TUT03

Tutorial: Machine Learning for Future Wireless Networks

Conference: 9:00 AM — 12:15 PM KST · Local: May 24 Sun, 5:00 PM — 8:15 PM PDT

Kwang-Cheng Chen (University of South Florida, USA)

With the amazing advances of machine learning technology, a new technological paradigm is emerging. This tutorial presents a holistic view of machine learning and of the methodology for applying it to assist, enhance, and enable wireless networking functionalities. It further introduces a new research area: wireless networking for smart machines with artificial intelligence. A future network architecture that fully utilizes machine learning capabilities serves as the closing focus of this tutorial. The tutorial is organized in a vertical manner: each machine learning technique is introduced together with examples of its application and the criteria and conditions under which it applies.

Session Chair

Kwang-Cheng Chen (University of South Florida, USA)

Session TUT05

Tutorial: Moving Towards Zero-Touch Automation, A Key Enabler for 6G: The Challenges & Opportunities

Conference: 9:00 AM — 12:15 PM KST · Local: May 24 Sun, 5:00 PM — 8:15 PM PDT

Ali Imran (University of Oklahoma, USA), Muhammad Ali Imran (University of Glasgow, United Kingdom (Great Britain))

The compounding operational complexity, diverging service requirements, and exploding degrees of freedom in the hybrid terrestrial and aerial architectures being conceived for 6G, combined with steadily shrinking profit margins, make the technical and financial viability of future mobile networks hinge on achieving zero-touch automation. However, despite the recent success of AI in enabling automation in other domains, attempts at AI-powered zero-touch automation in mobile networks are hampered by two fundamental challenges. 1) Sparsity of training data: unlike many other native applications of AI, real cellular data for training AI is generally both scarce and sparse. This is because operators generally do not test a wide range of parameters on live networks, and whatever data they have cannot be extracted and shared easily. This limits the utility of some of the most powerful AI tools, such as deep neural networks (DNNs), for solving many practical problems in mobile networks. 2) Hyper-parameterization native to mobile networks: even before training can begin, designing and tuning the hyperparameters of an AI model to deliver reliable performance amid the dynamics that are the hallmark of mobile networks remains more of an art than a science. In the bulk of AI-based solutions for mobile networks in the literature, hyper-parameterization is either done through trial-and-error human effort, or already hyper-parameterized models are simply borrowed from other domains and then trained on mobile network data. Furthermore, the current approach to hyper-parameterization requires advanced expertise in both mobile network domain knowledge and machine learning, making it a generally unachievable task for an expert in either domain alone. Without addressing these two challenges explicitly and in a timely manner, the full potential of AI cannot be harnessed for mobile networks, despite the hype and hopes.
The goal of this tutorial is to first introduce the zero-touch automation framework and then provide an in-depth analysis of the sparsity and hyper-parameterization problems and their multi-faceted implications for the performance of AI-based solutions in mobile networks. Leveraging insights and the latest results from several ongoing projects focused on zero-touch automation, the rest of the tutorial focuses on a set of promising approaches to the data sparsity and hyperparameter design challenges. Approaches to the sparsity challenge include one-shot learning, inductive transfer learning, transductive transfer learning, unsupervised transfer learning, leveraging different types of network geometries, the use of generative adversarial networks (GANs), and novel methods for realistic synthetic data generation. For the hyper-parameterization problem, AutoML techniques such as neuro-evolutionary algorithms and reinforcement learning for designing and tuning DNNs, federated learning, and Bayesian optimization will be discussed. The tutorial will conclude by introducing new real problems of interest to the mobile industry that require AI-based solutions, along with potential solution approaches and opportunities, to trigger the much-needed focused research effort in mobile-network-native AI and ultimately enable zero-touch automation.
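
As a toy illustration of automated hyperparameter tuning (random search standing in for the Bayesian optimization and AutoML techniques the tutorial covers), the sketch below searches a synthetic KPI surface; the objective function, its optimum, and the parameter ranges are invented purely for illustration:

```python
import numpy as np

def kpi(lr, width):
    """Synthetic stand-in for a network KPI, peaking at lr=0.1, width=64."""
    return -((np.log10(lr) + 1) ** 2 + ((width - 64) / 64) ** 2)

rng = np.random.default_rng(2)
best, best_cfg = -np.inf, None
for _ in range(200):
    lr = 10 ** rng.uniform(-4, 0)        # log-uniform learning rate
    width = int(rng.integers(8, 256))    # hidden-layer width
    score = kpi(lr, width)
    if score > best:                     # keep the best configuration found
        best, best_cfg = score, (lr, width)

print(best_cfg)
```

A Bayesian optimizer would replace the uniform sampling with a surrogate model that proposes promising configurations, typically reaching the same quality in far fewer trials; the point here is only the shape of the search loop.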

Session Chair

Ali Imran (University of Oklahoma, USA)

Session TUT06

Tutorial: B5G: A New Frontier for Non-Orthogonal Multiple Access

Conference: 9:00 AM — 12:15 PM KST · Local: May 24 Sun, 5:00 PM — 8:15 PM PDT

Zhiguo Ding (University of Manchester, United Kingdom (Great Britain))

Non-orthogonal multiple access (NOMA) is an essential enabling technology for future wireless networks to meet heterogeneous demands for low latency, high reliability, massive connectivity, improved fairness, and high throughput. The key idea behind NOMA is to serve multiple users in the same resource block, such as a time slot, subcarrier, or spreading code. The NOMA principle provides a general framework in which various recently proposed 5G multiple access techniques can be viewed as special cases. Recent demonstrations by industry show that the use of NOMA can significantly improve the spectral efficiency of mobile networks. Because of its superior performance, NOMA has also recently been included in 3GPP Releases 14 and 15 for downlink transmission, proposed for Release 16 for uplink transmission, and included in the next-generation digital TV standard ATSC (Advanced Television Systems Committee) 3.0. This tutorial provides an overview of the latest research results and innovations in NOMA technologies as well as their applications. Future research challenges regarding NOMA in B5G and beyond are also presented.
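
The "multiple users in one resource block" principle can be illustrated with a textbook two-user downlink power-domain example (a sketch, not material from the tutorial): superposition coding with successive interference cancellation (SIC), compared against an orthogonal baseline. The power split and channel gains are illustrative.

```python
import numpy as np

def noma_rates(g_near, g_far, alpha=0.2, snr=100.0):
    """alpha: power fraction for the near (strong) user; snr = P/N0."""
    # Far user decodes its own signal, treating the near user's as noise.
    r_far = np.log2(1 + (1 - alpha) * snr * g_far / (alpha * snr * g_far + 1))
    # Near user first removes the far user's signal via SIC, then decodes.
    r_near = np.log2(1 + alpha * snr * g_near)
    return r_near, r_far

def oma_rates(g_near, g_far, snr=100.0):
    """Orthogonal baseline: each user gets half the resource block."""
    return 0.5 * np.log2(1 + snr * g_near), 0.5 * np.log2(1 + snr * g_far)

g_near, g_far = 1.0, 0.05               # channel gains (near user much stronger)
r_noma = sum(noma_rates(g_near, g_far))
r_oma = sum(oma_rates(g_near, g_far))
print(round(r_noma, 2), round(r_oma, 2))
```

With disparate channel gains, the NOMA sum rate exceeds the orthogonal one, which is the spectral-efficiency gain the abstract refers to.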

Session Chair

Dr. Gyeongrae Im (ETRI, Korea)

Session TUT04

Tutorial: Wireless Transmission for Advanced Internet of Things: A Unifying Data-Oriented Approach

Conference: 2:00 PM — 5:15 PM KST · Local: May 24 Sun, 10:00 PM — 1:15 AM PDT

Hong-Chuan Yang (University of Victoria, Canada), Mohamed-Slim Alouini (King Abdullah University of Science and Technology (KAUST), Saudi Arabia)

Wireless communication systems will play an essential role in data transmission for the future Internet of Things (IoT). The design and optimization of wireless transmission strategies for diverse IoT applications, which generate data of variable sizes with dramatically different quality-of-service requirements, are of critical contemporary interest. In this tutorial, we present a unique data-oriented approach to wireless transmission system design, specifically targeting vertical IoT applications that demand ultra-reliable low-latency communication and extremely high energy efficiency. We introduce novel data-oriented metrics to characterize theoretical performance limits for various transmission scenarios. These performance metrics are also applied to the analysis and design of practical transmission schemes, and the analysis is generalized to cognitive secondary transmission. The data-oriented approach offers important new insights and leads to interesting new research directions. Through this tutorial, attendees can obtain a brand-new perspective on the analysis and optimization of wireless transmission technologies for advanced IoT applications.

Session Chair

Hong-Chuan Yang (University of Victoria, Canada)

Session TUT07

Tutorial: UAV Communications in 5G and Beyond: Integration of Sensing, Control, and Learning

Conference: 2:00 PM — 5:15 PM KST · Local: May 24 Sun, 10:00 PM — 1:15 AM PDT

Lingyang Song (Peking University, China), Zhu Han (University of Houston, USA), Hongliang Zhang (University of Houston, USA)

Emerging unmanned aerial vehicles (UAVs) have been playing an increasing role in military, public, and civil applications. Very recently, 3GPP approved a study item on enhanced support for seamlessly integrating UAVs into future cellular networks. Unlike terrestrial cellular networks, UAV communications have many distinctive features, such as highly dynamic network topologies and weakly connected communication links. In addition, they suffer from practical constraints such as battery power, no-fly zones, etc. As such, many standards, protocols, and design methodologies used in terrestrial wireless networks are not directly applicable to airborne communication networks. It is therefore essential to develop new communication, signal processing, and optimization techniques that support ultra-reliable, real-time sensing applications while enabling high-data-rate transmissions to assist terrestrial communications in LTE. To integrate UAVs into cellular networks, one needs to consider the following two main scenarios of UAV applications.

First, dedicated UAVs, also called drones, can be used as communication platforms, serving as wireless access points or relay nodes, to further assist terrestrial communications. This type of application can be referred to as UAV-assisted cellular communications. UAV-assisted cellular communications have numerous use cases, including traffic offloading, wireless backhauling, swift service recovery after natural disasters, emergency response, search and rescue, information dissemination/broadcasting, and data collection from ground sensors for machine-type communications. However, unlike in traditional cellular networks, planning the time-variant placements of UAVs serving as base stations (BSs) or relays is very challenging due to the complicated 3D propagation environment as well as many other practical constraints, such as power and flying speed. In addition, spectrum sharing with existing cellular networks is another interesting topic to investigate.
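
As a toy version of the UAV-BS placement problem mentioned above (not the tutorial's method), the sketch below grid-searches a 2D hover position that maximizes the worst-case user SNR under an idealized free-space path-loss model; the altitude, power, path-loss constants, and user layout are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(5)
users = rng.uniform(0, 1000, size=(10, 2))   # ground user positions in a 1 km cell (m)
alt = 100.0                                  # fixed UAV altitude (m)

def min_snr_db(pos):
    """Worst-case received SNR (dB) over all users for a UAV hovering at `pos`."""
    d = np.sqrt(np.sum((users - pos) ** 2, axis=1) + alt ** 2)  # 3D distances
    # Tx power (30 dBm) minus an illustrative log-distance path loss,
    # minus a -100 dBm noise floor.
    snr = 30.0 - (20 * np.log10(d) + 38.5) + 100.0
    return snr.min()

# Exhaustive search over a 51 x 51 grid of candidate hover positions.
grid = np.linspace(0, 1000, 51)
best = max(((x, y) for x in grid for y in grid), key=min_snr_db)
print(np.round(best, 1), round(min_snr_db(np.array(best)), 1))
```

Real deployments replace this brute-force search with the optimization, game-theoretic, and learning methods the tutorial covers, and must additionally handle mobility, energy, and no-fly-zone constraints.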

The second type of application exploits UAVs for sensing purposes, owing to their on-demand flexible deployment, larger service coverage compared with conventional fixed sensor nodes, and ability to hover. Specifically, UAVs equipped with cameras or sensors have come into our daily lives to execute critical real-time sensing tasks, such as smart agriculture, security monitoring, forest fire detection, and traffic surveillance. Due to the limited computation capability of UAVs, the real-time sensory data need to be transmitted to the BS for real-time processing. In this regard, cellular networks must be committed to supporting data transmission for UAVs, which we refer to as cellular-assisted UAV sensing. Nevertheless, to support real-time sensing streams, it is desirable to design joint sensing and communication protocols, develop novel beamforming and estimation algorithms, and study efficient distributed resource optimization methods.

The aim of this tutorial is to bring together control and signal processing engineers, computer and information scientists, applied mathematicians and statisticians, as well as systems engineers, to carve out the role that analytical and experimental engineering has to play in UAV research and development. The tutorial emphasizes UAV technologies and applications for cellular networks, with four main objectives. The first is to provide an introduction to the UAV paradigm from a 5G-and-beyond communication perspective. The second is to introduce, in a comprehensive way, the key methods for UAV applications, including optimization, game theory, and graph theory. The third is to discuss UAV-assisted cellular communications. The fourth is to present the state of the art in cellular-network-assisted UAV sensing. Many examples will be illustrated in detail to provide wide scope for a general audience.

Session Chair

Hongliang Zhang (University of Houston, USA)

Session TUT08

Tutorial: Orthogonal Time Frequency Space (OTFS) Modulation and Applications

Conference: 2:00 PM — 5:15 PM KST · Local: May 24 Sun, 10:00 PM — 1:15 AM PDT

Emanuele Viterbo and Yi Hong (Monash University, Australia)

Emerging mass transportation systems, such as self-driving cars, high-speed trains, drones, flying cars, and supersonic flight, will challenge the design of future wireless networks due to high-mobility environments: a large number of high-mobility users require high data rates and low latencies. The physical-layer modulation technique is a key design component for meeting the system requirements of high mobility. Currently, orthogonal frequency division multiplexing (OFDM) is the modulation scheme deployed in 4G long term evolution (LTE) mobile systems, where the wireless channel typically exhibits time-varying multipath fading. OFDM can achieve near-capacity performance over a doubly dispersive channel only when the Doppler effect is low, and suffers heavy degradation under the high Doppler conditions typically found in high-mobility environments. Orthogonal time frequency space (OTFS) modulation was recently proposed by Hadani et al. at WCNC'17, San Francisco, and was shown to provide significant advantages over OFDM in doubly dispersive channels. OTFS multiplexes each information symbol over 2D orthogonal basis functions specifically designed to combat the dynamics of time-varying multipath channels. As a result, all information symbols experience a constant flat-fading equivalent channel. OTFS is only in its infancy, leaving many opportunities for significant developments on both practical and theoretical fronts.

Session Chair

Emanuele Viterbo (Monash University, Australia)

Session TUT09

Tutorial: URLLC for 5G and Beyond: Physical, MAC and Network Design and Solutions

Conference: 2:00 PM — 5:15 PM KST · Local: May 24 Sun, 10:00 PM — 1:15 AM PDT

Branka Vucetic, Yonghui Li and Mahyar Shirvanimoghaddam (University of Sydney, Australia), Rana Abbas (The University of Sydney, Australia), Changyang She (University of Sydney, Australia)

The world is currently witnessing the rise of many mission-critical applications, such as tele-surgery, intelligent transportation, industrial automation, virtual and augmented reality, and vehicular communications. Some of these applications will be enabled by the fifth generation of cellular networks (5G), which will provide the required ultra-reliable low-latency communication (URLLC). However, guaranteeing these stringent reliability and end-to-end latency requirements continues to prove quite challenging, due to the significant paradigm shifts required in both the theoretical fundamentals of wireless communications and its design principles. In this tutorial, we cover the challenges and potential solutions for 5G and beyond to support URLLC, in terms of error control coding for improving reliability, channel access protocols for reducing latency, and multi-connectivity for improving network availability.
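
The short-packet rate penalty at the heart of URLLC can be quantified with the standard finite-blocklength normal approximation R(n, eps) ≈ C − sqrt(V/n) · Q⁻¹(eps) (a well-known result, not specific to this tutorial); the sketch below evaluates it for an AWGN channel with illustrative parameters.

```python
import math
from statistics import NormalDist

def short_packet_rate(snr, n, eps):
    """Normal approximation to the maximal coding rate (bits/channel use).

    snr: linear SNR; n: blocklength in channel uses; eps: target error prob.
    """
    C = math.log2(1 + snr)                                    # Shannon capacity
    V = (1 - 1 / (1 + snr) ** 2) * math.log2(math.e) ** 2     # channel dispersion
    q_inv = NormalDist().inv_cdf(1 - eps)                     # Q^{-1}(eps)
    return C - math.sqrt(V / n) * q_inv

# Rate of a 200-symbol packet at 10 dB SNR with eps = 1e-5, vs. capacity.
snr = 10.0
r = short_packet_rate(snr, 200, 1e-5)
print(round(r, 3), round(math.log2(1 + snr), 3))
```

The gap between the two printed numbers is the price of reliability at short blocklength; it shrinks as 1/sqrt(n), which is why URLLC coding and access design cannot rely on capacity-based reasoning alone.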

Session Chair

Branka Vucetic (University of Sydney, Australia)

Session TUT10

Tutorial: NOMA-Based Random Access for Massive MTC in 5G

Conference: 2:00 PM — 5:15 PM KST · Local: May 24 Sun, 10:00 PM — 1:15 AM PDT

Jinho Choi (Deakin University, Australia)

Machine-type communication (MTC) has become a key element of the Internet of Things (IoT), as it enables support for massive IoT connectivity in 5th-generation (5G) and future wireless systems. Due to sparse device activity, uncoordinated transmission schemes (e.g., random access) are adopted by most existing MTC schemes in standards. In general, since the performance of MTC is limited by the system bandwidth, a wide system bandwidth is required to support a large number of MTC/IoT devices. To increase the number of MTC/IoT devices supported within a given system bandwidth, non-orthogonal multiple access (NOMA), which has been extensively studied to improve spectral efficiency for conventional or human-type communication (HTC), can be applied to MTC. In this tutorial, we demonstrate how the notion of NOMA can be applied to MTC in 5G to support a large number of IoT devices. To this end, we present various NOMA-based random access schemes for MTC and explain how they can be designed and analyzed.
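
A minimal Monte Carlo sketch of the idea (illustrative, not one of the schemes presented in the tutorial): slotted ALOHA in which each active device transmits at one of two receive power levels, so the receiver can recover two colliding packets in a slot by successive interference cancellation (SIC). All parameters are invented for illustration.

```python
import numpy as np

def throughput(p_tx, n_dev, n_slots, levels, rng):
    """Average packets decoded per slot over a Monte Carlo run."""
    decoded = 0
    for _ in range(n_slots):
        active = rng.random(n_dev) < p_tx        # which devices transmit
        k = active.sum()
        if k == 0:
            continue
        picks = rng.integers(0, levels, size=k)  # power level chosen per device
        counts = np.bincount(picks, minlength=levels)
        # SIC decodes from the highest power level down; a level holding
        # exactly one packet is decodable, a collision stops the process.
        for c in counts[::-1]:
            if c == 1:
                decoded += 1
            elif c > 1:
                break
    return decoded / n_slots

rng = np.random.default_rng(4)
aloha = throughput(0.05, 20, 20000, levels=1, rng=rng)   # plain slotted ALOHA
noma = throughput(0.05, 20, 20000, levels=2, rng=rng)    # two-level NOMA variant
print(round(aloha, 3), round(noma, 3))
```

With one level this reduces to plain slotted ALOHA; adding a second power level lets some two-packet collisions be resolved, raising throughput in the same bandwidth, which is the motivation stated in the abstract.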

Session Chair

Jinho Choi (Deakin University, Australia)

