
Performance Enhancement of Random Access for Low Latency Communication in IoT using Machine Learning


Almahjoub, Alhusein (2022) Performance Enhancement of Random Access for Low Latency Communication in IoT using Machine Learning. PhD thesis, Concordia University.

Text (application/pdf)
Almahjoub_PhD_F2022.pdf - Accepted Version
Available under License Spectrum Terms of Access.
6MB

Abstract

In today’s fast-moving world, where everything is moving toward autonomy, the Internet of Things (IoT) provides the platform on which this feat can be achieved. With the increasing data rates of Long-Term Evolution Advanced (LTE-A) and 5G, IoT devices are already on the market, sending event-triggered or autonomous data to other IoT devices or to servers. With the growing number of IoT devices, it is imperative to use an infrastructure network such as LTE-A or 5G. The first step in gaining access to the network is the random access process, after which scheduled access starts. If a bottleneck appears at this step, the throughput of scheduled access is also affected. With a large number of IoT devices and limited resources, collisions are bound to happen, which translates into a waste of resources and energy. My Ph.D. research focuses on this issue.

Preliminary studies of the random access process in LTE-A and of techniques already employed to reduce collisions have been carried out. Machine learning (ML) techniques are also studied, and one of them, Naïve Bayesian estimation, is applied to estimate the number of IoT devices. The estimation is done to maximize the number of successes and reduce the end-to-end delay.

Grant-free random access is attracting considerable attention due to its low access delay, as devices need not exchange messages with the eNodeB before transmitting. However, under heavier traffic its performance may be worse than that of traditional random access. We take up this issue and devise an algorithm to realize grant-free transmission such that the success probability is maximized and the delay is minimized. We use deep learning to estimate the channel outcome, which helps achieve a high success probability.

In the 5G era, the deployment of eNodeBs can be quite dense to increase throughput. However, this comes with the issue of non-uniform traffic at each eNodeB, as the traffic generated by the devices is random in nature. Based on ML, we devise an algorithm to select a lightly loaded eNodeB such that the probability of success is maximized. Simulations of the problems are carried out using MATLAB to verify the theoretical results.
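The Bayesian estimation of the number of contending devices mentioned in the abstract can be illustrated with a minimal sketch. This is not the thesis's actual estimator: the function names, the uniform prior, and the approximation that the M preamble outcomes are independent are all assumptions made here for illustration. It assumes the eNodeB can observe, per random-access slot, how many of the M preambles were idle, chosen by exactly one device (a success), or chosen by several (a collision), and returns the maximum-a-posteriori device count:

```python
import math

def outcome_probs(n, m):
    """Per-preamble outcome probabilities when n devices each pick
    one of m preambles uniformly at random."""
    q = (1.0 - 1.0 / m) ** (n - 1)   # prob. none of the other n-1 devices picked it
    p_idle = q * (1.0 - 1.0 / m)     # = (1 - 1/m)^n: no device picked it
    p_succ = (n / m) * q             # exactly one device picked it
    p_coll = max(1.0 - p_idle - p_succ, 1e-12)  # clamp to keep log() finite
    return p_idle, p_succ, p_coll

def map_device_count(idle, succ, coll, m, n_max=200):
    """MAP estimate of the number of contending devices under a uniform
    prior on 1..n_max, treating the m preamble outcomes as independent
    (a standard approximation for this kind of load estimator)."""
    best_n, best_ll = 1, float("-inf")
    for n in range(1, n_max + 1):
        p_i, p_s, p_c = outcome_probs(n, m)
        ll = idle * math.log(p_i) + succ * math.log(p_s) + coll * math.log(p_c)
        if ll > best_ll:
            best_ll, best_n = ll, n
    return best_n
```

For example, with M = 54 preambles and observed counts (idle, success, collision) = (31, 17, 6), the estimator returns 30, the load that would produce those outcomes on average; the estimate can then drive back-off or resource-allocation decisions to maximize successes and reduce delay.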

Divisions: Concordia University > Gina Cody School of Engineering and Computer Science > Electrical and Computer Engineering
Item Type: Thesis (PhD)
Authors: Almahjoub, Alhusein
Institution: Concordia University
Degree Name: Ph.D.
Program: Electrical and Computer Engineering
Date: 5 September 2022
Thesis Supervisor(s): Qiu, Dongyu
ID Code: 991062
Deposited By: Alhusein Almahjoub
Deposited On: 21 Jun 2023 14:12
Last Modified: 01 Oct 2024 00:00
All items in Spectrum are protected by copyright, with all rights reserved. The use of items is governed by Spectrum's terms of access.


