
Cache-Aided Delivery Networks with Correlated Content in a Shared Cache Framework


Merikhi, Behnaz ORCID: https://orcid.org/0000-0002-3441-2949 (2022) Cache-Aided Delivery Networks with Correlated Content in a Shared Cache Framework. PhD thesis, Concordia University.

Text (application/pdf): Merikhi_PhD_S2023.pdf - Accepted Version
Available under License Spectrum Terms of Access.
5MB

Abstract

Internet traffic is growing exponentially due to the penetration of powerful internet-connected devices and cutting-edge technologies. Additionally, the rise in internet usage has coincided with a shift in the nature of data traffic from voice-based to content-based usage, putting significant stress on delivery networks. Despite the infrastructural advancements in communication networks over the past few years, content delivery networks (CDNs) still struggle to keep up with high delivery data rates and suffer from an imbalanced network load between off-peak and peak hours.
In this regard, content caching has emerged as an efficient technique to combat the high delivery data rates and maintain a balanced network load while improving the quality of service (QoS) by storing some popular content close to the end users. Caching networks operate in two phases: the placement phase, carried out during off-peak hours before users reveal their demands, and the delivery phase, carried out during peak hours once the users' demands are revealed to the server. As the server is unaware of the demands during the placement phase, this phase must be designed carefully to minimize the delivery rate regardless of the content requested during peak hours.
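As a point of reference only, the following minimal Python sketch reproduces the classic coded-caching memory-rate trade-off of Maddah-Ali and Niesen; it is not the scheme developed in this thesis, and the library size N, number of users K, and cache sizes M are assumed values. It illustrates how a carefully designed placement phase lowers the peak delivery rate compared with uncoded placement for the same memory budget.

```python
# Illustrative sketch (not the scheme proposed in this thesis): the classic
# coded-caching memory-rate trade-off, showing how a carefully designed
# placement phase lowers the peak delivery rate compared with simply caching
# an uncoded fraction of every file.
import numpy as np

N = 20   # number of equal-size files in the library (assumed)
K = 10   # number of users, each with cache size M (assumed)

# Cache sizes at the corner points M = t*N/K, t = 0..K, measured in files.
t = np.arange(K + 1)
M = t * N / K

# Peak delivery rate (in file units) with uncoded placement: each user still
# needs the uncached fraction (1 - M/N) of its requested file.
rate_uncoded = K * (1 - M / N)

# Peak delivery rate with coded placement/delivery: the coded multicast gain
# 1/(1 + K*M/N) reduces the rate for the same memory budget.
rate_coded = K * (1 - M / N) / (1 + K * M / N)

for m, ru, rc in zip(M, rate_uncoded, rate_coded):
    print(f"M = {m:5.1f} files | uncoded rate = {ru:5.2f} | coded rate = {rc:5.2f}")
```

The gap between the two curves is precisely the benefit of designing the placement phase before the demands are known, which motivates the placement strategies studied in this dissertation.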
This dissertation studies cache-aided delivery networks with correlated content in a shared cache framework. A shared cache framework is beneficial in current and next-generation wireless networks, as it provides a local cache to all users within small base stations (SBSs), relieving strain on the backhaul. Furthermore, in many practical applications the library of a caching network consists of content with a high degree of similarity; this similarity among library content can therefore be leveraged to further reduce the delivery rate in such networks.
In this dissertation, we look at the proposed caching network from an information-theoretic perspective and formulate it as a distributed source coding problem with side information at the decoder. The critical question then arises: what should be cached as side information to efficiently reduce the delivery rate of the network?
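The following toy Python sketch, using an assumed joint distribution that does not appear in the thesis, illustrates why cached side information helps: with Y available at the decoder, X can in principle be conveyed at rate H(X|Y) rather than H(X).

```python
# Toy illustration (assumed joint distribution, not taken from the thesis):
# with side information Y available at the decoder, the source X can be
# conveyed at rate H(X|Y) instead of H(X), which is the intuition behind
# caching correlated side information to cut the delivery rate.
import numpy as np

# Joint pmf p(x, y) of two correlated binary sources (rows: x, columns: y).
p_xy = np.array([[0.40, 0.10],
                 [0.10, 0.40]])

def entropy(p):
    """Shannon entropy in bits of a probability vector/matrix."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

H_xy = entropy(p_xy)                 # joint entropy H(X, Y)
H_x = entropy(p_xy.sum(axis=1))      # marginal entropy H(X)
H_y = entropy(p_xy.sum(axis=0))      # marginal entropy H(Y)
H_x_given_y = H_xy - H_y             # conditional entropy H(X|Y)

print(f"H(X)   = {H_x:.3f} bits  (rate without side information)")
print(f"H(X|Y) = {H_x_given_y:.3f} bits  (rate with Y cached at the decoder)")
```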
To answer this question, we propose an automatic clustering scheme using artificial intelligence (AI)-based optimization techniques to identify the selected side information for the entire library. We comprehensively evaluate the performance of the general clustering framework in a separate chapter by considering different datasets and distance measures. This general framework enables us to develop two novel clustering schemes, used as part of the placement phase of the proposed caching networks under different settings throughout this study, that account for both the similarity and the popularity of the library content.
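As a loose illustration only, the sketch below clusters a synthetic library of feature vectors with a plain k-means loop and Euclidean distance; the thesis's AI-based optimization techniques, datasets, and distance measures are not reproduced here, and all names and parameters in the code are assumptions.

```python
# Minimal k-means sketch (synthetic features, plain Euclidean distance): it
# only illustrates grouping similar library files into clusters so that one
# representative per cluster can serve as cached side information; the thesis
# uses AI-based optimization and other distance measures not shown here.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic library: 12 "files" described by 8-dimensional feature vectors,
# drawn around 3 hidden prototypes so that similar files cluster together.
prototypes = rng.normal(size=(3, 8))
files = np.vstack([p + 0.1 * rng.normal(size=(4, 8)) for p in prototypes])

def kmeans(x, k, iters=50):
    centers = x[rng.choice(len(x), k, replace=False)]
    for _ in range(iters):
        # Assign each file to its nearest cluster center.
        labels = np.argmin(np.linalg.norm(x[:, None] - centers[None], axis=2), axis=1)
        # Recompute each center as the mean of its assigned files,
        # keeping the old center if a cluster happens to be empty.
        new_centers = []
        for j in range(k):
            members = x[labels == j]
            new_centers.append(members.mean(axis=0) if len(members) else centers[j])
        centers = np.array(new_centers)
    return labels

labels = kmeans(files, k=3)
for j in range(3):
    print(f"cluster {j}: files {np.where(labels == j)[0].tolist()}")
```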
Upon identifying the selected side information for such networks, the next questions are how the side information should be placed into the caches and, consequently, what the delivery strategy for this placement scheme should be. We answer these questions by considering three caching networks: first, a network with a single shared cache under lossy caching; next, a network with multiple shared caches under uniform popularity; and finally, a network with multiple shared caches under non-uniform preferences. In such networks, we address the placement and delivery strategy to show the trade-off between the delivery rate and the memory size of the system. We calculate the peak and expected rates of the studied networks by considering the rate-distortion function and the caching strategy. We also introduce an optimum library partitioning formulated to minimize the peak delivery rate of the system.
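For intuition about the rate-memory trade-off, the following baseline sketch (assumed Zipf popularity profile and most-popular-files placement, not the library partitioning or placement proposed in the thesis) shows how the peak and expected delivery rates behave as the shared cache size grows.

```python
# Baseline sketch (most-popular-files caching under an assumed Zipf popularity
# profile; not the partitioning scheme proposed in the thesis): it shows how
# the peak and expected delivery rates trade off against the cache size M.
import numpy as np

N, alpha = 100, 0.8                       # library size and Zipf exponent (assumed)
pop = 1.0 / np.arange(1, N + 1) ** alpha
pop /= pop.sum()                          # popularity of files 1..N (descending)

for M in (0, 10, 25, 50, 100):            # number of whole files held in the cache
    uncached = pop[M:]                    # requests for these files hit the server
    expected_rate = uncached.sum()        # expected fraction of a file per request
    peak_rate = 1.0 if M < N else 0.0     # worst case: an uncached file is requested
    print(f"M = {M:3d} files | peak rate = {peak_rate:.1f} | expected rate = {expected_rate:.3f}")
```

The proposed schemes improve on this kind of baseline by exploiting correlation among library files in addition to their popularity.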
The performance analysis and extensive simulations of the proposed solution confirm that our scheme provides a considerable boost in network efficiency compared to legacy caching schemes due to our problem formulation and the careful extraction of side information during the placement phase.

Divisions: Concordia University > Gina Cody School of Engineering and Computer Science > Electrical and Computer Engineering
Item Type: Thesis (PhD)
Authors: Merikhi, Behnaz
Institution: Concordia University
Degree Name: Ph.D.
Program: Electrical and Computer Engineering
Date: 11 October 2022
Thesis Supervisor(s): Soleymani, Mohammad Reza
ID Code: 991430
Deposited By: Behnaz Merikhi
Deposited On: 21 Jun 2023 14:14
Last Modified: 21 Jun 2023 14:14
All items in Spectrum are protected by copyright, with all rights reserved. The use of items is governed by Spectrum's terms of access.


