AN EFFICIENT HYBRID FRAMEWORK FOR OPTIMAL RESOURCE ALLOCATION IN IOT-FOG-CLOUD SYSTEM

Priyadharshini A
Dhinakaran S

Abstract

The amount of data produced by intelligent devices has grown significantly with the emergence of Internet of Things (IoT) systems and related technologies. Cloud computing offers virtually unlimited processing and storage capacity for analyzing and storing this data, but it suffers from a lack of geographic awareness, excessive energy usage, and high transmission delay, making it poorly suited to the delay-sensitive data these devices generate. The fog paradigm was therefore developed, enabling data to be analyzed in the proximity of IoT devices; however, its capacity constraints make it unsuitable for analyzing massive volumes of data. To complete delay-sensitive tasks efficiently while handling the massive amount of data generated, the fog and cloud paradigms must be integrated toward a common goal. This article proposes an effective Resource Allocation (RA) technique that utilizes fog and cloud resources to complete delay-sensitive tasks and handle the massive amount of data generated by IoT devices. Initially, tasks in the arrival queue are categorized and assigned to appropriate resources in the cloud and fog layers based on the task guarantee ratio. A Deep Neural Network (DNN) classifier is then trained on this historical allocation data to categorize newly arriving tasks and assign suitable resources for execution in their respective layers. In addition, optimal resource allocation within the fog and cloud layers is achieved using the Groupers and Moray Eels (GME) optimization algorithm, which effectively reduces the system's execution time and latency. Extensive simulations demonstrate that the DNN-GME algorithm outperforms existing algorithms in IoT-fog-cloud settings.
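
The abstract outlines a two-stage allocation pipeline: a rule-based split of arriving tasks between the fog and cloud layers based on the task guarantee ratio, followed by a DNN classifier trained on the resulting allocation history, with the GME metaheuristic selecting concrete resources within each layer. The sketch below illustrates only the first stage under stated assumptions; the Task fields, capacity figures, and guarantee_ratio formula are illustrative placeholders, not the paper's actual model.

```python
# Minimal sketch of the first allocation stage described in the abstract.
# Names (Task, guarantee_ratio, FOG, CLOUD) and all numeric parameters are
# illustrative assumptions, not the authors' implementation.

from dataclasses import dataclass
import random

FOG, CLOUD = "fog", "cloud"

@dataclass
class Task:
    size_mi: float        # workload in million instructions (assumed unit)
    deadline_ms: float    # delay tolerance of the task
    data_mb: float        # input data volume

def guarantee_ratio(task: Task, layer_capacity_mips: float, link_delay_ms: float) -> float:
    """Estimated deadline divided by expected execution plus transfer time.

    A ratio >= 1 means the layer is expected to meet the task's deadline.
    This is a simplified stand-in for the task guarantee ratio in the paper.
    """
    exec_ms = task.size_mi / layer_capacity_mips * 1000.0
    return task.deadline_ms / (exec_ms + link_delay_ms)

def assign_layer(task: Task) -> str:
    """Stage 1: rule-based split of an arriving task between fog and cloud."""
    fog_ratio = guarantee_ratio(task, layer_capacity_mips=1500, link_delay_ms=5)
    cloud_ratio = guarantee_ratio(task, layer_capacity_mips=20000, link_delay_ms=80)
    # Prefer the fog layer when it can still guarantee the deadline; fall back
    # to the cloud for heavy tasks it can absorb, and to fog otherwise.
    if fog_ratio >= 1.0:
        return FOG
    return CLOUD if cloud_ratio >= 1.0 else FOG

# Stage 2 (not shown): a DNN trained on the (task -> layer) history produced
# here would classify new arrivals, and the GME optimizer would choose the
# concrete resource inside the selected layer.
if __name__ == "__main__":
    history = []
    for _ in range(5):
        t = Task(size_mi=random.uniform(100, 5000),
                 deadline_ms=random.uniform(20, 500),
                 data_mb=random.uniform(0.1, 50))
        history.append((t, assign_layer(t)))
    for t, layer in history:
        print(f"{t} -> {layer}")
```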

