Comparative study of Job Schedulers in Hadoop Environment

Arpitha HV, Shoney Sebastian


Hadoop is a framework for Big Data processing in distributed applications. A Hadoop cluster is built to run data-intensive distributed applications. The Hadoop Distributed File System (HDFS) is the primary storage layer for Big Data, and MapReduce is the model used to execute the tasks of a job. Task assignment is performed by schedulers, which ensure the fair allocation of resources among users. When a user submits a job, it is placed in a job queue; from there, the job is divided into tasks and distributed to various nodes. Proper task assignment reduces job completion time and thereby improves job performance. This paper presents a comparison of different Hadoop job schedulers.
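As context for the schedulers compared in this paper, the active scheduler in a YARN-based Hadoop cluster is selected through the ResourceManager configuration. A minimal sketch, assuming a standard Hadoop 2.x/3.x deployment where `yarn-site.xml` is editable, might look like this (the Capacity Scheduler is Hadoop's default; the class name shown for the Fair Scheduler is the stock alternative):

```xml
<!-- yarn-site.xml: choose the ResourceManager's scheduler implementation -->
<configuration>
  <property>
    <name>yarn.resourcemanager.scheduler.class</name>
    <!-- Default: CapacityScheduler. To switch to the Fair Scheduler, use:
         org.apache.hadoop.yarn.server.resourcemanager.scheduler.fair.FairScheduler -->
    <value>org.apache.hadoop.yarn.server.resourcemanager.scheduler.capacity.CapacityScheduler</value>
  </property>
</configuration>
```

Queue-level policies (capacities, weights, user limits) are then defined in the scheduler's own configuration file, such as `capacity-scheduler.xml` or `fair-scheduler.xml`.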

Keywords: Hadoop, HDFS, MapReduce, Scheduling, FIFO Scheduling, Fair Scheduling, Capacity Scheduling


Copyright (c) 2017 International Journal of Advanced Research in Computer Science