Comparative Analysis of Time Series Data Algorithm Complexity for Forecasting Dengue Fever Occurrences

Agus Qomaruddin Munir
Retantyo Wardoyo

Abstract

Algorithm complexity is a standard measure for testing whether a particular algorithm achieves efficient execution time when it is implemented in a programming language. The complexity of an algorithm is divided into two parts: time complexity and space complexity. Time complexity is determined by the number of computation steps needed to run the algorithm as a function of the input size n, whereas space complexity is measured by the memory used by the data structures in the algorithm, also as a function of n. The objective of this study is to compare the time complexity of three forecasting algorithms through empirical analysis of running time when each algorithm is implemented in a programming language or a particular application. The algorithms tested in this study were Linear Regression, SMO Regression, and Multilayer Perceptron, evaluated using the Weka application. The analysis and test results for all three algorithms showed that running time was significantly influenced by the complexity of the algorithm and by the size of the input data set. Furthermore, the running time analysis showed that the most stable rates were achieved by, in order, SMO Regression, Linear Regression, and Multilayer Perceptron. In the Big-O analysis, Linear Regression and SMO Regression had almost the same time complexity, while Multilayer Perceptron differed considerably and tended to be more complex.

Keywords: time complexity, time series data, linear regression, SMO Regression, multilayer perceptron, Big-O
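As an illustration of the empirical running-time measurement described in the abstract, the sketch below trains Weka's LinearRegression, SMOreg (SMO Regression), and MultilayerPerceptron classifiers on a single data set and reports the wall-clock training time of each. The ARFF file name (dengue.arff), the choice of the last attribute as the forecast target, and the use of System.nanoTime() are illustrative assumptions, not the authors' exact experimental setup.

```java
import weka.classifiers.Classifier;
import weka.classifiers.functions.LinearRegression;
import weka.classifiers.functions.MultilayerPerceptron;
import weka.classifiers.functions.SMOreg;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class RunningTimeComparison {
    public static void main(String[] args) throws Exception {
        // Hypothetical dengue-occurrence time series exported as ARFF;
        // the last attribute is assumed to be the numeric forecast target.
        Instances data = DataSource.read("dengue.arff");
        data.setClassIndex(data.numAttributes() - 1);

        Classifier[] models = {
            new LinearRegression(),       // least-squares linear regression
            new SMOreg(),                 // SMO-based support vector regression
            new MultilayerPerceptron()    // backpropagation-trained neural network
        };

        for (Classifier model : models) {
            long start = System.nanoTime();
            model.buildClassifier(data);  // training dominates the running time
            long elapsedMs = (System.nanoTime() - start) / 1_000_000;
            System.out.printf("%-25s %d ms%n",
                    model.getClass().getSimpleName(), elapsedMs);
        }
    }
}
```

To relate such measurements to the Big-O estimates, the timing would typically be repeated over data sets of increasing size n and averaged across several runs.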
