RAPID PSO BASED FEATURE SELECTION FOR CLASSIFICATION


SURENDRA KUMAR
Dr. Harsh Kumar

Abstract

Sentiment analysis is one of the most promising research areas in the field of data analysis. In sentiment analysis, data samples are classified into positive, negative, or neutral classes so that meaningful conclusions can be drawn from the data. Sentiment analysis can be applied to movie reviews, Twitter data, and many other sources. Many classification models can be used for sentiment analysis; some are effective and some are not. Although the effectiveness and efficiency of the analysis depend on many factors, one of the major factors is the feature selection process underlying the classification model. The features are the attributes of the data samples. Instead of classifying the data samples on the complete set of attributes, one can build a model that uses only a subset of the attributes. Our proposed framework integrates a metaheuristic optimization technique, a variation of Particle Swarm Optimization that we have named RAPID-PSO, with multiple classification models to extract a subset of features from the complete feature set. Based on this subset of features, the framework then classifies the samples. The results show that our proposed framework outperforms frameworks that use other classical optimization techniques, such as grid search, gradient descent, classical PSO, multiple-PSO, and IPSO, in terms of effectiveness and efficiency.
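A minimal sketch of the wrapper idea described above, assuming a standard binary-PSO formulation for the feature mask and a scikit-learn classifier as the fitness evaluator. The dataset, classifier, inertia/acceleration coefficients, and all other values are illustrative assumptions; the specific update rules of RAPID-PSO are not reproduced here.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X, y = load_breast_cancer(return_X_y=True)      # placeholder dataset (assumption)
n_particles, n_features, n_iters = 20, X.shape[1], 30
w, c1, c2 = 0.7, 1.5, 1.5                       # inertia and acceleration coefficients (assumed values)

def fitness(mask):
    # Wrapper fitness: 3-fold cross-validated accuracy on the selected feature columns.
    if not mask.any():
        return 0.0
    clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    return cross_val_score(clf, X[:, mask], y, cv=3).mean()

# Particles are binary feature masks stored as 0/1 floats; velocities are continuous.
bits = (rng.random((n_particles, n_features)) > 0.5).astype(float)
vel = np.zeros((n_particles, n_features))
fits = np.array([fitness(b.astype(bool)) for b in bits])
pbest, pbest_fit = bits.copy(), fits.copy()
gbest, gbest_fit = pbest[pbest_fit.argmax()].copy(), pbest_fit.max()

for _ in range(n_iters):
    r1, r2 = rng.random((2, n_particles, n_features))
    vel = w * vel + c1 * r1 * (pbest - bits) + c2 * r2 * (gbest - bits)
    prob = 1.0 / (1.0 + np.exp(-vel))            # sigmoid transfer function
    bits = (rng.random(prob.shape) < prob).astype(float)
    fits = np.array([fitness(b.astype(bool)) for b in bits])
    improved = fits > pbest_fit
    pbest[improved], pbest_fit[improved] = bits[improved], fits[improved]
    if fits.max() > gbest_fit:
        gbest, gbest_fit = bits[fits.argmax()].copy(), fits.max()

print("selected features:", np.flatnonzero(gbest))
print("cross-validated accuracy:", round(float(gbest_fit), 3))
```

Each particle encodes a candidate feature subset; its fitness is the cross-validated accuracy of the classifier restricted to that subset, and the swarm's global-best mask is reported as the selected feature set.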


Author Biography

SURENDRA KUMAR, Dr. K.N. Modi University

Computer Science & Engineering
