Research Article

Construction of Co-occurrence Matrix using Gabor Wavelets for Classification of Arecanuts by Decision Trees

by Suresha M, Ajit Danti
International Journal of Applied Information Systems
Foundation of Computer Science (FCS), NY, USA
Volume 4 - Number 6
Year of Publication: 2012
Authors: Suresha M, Ajit Danti
10.5120/ijais12-450775

Suresha M, Ajit Danti. Construction of Co-occurrence Matrix using Gabor Wavelets for Classification of Arecanuts by Decision Trees. International Journal of Applied Information Systems. 4, 6 (December 2012), 33-39. DOI=10.5120/ijais12-450775

@article{ 10.5120/ijais12-450775,
author = { Suresha M, Ajit Danti },
title = { Construction of Co-occurrence Matrix using Gabor Wavelets for Classification of Arecanuts by Decision Trees },
journal = { International Journal of Applied Information Systems },
issue_date = { December 2012 },
volume = { 4 },
number = { 6 },
month = { December },
year = { 2012 },
issn = { 2249-0868 },
pages = { 33-39 },
numpages = {7},
url = { https://www.ijais.org/archives/volume4/number6/364-0775/ },
doi = { 10.5120/ijais12-450775 },
publisher = {Foundation of Computer Science (FCS), NY, USA},
address = {New York, USA}
}
%0 Journal Article
%A Suresha M
%A Ajit Danti
%T Construction of Co-occurrence Matrix using Gabor Wavelets for Classification of Arecanuts by Decision Trees
%J International Journal of Applied Information Systems
%@ 2249-0868
%V 4
%N 6
%P 33-39
%D 2012
%I Foundation of Computer Science (FCS), NY, USA
Abstract

In this paper, a novel method is developed for the classification of arecanuts based on texture features. A Gabor response co-occurrence matrix (GRCM) is constructed analogous to the gray-level co-occurrence matrix (GLCM). Classification is performed with kNN and Decision Tree (DT) classifiers using GRCM features. Twelve Gabor filters (four orientations and three scales) are designed to capture variations in lighting conditions. The splitting rules used for growing the decision tree are the Gini diversity index (GDI), the twoing rule, and entropy. The proposed approach is evaluated on a large data set using cross-validation, and the decision tree classifier yields a better success rate than the kNN classifier.
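The pipeline the abstract outlines — a bank of twelve Gabor filters (four orientations by three scales) whose quantised responses feed a GLCM-style co-occurrence matrix, plus impurity measures for tree splitting — can be illustrated with a minimal NumPy sketch. The kernel size, the (sigma, wavelength) pairs, the 8-level quantisation, and the single (1, 0) displacement below are illustrative assumptions, not the authors' published parameters.

```python
import numpy as np

def gabor_kernel(ksize, sigma, theta, lam):
    """Real part of a Gabor kernel at orientation theta and wavelength lam."""
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    return np.exp(-(x**2 + y**2) / (2.0 * sigma**2)) * np.cos(2.0 * np.pi * xr / lam)

def convolve2d(img, kern):
    """Naive 'same'-size correlation with zero padding (loop form for clarity)."""
    kh, kw = kern.shape
    padded = np.pad(img, ((kh // 2, kh // 2), (kw // 2, kw // 2)))
    out = np.zeros_like(img, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.sum(padded[i:i + kh, j:j + kw] * kern)
    return out

def cooccurrence(levels_img, n_levels, dx=1, dy=0):
    """GLCM-style co-occurrence matrix for an integer-quantised image."""
    M = np.zeros((n_levels, n_levels), dtype=float)
    h, w = levels_img.shape
    for i in range(max(0, -dy), min(h, h - dy)):
        for j in range(max(0, -dx), min(w, w - dx)):
            M[levels_img[i, j], levels_img[i + dy, j + dx]] += 1
    return M / max(M.sum(), 1)

# Four orientations x three scales -> twelve filters
thetas = [0, np.pi / 4, np.pi / 2, 3 * np.pi / 4]
scales = [(2.0, 4.0), (3.0, 6.0), (4.0, 8.0)]  # assumed (sigma, wavelength) pairs
bank = [gabor_kernel(15, s, t, l) for t in thetas for (s, l) in scales]

rng = np.random.default_rng(0)
img = rng.random((32, 32))        # stand-in for an arecanut image patch
resp = convolve2d(img, bank[0])

# Quantise the Gabor response into 8 levels, then build the GRCM
q = np.clip(((resp - resp.min()) / (np.ptp(resp) + 1e-12) * 8).astype(int), 0, 7)
grcm = cooccurrence(q, 8)         # rows sum to a probability distribution

# Impurity measures used as DT splitting rules (illustrative two-class counts)
counts = np.array([10.0, 5.0])
p = counts / counts.sum()
gdi = 1.0 - np.sum(p**2)          # Gini diversity index
ent = -np.sum(p * np.log2(p))     # entropy
```

In a full system, each of the twelve filter responses would contribute its own co-occurrence matrix, from which Haralick-style statistics (contrast, energy, homogeneity) are extracted as the feature vector fed to the kNN or DT classifier.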

References
  1. Nemati, R. J. and Javed, M. Y. 2008. Fingerprint verification using filter-bank of Gabor and Log Gabor filters. 15th International Conference on Systems, Signals and Image Processing, 363 – 366.
  2. Yu, L., Ma, Y., and Hu, Z. 2011. Gabor Texture Information for Face Recognition Using the Generalized Gaussian Model. Sixth International Conference on Image and Graphics (ICIG), 303 – 308.
  3. Liu, C. and Wechsler, H. 2002. Gabor feature based classification using the enhanced Fisher linear discriminant model for face recognition. IEEE Transactions on Image Processing, 11(4), 467 – 476.
  4. Yang, Y. and Sun, J. 2010. Face Recognition Based on Gabor Feature Extraction and Fractal Coding. Third International Symposium on Electronic Commerce and Security (ISECS), 302 – 306.
  5. Provost, F. and Domingos, P. 2003. Tree induction for probability-based ranking. Machine Learning, 52(3), 199-215.
  6. Geurts, P., Ernst, D., and Wehenkel, L. 2006. Extremely randomized trees. Machine Learning, 63(1), 3-42.
  7. Breiman, L., Friedman, J., Stone, C., and Olshen, R. 1984. Classification and Regression Trees. Wadsworth.
  8. Quinlan, J. R. 1993. C4.5: Programs for Machine Learning. Morgan Kaufmann, San Francisco, CA, USA.
  9. Flach, P. and Matsubara, E. 2007. A simple lexicographic ranker and probability estimator. Proceedings of the 18th European Conference on Machine Learning (ECML 2007), 575-582. Springer.
  10. Bradley, A. P. 1997. The use of the area under the ROC curve in the evaluation of machine learning algorithms. Pattern Recognition, 30(7), 1145-1159.
  11. Provost, F. and Kolluri, V. 1999. A survey of methods for scaling up inductive algorithms. Data Mining and Knowledge Discovery, 3(2), 131-169.
  12. Ferri, C., Flach, P., and Hernández-Orallo, J. 2003. Improving the AUC of probabilistic estimation trees. Proceedings of the 14th European Conference on Machine Learning, 121-132. Springer.
  13. Olaru, C. and Wehenkel, L. 2003. A complete fuzzy decision tree technique. Fuzzy Sets and Systems, 221 – 254.
  14. Berzal, F., Cubero, J.-C., Marín, N., and Sánchez, D. 2004. Building multi-way decision trees with numerical attributes. Information Sciences, 165, 73–90. Elsevier.
  15. Barros, R. C., Basgalupp, M. P., de Carvalho, A. C. P. L. F., and Freitas, A. A. 2012. A Survey of Evolutionary Algorithms for Decision-Tree Induction. IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, 291 – 312.
  16. Moustakidis, S., Mallinis, G., Koutsias, N., Theocharis, J. B., and Petridis, V. 2012. SVM-Based Fuzzy Decision Trees for Classification of High Spatial Resolution Remote Sensing Images. IEEE Transactions on Geoscience and Remote Sensing, 149 – 169.
  17. Sun, J. and Wang, X.-Z. 2005. An initial comparison on noise resisting between crisp and fuzzy decision trees. Proceedings of the International Conference on Machine Learning and Cybernetics, 2545 – 2550.
  18. Setiono, R. and Liu, H. 1999. A connectionist approach to generating oblique decision trees. IEEE Transactions on Systems, Man, and Cybernetics, 29(3), 440 – 444.
  19. Xu, H., Yang, M., and Liang, L. 2010. An improved random decision trees algorithm with application to land cover classification. International Conference on Geoinformatics, 1-4.
  20. Pedrycz, W. and Sosnowski, Z. A. 2005. C-fuzzy decision trees. IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, 498–511.
  21. Crockett, K., Bandar, Z., and McLean, D. 2002. Combining multiple decision trees using fuzzy-neural inference. IEEE International Conference on Fuzzy Systems, 1523 – 1527.
  22. Gonzalez, R. C., Woods, R. E., and Eddins, S. L. 2009. Digital Image Processing Using MATLAB. PPH.
  23. Newsam, S. D. and Kamath, C. 2004. Retrieval using texture features in high resolution multi-spectral satellite imagery. SPIE Conference on Data Mining and Knowledge Discovery.
  24. Haralick, R. M., Shanmugam, K., and Dinstein, I. 1973. Textural Features for Image Classification. IEEE Transactions on Systems, Man, and Cybernetics, 610 – 621.
  25. Apte, C. and Weiss, S. 1997. Data mining with decision trees and decision rules. Future Generation Computer Systems, 13:197–210.
  26. Breiman, L. 1996. Some properties of splitting criteria. Machine Learning, 24:41–47.
  27. De'ath, G. and Fabricius, K. E. 2000. Classification and regression trees: a powerful yet simple technique for ecological data analysis. Ecology, 81(11):3178–3192.
Index Terms

Computer Science
Information Sciences

Keywords

Arecanut classification Decision Trees Gabor response co-occurrence matrix Gabor Wavelets Gray level co-occurrence matrix Texture Features