Research Article

Evaluation of N-gram Text Representations for Automated Essay-Type Grading Systems

by Odunayo Esther Oduntan, Ibrahim Adepoju Adeyanju, Stephen Olatunde Olabiyisi, Elijah Olusayo Omidiora
International Journal of Applied Information Systems
Foundation of Computer Science (FCS), NY, USA
Volume 9 - Number 4
Year of Publication: 2015
DOI: 10.5120/ijais15-451394

Odunayo Esther Oduntan, Ibrahim Adepoju Adeyanju, Stephen Olatunde Olabiyisi, Elijah Olusayo Omidiora. Evaluation of N-gram Text Representations for Automated Essay-Type Grading Systems. International Journal of Applied Information Systems 9, 4 (July 2015), 25-31. DOI=10.5120/ijais15-451394

@article{ 10.5120/ijais15-451394,
author = { Odunayo Esther Oduntan, Ibrahim Adepoju Adeyanju, Stephen Olatunde Olabiyisi, Elijah Olusayo Omidiora },
title = { Evaluation of N-gram Text Representations for Automated Essay-Type Grading Systems },
journal = { International Journal of Applied Information Systems },
issue_date = { July 2015 },
volume = { 9 },
number = { 4 },
month = { July },
year = { 2015 },
issn = { 2249-0868 },
pages = { 25-31 },
numpages = { 7 },
url = { https://www.ijais.org/archives/volume9/number4/769-1394/ },
doi = { 10.5120/ijais15-451394 },
publisher = { Foundation of Computer Science (FCS), NY, USA },
address = { New York, USA }
}
%0 Journal Article
%A Odunayo Esther Oduntan
%A Ibrahim Adepoju Adeyanju
%A Stephen Olatunde Olabiyisi
%A Elijah Olusayo Omidiora
%T Evaluation of N-gram Text Representations for Automated Essay-Type Grading Systems
%J International Journal of Applied Information Systems
%@ 2249-0868
%V 9
%N 4
%P 25-31
%D 2015
%I Foundation of Computer Science (FCS), NY, USA
Abstract

Automated grading systems can reduce the stress and time constraints faced by examiners, especially where large numbers of students are enrolled. Essay-type grading involves comparing the textual content of a student's script with the examiner's marking guide. In this paper, we focus on analyzing the n-gram text representation used in an automated essay-type grading system. Each question answered in a student script or in the marking guide is viewed as a document in the document-term matrix. Three n-gram representation schemes were used to denote a term: unigram (1-gram), bigram (2-gram) and both ("unigram + bigram"). A binary weighting scheme was used for each document vector, with cosine similarity used to compare documents across the student scripts and marking guide. The final student score is computed as a weighted aggregate of the documents' similarity scores, as determined by the marks allocated to each question in the marking guide. Our experiment compared the effectiveness of the three representation schemes using electronically transcribed handwritten student scripts and a marking guide from a first-year computer science course of a Nigerian polytechnic. The machine-generated scores were then compared with those provided by the examiner for the same scripts using mean absolute error and the Pearson correlation coefficient. Experimental results indicate that the "unigram + bigram" representation outperformed the other two, with a mean absolute error of 7.6 as opposed to 15.8 and 10.6 for the unigram and bigram representations respectively. These results are reinforced by the correlation coefficients, with the "unigram + bigram" representation having 0.3 while the unigram and bigram representations had 0.2 and 0.1 respectively. The weak but positive correlation indicates that the examiner might have considered other issues not necessarily documented in the marking guide. We intend to test other datasets and apply techniques for reducing sparseness in our document-term matrices to improve performance.
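The scoring pipeline described in the abstract is straightforward to prototype. The following is a minimal sketch using scikit-learn's CountVectorizer (with binary=True for the binary weighting scheme) and cosine_similarity; the function names grade_script and evaluate, and the per-question input structure, are illustrative assumptions, since the paper does not publish its implementation.

import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def grade_script(student_answers, guide_answers, marks, ngram_range=(1, 2)):
    """Score one script against the marking guide (illustrative sketch).

    ngram_range=(1, 2) gives the "unigram + bigram" scheme; use (1, 1)
    for unigram only or (2, 2) for bigram only.
    """
    # Binary weighting: each document vector records term presence,
    # not term frequency.
    vectorizer = CountVectorizer(ngram_range=ngram_range, binary=True)
    vectorizer.fit(list(student_answers) + list(guide_answers))

    score = 0.0
    for student, guide, mark in zip(student_answers, guide_answers, marks):
        # Each answered question (student script or marking guide) is one
        # document in the document-term matrix.
        vecs = vectorizer.transform([student, guide])
        similarity = cosine_similarity(vecs[0], vecs[1])[0, 0]
        # Final score: weighted aggregate of per-question similarity scores,
        # weighted by the marks allocated in the marking guide.
        score += similarity * mark
    return score

def evaluate(machine_scores, examiner_scores):
    """Compare machine scores with the examiner's, using the paper's metrics."""
    m = np.asarray(machine_scores, dtype=float)
    e = np.asarray(examiner_scores, dtype=float)
    mae = np.abs(m - e).mean()          # mean absolute error
    r = np.corrcoef(m, e)[0, 1]         # Pearson correlation coefficient
    return mae, r

Under these assumptions, running grade_script over all scripts with each of the three ngram_range settings and passing the resulting score lists to evaluate mirrors the comparison reported in the abstract.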

Index Terms

Computer Science
Information Sciences

Keywords

Automated essay grading, n-gram text representation