Automatic Content Analyzer
Keywords:
Content Analyzer, MsNLP, Electronic Essay Rater, Latent Semantic Analysis, LSA, BOW, POS, NLP, Feature Extraction, Word Similarity

Abstract
Essays and short answers are crucial testing tools for assessing academic achievement, integration of ideas, and ability to recall, but they are expensive and time-consuming to grade manually. Manual grading consumes a significant amount of instructors' valuable time. Automated grading, if proven to match or exceed the reliability of human graders, would significantly reduce this cost. Our Content Analyzer System analyzes subjective-type answers and grades them based on features of the written text such as language, grammar, organization, and content. The system automatically scores essays or short answers on these features and presents essay statistics, including word count, sentence count, and paragraph count, along with an overall weighted score computed as the mean of the individual feature scores.
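The statistics and scoring step described above can be sketched in a few lines of Python. This is a minimal illustration under stated assumptions, not the system's actual implementation: the counting heuristics are deliberately simple, and the feature names and weights shown are hypothetical.

```python
import re

def essay_statistics(text):
    """Count words, sentences, and paragraphs using simple heuristics
    (blank-line paragraph breaks, end-punctuation sentence breaks)."""
    paragraphs = [p for p in text.split("\n\n") if p.strip()]
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"\b\w+\b", text)
    return {
        "word_count": len(words),
        "sentence_count": len(sentences),
        "paragraph_count": len(paragraphs),
    }

def overall_score(feature_scores, weights=None):
    """Weighted mean of per-feature scores; equal weights by default.
    Feature names (e.g. 'grammar', 'content') are illustrative."""
    if weights is None:
        weights = {f: 1.0 for f in feature_scores}
    total_weight = sum(weights[f] for f in feature_scores)
    return sum(feature_scores[f] * weights[f] for f in feature_scores) / total_weight
```

With equal weights, `overall_score` reduces to the plain mean of the feature scores, matching the overall weighted score described above.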
License
Copyright (c) IJSRCSEIT

This work is licensed under a Creative Commons Attribution 4.0 International License.