

2012 | 12 | 3 | 18-35

Article title

COMPUTER FAMILIARITY AND TEST PERFORMANCE ON A COMPUTER-BASED CLOZE ESL READING ASSESSMENT

Content

Title variants

Languages of publication

EN

Abstracts

EN
Researchers have raised questions about the connection between learner familiarity with computers and performance on computerized tests virtually since interest first arose in using computers for assessment purposes. Despite this longstanding attention, however, there has been surprisingly little research exploring the connection between computer familiarity and performance on computerized tests that fall outside the traditional multiple-choice, discrete-point tests that have historically predominated in the field of testing and assessment. The current study addresses this gap by examining the relationship between computer familiarity and performance on a computer-based test of second language reading that is integrative rather than discrete-point. The study investigated the online reading ability of ESL students from one secondary school in a large city in western Canada (61 females and 59 males, ages 13-19, M=15.73). The students responded to a questionnaire about their computer familiarity and then completed an online multiple-choice cloze test. Contrary to most other findings, which are based on discrete-point tests, the results revealed that the familiarity variables account for a small but significant amount of the variability in the computer-based test scores.
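
The finding reported above implies a regression analysis in which computer-familiarity measures predict computer-based test scores, with the model's R-squared quantifying the share of score variability the familiarity variables account for. The Python sketch below, using the statsmodels library, illustrates one way such an analysis could be run; the synthetic data, the three familiarity predictors, and all variable names are hypothetical stand-ins, not the study's actual data or model.

    # Illustrative sketch only: synthetic data standing in for the study's
    # questionnaire (computer-familiarity items) and cloze test scores.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 120  # matches the sample size reported in the abstract

    # Hypothetical familiarity predictors (e.g., self-rated skill,
    # weekly hours of use, years of computer experience).
    familiarity = rng.normal(size=(n, 3))

    # Synthetic cloze scores with a small contribution from familiarity,
    # mimicking a "small but significant" effect.
    scores = (20 + familiarity @ np.array([0.8, 0.5, 0.3])
              + rng.normal(scale=4, size=n))

    X = sm.add_constant(familiarity)   # add an intercept term
    model = sm.OLS(scores, X).fit()    # ordinary least squares fit

    # R^2 is the proportion of score variance accounted for by the
    # predictors; the F-test p-value indicates whether it is significant.
    print(f"R^2 = {model.rsquared:.3f}, F-test p = {model.f_pvalue:.4f}")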

Year

2012

Volume

12

Issue

3

Pages

18-35

Physical description

Contributors

  • Georgia State University, Atlanta, Georgia, USA

References

Document Type

Publication order reference

Identifiers

YADDA identifier

bwmeta1.element.desklight-a2950dc9-4caf-4622-b794-b1254c71a7a3