This paper aims to validate measurement tools that assess critical thinking and creativity as general constructs rather than subject-specific skills. Specifically, the research examined whether measurement tools of creativity and critical thinking show convergent and discriminant (or divergent) validity. For this purpose, the multitrait-multimethod matrix proposed by Campbell and Fiske (1959) was used. This matrix presents the correlations between the scores students obtain on different assessments, revealing whether the assessments measure the same or different constructs. Here, the two methods were written and oral exams, and the two traits were critical thinking and creativity. For the validation of the assessments, 30 secondary-school students in Greece and 21 in England completed the assessments. The samples in both countries yielded similar results. The critical thinking tools demonstrated convergent validity when compared with each other and discriminant validity against the creativity assessments. Furthermore, creativity assessments measuring the same aspect of creativity demonstrated convergent validity. In conclusion, this research provides indicators that critical thinking and creativity can be measured in a valid way as general constructs. However, since the sample was small, further validation of the assessment tools with a larger sample is recommended.
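The 2x2 multitrait-multimethod analysis described above can be sketched in code. The following is a minimal illustration only, using synthetic scores: the variable names, sample values, and noise levels are hypothetical and are not the study's data.

```python
import numpy as np

# Hypothetical 2x2 MTMM design: two traits (critical thinking, creativity)
# crossed with two methods (written exam, oral exam), for 30 students.
rng = np.random.default_rng(0)
n = 30

# Simulate a latent ability per trait, plus method-specific noise
# (purely illustrative numbers).
ct = rng.normal(0, 1, n)  # latent critical thinking
cr = rng.normal(0, 1, n)  # latent creativity
scores = {
    "CT_written": ct + rng.normal(0, 0.5, n),
    "CT_oral":    ct + rng.normal(0, 0.5, n),
    "CR_written": cr + rng.normal(0, 0.5, n),
    "CR_oral":    cr + rng.normal(0, 0.5, n),
}

labels = list(scores)
data = np.array([scores[k] for k in labels])
mtmm = np.corrcoef(data)  # 4x4 correlation matrix across the assessments

# Convergent validity: same trait measured by different methods
# (monotrait-heteromethod entries) should correlate highly.
# Discriminant validity: different traits should correlate weakly.
print(labels)
print(np.round(mtmm, 2))
```

In an MTMM table like this, evidence for validity is read off the matrix: the monotrait-heteromethod correlations (e.g. `CT_written` with `CT_oral`) should exceed the heterotrait correlations in the same rows and columns.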
International Journal of Assessment Tools in Education
BERA (2011). Ethical guidelines for educational research. British Educational Research Association. Available at: https://www.bera.ac.uk/researchers-resources/publications/ethical-guidelines-for-educational-research-2011 (Accessed: 5 August 2017).
Berliner, D.C. (2011). The Context for Interpreting PISA Results in the USA: Negativism, Chauvinism, Misunderstanding, and the Potential to Distort the Educational Systems of Nations. In Pereyra, M.A., Kotthoff, H. & Cowen, R. (eds.) PISA Under Examination: Changing Knowledge, Changing Tests, and Changing Schools (pp. 77-96). Rotterdam: Sense Publishers.
Burton, S.J., Sudweeks, R.R., Merrill, P.G. & Wood, B. (1991). How to Prepare Better Multiple-Choice Test Items: Guidelines for University Faculty. Brigham Young University Testing Services and The Department of Instructional Science.
Campbell, D.T. & Fiske, D.W. (1959). Convergent and Discriminant Validation by the Multitrait-Multimethod Matrix. Psychological Bulletin, 56(2), 81-105.
Coe, R. (2012). Conducting Your Research: Inference and Interpretation. In Arthur, J., Waring, M., Coe, R. & Hedges, L.V. (eds.) Education Research: Methods and Methodologies (pp. 41-52). London: Sage.
Cox, R.C. & Vargas, J.S. (1966). A comparison of Item Selection Techniques for Norm-Referenced and Criterion-Referenced Tests. University of Pittsburgh.
Foundation for Critical Thinking (2013). Defining Critical Thinking. Available at: http://www.criticalthinking.org/pages/defining-critical-thinking/766 (Accessed: 28 January 2015).
Department for Education (n.d.). National Curriculum in England: Framework for key stages 1 to 4. Available at Gov.UK website: https://www.gov.uk/government/publications/national-curriculum-in-england-framework-for-key-stages-1-to-4/the-national-curriculum-in-england-framework-for-key-stages-1-to-4#the-school-curriculum-in-england (Accessed: 6 August 2017).
El-Murad, J. & West, D. C. (2004). The Definition and Measurement of Creativity: What do we know?. Journal of Advertising Research, 44(2), 188-201. doi: 10.1017/S0021849904040097
Ennis, R.H. (1993). Critical thinking assessment, Theory Into Practice, 32(3), 179-186. doi: 10.1080/00405849309543594
Ennis, R.H., Gardiner, W., Guzzetta, J., Morrow, R., Paulus, D. & Ringel, L. (1964). Cornell Critical Thinking Test Series. The Cornell Critical Reasoning Test. Form X. University of Illinois.
Ennis, R.H. & Weir, E. (1985). The Ennis-Weir Critical Thinking Test: Test, Manual, Criteria, Scoring Sheet. An instrument for teaching and testing. Pacific Grove: Midwest Publications
Facione, P. A. (1990). Critical Thinking: A statement of expert consensus for Purposes of Educational Assessment and Instruction (The Delphi Report). California State University.
Facione, P.A. (2015). Critical Thinking: What it is and why it counts. Revised. Insight Assessment.
Foddy, W. (1993). Constructing Questions for Interviews and Questionnaires. Cambridge: Cambridge University Press
Gelerstein, D., del Río, R., Nussbaum, M., Chiuminatto, P., & López, X. (2016). Designing and implementing a test for measuring critical thinking in primary school. Thinking Skills and Creativity, 20, 40-49.
Getzels, J.W. & Jackson, P. W. (1962). Creativity and Intelligence: Explorations with Gifted Students. London and New York: John Wiley and Sons Inc.
Guilford, J.P. (1967). The Nature of Human Intelligence. New York: McGraw-Hill Book Company.
Haladyna, T. M. (1994). Developing and validating multiple-choice test items. UK: Lawrence Erlbaum Associates.
Hewitt, M.A. & Homan, S.P. (2003). Readability level of standardized test items and student performance: The forgotten validity variable, Reading Research and Instruction, 43(2), 1-16.
Hocevar, D. (1979). Measurement of Creativity: Review and Critique. Annual Meeting of the Rocky Mountain Psychological Association, Colorado, 12-14th April
Iozzi, L.A. & Cheu, J. (1978). Preparing for Tomorrow's World: An Alternative Curriculum Model for the Secondary Schools. Paper presented at the First Annual Conference of the Education Section of the World Future Society, Texas, 22 October.
Jiang, H. & Zhang, Q. (2014). Development and Validation of Team Creativity Measures: A Complex System Perspective. Creativity and Innovation Management, 23(3), 264-275.
Johanson, G.A. & Brooks, G. P. (2010). Initial Scale Development: Sample Size for Pilot Studies. Educational and Psychological Measurement, 70(3), 394-400.
Johnson, S. & Johnson, R. (2009). Conceptualising and interpreting reliability. UK: Ofqual.
Kampylis, P. G., & Valtanen, J. (2010). Redefining creativity-analyzing definitions, collocations, and consequences. The Journal of Creative Behavior, 44(3), 191-214.
Kane, M.T. (2009). Validating the Interpretations and Uses of Test Scores. In Lissitz, R.W. (ed.) The Concept of Validity: Revisions, New Directions, and Applications (pp. 39-64). United States: Information Age Publishing Inc.
Kaufman, J.C. (2006). Self-Reported Differences in Creativity by Ethnicity and Gender. Applied Cognitive Psychology, 20, 1065-1082.
Koretz, D. (2006). Measuring Up: What educational testing really tells us. Cambridge, MA: Harvard University Press.
Krathwohl, D.R. (2002). A Revision of Bloom’s Taxonomy: an overview. Theory into Practice, 41(4), 212-218.
Lambert, D. & Lines, D. (2000). Understanding assessment: purposes, perceptions, practice. London: Routledge Falmer.
Lipman, M. (1987). Critical thinking: What can it be? Analytic Teaching, 8(1), 5-12.
Lipman, M. (2003). Thinking in Education. 2nd edn. Cambridge: Cambridge University Press
McPeck, J.E. (1981). Critical Thinking and Education. Oxford: Martin Robertson
McPeck, J. E. (1990). Critical Thinking and Subject Specificity: A Reply to Ennis. Educational Researcher, 19(4), 10-12.
Mednick, S.A. (1962). The associative basis of the creative process. Psychological Review, 69(3), 220-232.
Messick, S. (1995). Validity of psychological assessment: Validation of inferences from persons' responses and performances as scientific inquiry into score meaning. American psychologist, 50(9), 741-749.
Moss, P.A. (1994). Can there be Validity without Reliability? Educational Researcher, 23(2), 5-12.
Newton, P.E. (2012). Clarifying the Consensus Definition of Validity. Measurement: Interdisciplinary Research & Perspective, 10(1-2), 1-29.
Newton, D.P. (2014). Thinking with Feeling: Fostering productive thought in the classroom. New York: Routledge
Nisbett, R. E. & Wilson, T. D. (1977). The Halo Effect: Evidence for Unconscious Alteration of Judgements. Journal of Personality and Social Psychology, 35(4), 250-256.
Norris, S.P. & King, R. (1984). The design of a Critical Thinking Test on Appraising Observations. Studies in Critical Thinking. Research Report No. 1. Canada: Institute for Educational Research and Development.
Nusbaum, E. C., Silvia, P. J., & Beaty, R. E. (2017). Ha ha? Assessing individual differences in humor production ability. Psychology of Aesthetics, Creativity, and the Arts, 11(2), 231-241.
Plucker, J.A. & Makel, M.C. (2010). Assessment of creativity. In Kaufman, J.C. & Sternberg, R.J. (eds.) The Cambridge handbook of creativity (pp. 48-73). Cambridge: Cambridge University Press.
Propp, V. (1968). Morphology of the Folk Tale. Translation by Laurence Scott. The American Folklore Society and Indiana University
Richards, J.C. (2005). Communicative Language Teaching Today. SEAMEO Regional Language Center
Rodari, G. (1996). The Grammar of Fantasy: An Introduction to the Art of Inventing Stories. Translation and introduction by Jack Zippes. New York: Teachers & Writers Collaborative
Runco, M.A. & Jaeger, G.J. (2012). The Standard Definition of Creativity. Creativity Research Journal, 24(1), 92-96. doi: 10.1080/10400419.2012.650092
Silvia, P.J. (2015). Intelligence and Creativity are Pretty Similar After All. Educational Psychology Review, 27(4), 599-606. doi: 10.1007/s10648-015-9299-1
Sireci, S.G. (2009). Packing and Unpacking Sources of Validity Evidence. In Lissitz, R.W. (ed.) The Concept of Validity: Revisions, New Directions, and Applications (pp. 19-37). United States: Information Age Publishing Inc.
Su, C.T. & Parham, L.D. (2002). Case Report- Generating a valid questionnaire translation for cross-cultural use. American Journal of Occupational Therapy, 56, 581-585.
Tiruneh, D. T., De Cock, M., Weldeslassie, A. G., Elen, J., & Janssen, R. (2017). Measuring critical thinking in physics: Development and validation of a critical thinking test in electricity and magnetism. International Journal of Science and Mathematics Education, 15, 663-682.
Torrance, E. P., Ball, O. E. & Safter H.T. (2008). Torrance Tests of Creative Thinking: Streamlined Scoring Guide for Figural Forms A and B. Bensenville: Scholastic Testing Service Inc.
Treffinger, D.J., Young, G.C., Selby, E.C. & Shepardson, C. (2002). Assessing Creativity: A guide for educators. Florida: Center for Creative Learning
Weisberg, R. W. (2015). On the usefulness of “value” in the definition of creativity. Creativity Research Journal, 27(2), 111-124.
Yoon, C. H. (2017). A validation study of the Torrance Tests of Creative Thinking with a sample of Korean elementary school students. Thinking Skills and Creativity, 26, 38-50.