- Abosalem, Y. (2016). Assessment techniques and students’ higher-order thinking skills. International Journal of Secondary Education, 4(1), 1-11.
- Abouelkheir, H. M. (2018). The criteria and analysis of multiple-choice questions in undergraduate dental examinations. Journal of Dental Research and Review, 5, 59-64.
- Alfaki, I. M. (2014). Sudan English language syllabus: Evaluating reading comprehension questions using Bloom’s taxonomy. International Journal of English Language Teaching, 2(3), 53-74.
- Almuhaidib, N. (2010). Types of item-writing flaws in multiple choice question pattern: A comparative study. Umm Al Qura Univ J Educ Psychol Sci, 2, 10-45.
- Amedahe, F. K. (1989). Testing practices in secondary schools in the Central Region of Ghana. Unpublished master’s thesis, University of Cape Coast, Cape Coast.
- Assaly, I. R., & Smadi, O. M. (2015). Using Bloom’s taxonomy to evaluate the cognitive levels of Master Class textbook’s questions. English Language Teaching, 8(5), 100-110.
- Baig, M., Ali, S. K., Ali, S., & Huda, N. (2014). Evaluation of multiple choice and short essay question items in basic medical sciences. Pakistan Journal of Medical Sciences, 30, 3-6.
- Bloom, B. S. (1956). Taxonomy of educational objectives: Handbook 1, Cognitive domain. London: Longmans.
- Boyd, B. (2008). Effects of state tests on classroom test items in mathematics. School Science and Mathematics, 108(6), 251-261.
- Clay, B., & Root, E. (2001). Is this a trick question? A short guide to writing effective test questions. Kansas: Kansas Curriculum Center.
- Cobbinah, A. (2016). Items’ sequencing on difficulty level and students’ achievement in mathematics test in Central Region of Ghana. African Journal of Interdisciplinary Studies, 9, 55-62.
- Cobbinah, A., Daramola, D. S., Owolabi, H. O., & Olutola, A. T. (2017). Analysis of levels of thinking required in West African senior secondary school certificate in core mathematics multiple choice items. African Journal of Interdisciplinary Studies, 10, 1-7.
- Costello, E., Holland, J. C., & Kirwan, C. (2018). Evaluation of MCQs from MOOCs for common item writing flaws. BMC Research Notes, 11(849), 1-3.
- Davidson, R. A., & Baldwin, B. A. (2005). Cognitive skills objectives in intermediate accounting textbooks: Evidence from end-of-chapter material. Journal of Accounting Education, 23(2), 79-95.
- DiSantis, D. J., Ayoob, A. R., & Williams, L. E. (2015). Prevalence of flawed multiple-choice questions in continuing medical education activities of major radiology journals. American Journal of Roentgenology, 204, 698-702.
- Downing, S. M. (2002). Construct-irrelevant variance and flawed test questions: Do multiple-choice item-writing principles make any difference? Academic Medicine, 77(10), S103-S104.
- Downing, S. M. (2005). The effects of violating standard item writing principles on tests and students: The consequences of using flawed test items on achievement examinations in medical education. Advances in Health Sciences Education, 10(2), 133-143.
- Ebadi, S., & Mozafari, V. (2016). Exploring Bloom’s revised taxonomy of educational objectives in TPSOL textbooks. Journal of Teaching Persian to Speakers of Other Languages, 5(1), 65-93.
- Etsey, Y. K. (2012). Assessment in education. Cape Coast: University of Cape Coast Press.
- Haladyna, T. M., & Rodriguez, M. C. (2013). Developing and validating test items. New York, NY: Routledge.
- Hopper, C. (2009). Practicing college learning strategies (5th ed.). New York, NY: Houghton Mifflin.
- Ijeoma, J. A., Eme, U. J., & Nsisong, A. U. (2013). Content validity of May/June West African senior school certificate examination (WASSCE) questions in chemistry. Journal of Education and Practice, 4(7), 15-21.
- Kasim, Z. U., & Zulfikar, T. (2017). Analysis of instructional questions in an English textbook for senior high schools. English Education Journal (EEJ), 8(4), 536-552.
- Kenneth, D. R., & Mari-Wells, H. (2017). The prevalence of item construction flaws in medical school examinations and innovative recommendations for improvement. European Medical Journal, 1(1), 61-66.
- Köksal, D., & Ulum, O. G. (2018). Language assessment through Bloom’s taxonomy. Journal of Language and Linguistic Studies, 14(2), 76-88.
- Masters, J. C., Hulsmeyer, B. S., Pike, M. E., Leichty, K., Miller, M. T., & Verst, A. L. (2001). Assessment of multiple-choice questions in selected test banks accompanying textbooks used in nursing education. Journal of Nursing Education, 40(1), 25-32.
- Ministry of Education (2010). Teaching syllabus for business management (senior high school 1-3). Accra: Curriculum Research and Development Division (CRDD), Ghana Education Service.
- Mizbani, M., & Chalak, A. (2017). Analyzing listening and speaking activities of Iranian EFL textbook Prospect 3 through Bloom’s revised taxonomy. Advances in Language and Literary Studies (ALLS), 8(3), 38-4.
- Nedeau-Cayo, R., Laughlin, D., Rus, L., & Hall, J. (2013). Assessment of item-writing flaws in multiple-choice questions. Journal for Nurses in Professional Development, 29(2), 52-57.
- Okanlawon, A. E., & Adeoti, Y. F. (2014). Content analysis of West African senior school certificate chemistry examination questions according to cognitive complexity. IFE PsychologIA, 22(2), 14-26.
- Omer, A. A., Abdulrahim, M. E., & Albalawi, I. A. (2016). Flawed multiple-choice questions put on the scale: What is their impact on students’ achievement in a final undergraduate surgical examination? Journal of Health Specialties, 4, 270-275.
- Orey, M. (2010). Emerging perspectives on learning, teaching, and technology. Zurich, Switzerland: Jacobs Foundation.
- Pais, J., Silva, A., Guimaraes, B., Povo, A., Coelho, E., et al. (2016). Do item-writing flaws reduce examinations’ psychometric quality? BMC Research Notes, 9, 2-7.
- Palmer, E. J., & Devitt, P. G. (2007). Assessment of higher order cognitive skills in undergraduate education: Modified essay or multiple-choice questions? BMC Medical Education, 7(49), 1-7.
- Quansah, F., & Amoako, I. (2018). Attitude of senior high school teachers toward test construction: Developing and validating a standardised instrument. Research on Humanities and Social Sciences, 8(1), 25-30.
- Quansah, F., Amoako, I., & Ankomah, F. (2019). Teachers’ test construction skills in senior high schools in Ghana: Document analysis. International Journal of Assessment Tools in Education, 6(1), 1-8.
- Rahpeyma, A., & Khoshnood, A. (2015). The analysis of learning objectives in Iranian junior high school English text books based on Bloom’s revised taxonomy. International Journal of Education & Literacy Studies (IJELS), 3(2), 44-55.
- Rawadieh, S. (1998). An analysis of the cognitive levels of questions in Jordanian secondary social studies textbooks according to Bloom’s taxonomy. Unpublished doctoral dissertation, Ohio University.
- Rezaee, M., & Golshan, M. (2016). Investigating the cognitive levels of English final exams based on Bloom’s taxonomy. International Journal of Educational Investigations, 3(4), 57-68.
- Roohani, A. (2015). Analyzing cognitive processes and multiple intelligences in the top-notch textbooks. English Language Teaching, 2(3), 39-65.
- Rush, B. R., Rankin, D. C., & White, B. J. (2016). The impact of item-writing flaws and item complexity on examination item difficulty and discrimination value. BMC Medical Education, 16(250), 3-10.
- Sadeghi, B., & Mahdipour, N. (2015). Evaluating ILI advanced series through Bloom’s revised taxonomy. Science Journal (CSJ), 36(3), 2247-2260.
- Soleimani, H., & Kheiri, S. (2016). An evaluation of TEFL postgraduates’ testing classroom activities and assignments based on Bloom’s revised taxonomy. Theory and Practice in Language Studies, 6(4), -869.
- Solihati, N., & Hikmat, A. (2018). Critical thinking tasks manifested in Indonesian language textbooks for senior secondary students. SAGE Open, 7(9), 1-8.
- Sood, R. S., Bendre, M. B., & Sood, A. (2016). Analysis and remedy of the item writing flaws rectified at pre-validation of multiple choice questions drafted for assessment of MBBS students. Indian Journal of Basic and Applied Medical Research, 5(2), 692-698.
- Taghipoor, H. (2015). Determining the emphasis on Bloom’s cognitive domain in the contents of science textbook for the sixth grade. SAUSSUREA, 3(3), 162-175.
- Tangsakul, P., Kijpoonphol, W., Linh, N. D., & Kimura, L. N. (2017). Using Bloom’s revised taxonomy to analyse reading comprehension questions in Team Up in English 1-3 and grade 9 English O-Net tests. International Journal of Research – GRANTHAALAYAH, 5(7), 31-41.
- Tariq, S., Tariq, S., Maqsood, S., Jawed, S., & Baig, M. (2017). Evaluation of cognitive levels and item writing flaws in medical pharmacology internal assessment examinations. Pakistan Journal of Medical Sciences, 33(4), 866-870.
- Tarman, B., & Kuran, B. (2015). Examination of the cognitive level of questions in social studies textbooks and the views of teachers based on Bloom taxonomy. Educational Sciences: Theory & Practice, 15(1), 213-222.
- Tarrant, M., & Ware, J. (2008). Impact of item-writing flaws in multiple-choice questions on student achievement in high-stakes nursing assessments. Medical Education, 42(2), 198-206.
- Tarrant, M., Knierim, A., Hayes, S. K., & Ware, J. (2006). The frequency of item writing flaws in multiple-choice questions used in high stakes nursing assessments. Nurse Education Today, 26(8), 662-671.
- Tarrant, M., Ware, J., & Mohammed, A. M. (2009). An assessment of functioning and non-functioning distractors in multiple-choice questions: A descriptive analysis. BMC Medical Education, 9(40), 1-8.
- Tikkanen, G., & Aksela, M. (2012). Analysis of Finnish chemistry matriculation examinations questions according to cognitive complexity. Nordic Studies in Science Education, 8(3), 258-268.
- Ulum, Ö. G. (2016). A descriptive content analysis of the extent of Bloom’s taxonomy in the reading comprehension questions of the course book Q: Skills for success 4 reading and writing. The Qualitative Report, 21(9), 1674-1683.
- Upahi, J. E., & Jimoh, M. (2016). Classification of end-of-chapter questions in senior high school chemistry textbooks used in Nigeria. European Journal of Science and Mathematics Education, 4(1), 90-102.
- Upahi, J. E., Israel, D. O., & Olorundare, A. S. (2016). Analysis of the West African senior school certificate examination (WASSCE) chemistry questions according to Bloom’s revised taxonomy. Eurasian Journal of Physics & Chemistry Education, 8(2), 59-70.
- Upahi, J. E., Issa, G. B., & Oyelekan, O. S. (2015). Analysis of senior school certificate examination chemistry questions for higher-order cognitive skills. Cypriot Journal of Educational Sciences, 10(3), 218-227.
- Zamani, G., & Rezvani, R. (2015). HOTS in Iran’s official textbooks: Implications for material design and student learning. Journal of Applied Linguistics and Language Research, 2(5), 138-151.
- Zareian, G., Davoudi, M., Heshmatifar, Z., & Rahimi, J. (2015). An evaluation of questions in two ESP coursebooks based on Bloom’s new taxonomy of cognitive learning domain. International Journal of Education and Research, 3(8), 313-326.