Slomp, David
Browsing Slomp, David by Title
- Afterword: meeting the challenges of Workplace English Communication in the 21st century (The WAC Clearinghouse, 2021). Slomp, David H.; Oliveri, Maria E.; Elliot, Norbert. [Abstract not available]
- Articulating a sociocognitive construct of writing expertise for the digital age (The WAC Clearinghouse, 2021). Corrigan, Julie A.; Slomp, David H.
  Background: In this article, we articulate an updated construct describing domains of expertise in writing, one that meets the contemporary needs of those who research, teach, and assess writing, particularly in a digital age. This article appears in a collection published as a special issue of The Journal of Writing Analytics that explores both the challenges and the opportunities involved in integrating digitally delivered formative assessments into classroom instruction, illustrated by the example of Workplace English Communication (WEC). Each article in this special issue addresses different aspects of the challenges involved in developing assessments of complex tasks. The three framework articles that lead this special issue all highlight the importance of robust construct models as a foundation for assessment design. In this article, we present an integrated, sociocognitive-oriented construct model for expertise in writing that informs the assessment design work discussed in this special issue.
  Research Questions: With the overarching purpose of developing a contemporary, integrated construct, we conducted a critical review of journal articles focused on expertise in writing ability, exploring the following research questions:
  - RQ1: What knowledge domains necessary for writing expertise are described in research articles from 1971 to 2020?
  - RQ2: How do these domains coalesce to describe a construct of writing expertise for the digital age?
  - RQ3: How can this broad construct be extrapolated to an idiographic model that describes the expertise required for writing in workplace contexts assessed by the WEC modules?
  Methodology: We conducted a critical review of writing scholarship from the past 50 years. The purpose of a critical review is to synthesize the significant scholarship in the field in order to advance theory. We chose 1971 as our starting date because it was the year in which Emig published her seminal study examining writing processes, as opposed to products. Our search parameters required that articles address writing constructs or theories in their title or abstract, be peer reviewed, be written in English, and be written between 1971 and the present (spring 2020). We consulted the ERIC (Education Resources Information Center), Academic Search Complete, and ProQuest databases. We then conducted a second round of searching via a hand search of the top five ranked journals in writing research. From our initial screening, we eliminated articles that were duplicates or that proved irrelevant during a close read of the texts. Articles were also eliminated if they were not explicitly focused on construct/theory development, if they made little contribution to construct development, or if they contributed nothing new to construct development due to saturation. Once we arrived at our final list of texts, we read and coded them in NVivo over two rounds: provisional coding and pattern coding.
  Results: Our critical review of 109 texts revealed that the following writing knowledge domains have predominated in the literature: metacognitive, critical discourse, discourse, rhetorical aim, genre, communication task process, and substantive knowledge. We bring these domains together to form a sociocognitive construct of writing expertise, which describes the knowledge domains necessary to develop expertise in the digital age.
  Discussion and Conclusion: After discussing the knowledge domains revealed by our critical review of the literature, we describe how we take our construct from the nomothetic level and apply it at the idiographic level in the context of the WEC modules that are the focus of this special issue. We conclude by elucidating the implications this construct has for writing curriculum, instruction, and assessment.
- Disrupting white supremacy in assessment: toward a justice-oriented, antiracist validity framework (Taylor & Francis, 2022). Randall, Jennifer; Slomp, David; Poe, Mya; Oliveri, Maria E.
  In this article, we propose a justice-oriented, antiracist validity framework designed to disrupt assessment practices that continue to (re)produce racism through the uncritical promotion of white supremacist hegemonic practices. Using anti-Blackness as illustration, we highlight the ways in which racism is introduced, or ignored, in current assessment and validation processes and how an antiracist approach can be enacted. To begin our description of the framework, we outline the foundational theories and practices (e.g., critical race theory and antiracist assessment) and justice-based framings that serve as the base for our framework. We then focus on Kane's interpretive use argument and Mislevy's sociocognitive approach and suggest extending them to include an antiracist perspective. To this end, we propose a set of heuristics organized around a validity argument that holds justice-oriented, antiracist theories and practices at its core.
- Ethical considerations and writing assessment (University of California, Davis, 2016). Slomp, David H.
  In this introductory article, I set the stage for the arguments that follow in each of the contributions to this special issue. First, I critically examine the three pillars of the current Standards--fairness, validity, and reliability--exploring briefly how, on its own, each concept is insufficient for guiding ethical practice. Then I briefly examine the Standards themselves, highlighting their limitations in guiding ethical practice. Finally, I provide a brief introduction to the various dimensions of the theory of ethics we are developing in this special issue.
- Forum: issues and reflections on ethics and writing assessment (University of California, Davis, 2016). Elliot, Norbert; Slomp, David H.; Poe, Mya; Cogan, John A.; Broad, Bob; Cushman, Ellen
  We hope this special issue adds to the body of knowledge created by the writing studies community with respect to the opportunities that arise when assessment is understood in terms of creating opportunity structure. This hope is accompanied by a reminder of our strength: each year we encounter approximately 48.9 million students in public elementary and secondary schools and 20.6 million students in postsecondary institutions (Snyder & Dillow, 2015). Our influence is remarkable as we touch the lives of many, one student at a time.
- A framework for using consequential validity evidence in evaluating large-scale writing assessments: a Canadian study (National Council of Teachers of English, 2014). Slomp, David H.; Corrigan, Julie A.; Sugimoto, Tamiko
  The increasing diversity of students in contemporary classrooms and the concomitant increase in large-scale testing programs highlight the importance of developing writing assessment programs that are sensitive to the challenges of assessing diverse populations. To this end, this paper provides a framework for conducting consequential validity research on large-scale writing assessment programs. It illustrates this validity model through a series of instrumental case studies drawing on the research literature on writing assessment programs in Canada. We derived the cases from a systematic review of the literature published between January 2000 and December 2012 that directly examined the consequences of large-scale writing assessment on writing instruction in Canadian schools. We also conducted a systematic review of the publicly available documentation published on Canadian provincial and territorial government websites that discussed the purposes and uses of their large-scale writing assessment programs. We argue that this model of conducting consequential validity research provides researchers, test developers, and test users with a clearer, more systematic approach to examining the effects of assessment on diverse populations of students. We also argue that this model will enable the development of stronger, more integrated validity arguments.
- An integrated design and appraisal framework for ethical writing assessment (University of California, Davis, 2016). Slomp, David H.
  In my introduction to this special issue, I highlighted the insufficiency of key measurement concepts--fairness, validity, and reliability--in guiding the design and implementation of writing assessments. I proposed that the concept of ethics provides a more complete framework for guiding assessment design and use. This article advances the philosophical foundation for our theory of ethics articulated by Elliot (this issue). Starting with fairness as first principle, it examines how safety and risk can be addressed through the application of an integrated design and appraisal framework (IDAF) for writing assessment tools. The paper is structured around two case studies set in Alberta, Canada. Case Study 1 applies Kane's (2013) IUA model of validation to an appraisal--Alberta's English 30-1 (grade 12 academic English) diploma exam program--highlighting in the process the limitations in contemporary validity theory. Case Study 2 examines an assessment design project I am currently undertaking in partnership with eight English language arts teachers in southern Alberta. This case study examines how the IDAF supports ethical assessment design and appraisal.
- Introduction: meeting the challenges of Workplace English Communication in the 21st century (The WAC Clearinghouse, 2021). Oliveri, Maria E.; Slomp, David H.; Elliot, Norbert; Rupp, André A.; Mislevy, Robert J.; Vezzu, Meg; Tackitt, Alaina; Nastal, Jessica; Phelps, Johanna; Osborn, Matthew. [Abstract not available]
- Justice-oriented, antiracist validation: continuing to disrupt white supremacy in assessment practices (Taylor & Francis, 2023). Randall, Jennifer; Poe, Mya; Oliveri, Maria Elena; Slomp, David
  Traditional validation approaches fail to account for the ways oppressive systems (e.g., racism, radical nationalism) impact the test design and development process. To disrupt this legacy of white supremacy, we illustrate how a justice-oriented, antiracist validation (JAV) framework can be applied to the construct articulation and validation, data analysis, and score reporting/interpretation phases of assessment design and development. In this article, we use the JAV framework to describe validation processes that acknowledge the role and impact of race and racism in our assessment processes--specifically construct articulation, analysis, and score reporting--on Black, Brown, Indigenous, and other students from historically marginalized populations. Through a JAV framework, we seek to disrupt inaccurate white supremacist approaches and interpretations that for too long have fuelled measurement practices.
- Our validity looks like justice. Does yours? (Sage, 2023). Randall, Jennifer; Poe, Mya; Slomp, David; Oliveri, Maria E.
  Educational assessments, from kindergarten to 12th grade (K-12) to licensure, have a long, well-documented history of oppression and marginalization. In this paper, we (the authors) ask the field of educational assessment/measurement to actively disrupt the White supremacist and racist logics that fuel this marginalization and to re-orient itself toward assessment justice. We describe how a justice-oriented, antiracist validity (JAV) approach to validation processes can support assessment justice efforts, specifically with respect to language assessment. Relying on antiracist principles and critical quantitative methodologies, a JAV approach proposes a set of critical questions to consider when gathering validity evidence, with potential utility for language testers.
- Principled development of Workplace English Communication part 1: a sociocognitive framework (The WAC Clearinghouse, 2021). Oliveri, Maria E.; Mislevy, Robert J.; Slomp, David H.
  Background: This study advances a sociocognitive approach to modeling complex communication tasks. Using an integrative perspective of linguistic, cultural, and substantive (LCS) patterns, we provide a framework for understanding the nature and acquisition of people's adaptive capabilities in social/cognitive complex adaptive systems. We also illustrate the application of the framework to learning and assessment. As we will show, understanding the connection between measurement models and users' needs is important to increase assessments' educative usefulness.
  Questions Addressed: Our framework is designed to address questions regarding the following four areas: the nature of sociocognitive perspectives in educational measurement, the application of LCS patterns to complex communication tasks captured in an extended formative assessment of Workplace English Communication (WEC), the usefulness of psychometric models for instruction and assessment with such complex tasks, and considerations for measurement modeling.
  Conclusions: Our study concludes with reflections on the challenges of complex assessments such as WEC, the advantages of sociocognitive modeling for new assessment genres, and the roles of situated measurement models in meeting the challenges.
- Principled development of Workplace English Communication part 2: Expanded Evidence-Centered Design and Theory of Action frameworks (The WAC Clearinghouse, 2021). Oliveri, Maria E.; Slomp, David H.; Rupp, André A.; Mislevy, Robert J.
  Background: In today's rapidly evolving world, technological pressures coupled with changes in the nature of work increasingly require individuals to use advanced technologies to communicate and collaborate in geographically distributed multidisciplinary teams. These shifts present the need to teach and assess an expanded set of knowledge, skills, and attitudes, including how to communicate at work in collaborative environments using diverse forms of technology. They also present the opportunity to create novel instructional materials and forms of assessment that extend traditionally used summative assessments toward assessments for learning and instruction. This design process can be facilitated by conceptual frameworks that guide assessment design and development. Such frameworks are important for supporting the more expansive and complex design goals emerging in the design of assessments of 21st century skills such as Workplace English Communication (WEC). In this article, we reflect on an evolving WEC construct needed for today's economy and discuss implications for expanding how we teach and assess it using formative assessments for learning. We then discuss the features of the expanded Evidence-Centered Design for learning and assessment systems (e-ECD) and Theory of Action (ToA) frameworks and illustrate their integrative application to inform the design and development of WEC training modules (or resources). We conclude with suggestions for next steps in this line of research.
  Questions Addressed: In reference to the e-ECD and ToA frameworks, our article addresses questions in two areas. We illustrate the benefits of using the ToA to explicitly identify the components of an assessment, its action mechanisms, stakeholders' needs, score-based decisions and their impact, and the services designed for test takers and users. We also illustrate the benefits of using the e-ECD framework to guide design efforts in principled ways, enabling consideration of the key elements that relate evidentiary elements relevant to the construct, aspects of learning and assessment, and measurement models. Consideration of these frameworks is important for designing assessments and for making sense of the evidence so that students' results can be interpreted meaningfully.
  Conclusions: This article illustrates the application of conceptual frameworks (e.g., the e-ECD and ToA) that can be used to inform the design and development of similar modules for complex tasks involving 21st century skills. It contributes to the literature on WEC, and on complex assessments of hard-to-assess constructs more generally, by offering a way of thinking about designing, assessing, and then evaluating the design and assessment of interactive educational modules for teaching complex communication knowledge and abilities, while remaining attentive to (negative) consequences associated with the stakeholders designing, developing, and using the assessments.
- Principled development of Workplace English Communication part 3: an Integrated Design and Appraisal Framework (The WAC Clearinghouse, 2021). Oliveri, Maria E.; Slomp, David H.; Rupp, André A.; Mislevy, Robert J.
  Background: An expanded skillset is needed to meet today's shifting workplace demands, which involve collaboration with geographically distributed multidisciplinary teams. As the nature of work changes due to increases in automation and the elevated need to work in multidisciplinary teams, enhanced visions of Workplace English Communication (WEC) are needed to communicate with diverse audiences and to use new technologies effectively. Indeed, WEC is ranked as one of the top five skills needed for employability. Even so, employer surveys report that incoming employees lack communication competency (National Association of Colleges and Employers [NACE], 2018). To address this issue, with a focus on WEC teaching and assessment, we describe a framework used to guide the design of WEC modules. We suggest that conceptual frameworks can be used to inform the module design process, and in this article we illustrate one such conceptual framework: the Integrated Design and Appraisal Framework (IDAF). The IDAF holds the consequences of testing as one of its central elements for guiding test design and development. It emphasizes categorically identifying and ecologically modeling the variables that impact WEC in general and the writing context in particular. It also emphasizes the need to develop clearly articulated construct models to underpin the assessment and to incorporate a foundational focus on fairness and social consequences into the design process and use of assessments.
  Questions Addressed: In reference to the IDAF, our article addresses questions in the following three areas: the nature and benefits of an integrated design and appraisal approach to test design, development, and evaluation; the application of the IDAF to complex communication tasks captured in formative, scenario-based WEC assessment modules; and the paramount importance of considering fairness and social consequences in the design and use of assessments administered to diverse populations. The article thus elaborates on the use of the IDAF to inform the design of WEC modules by explicitly articulating the needs of the test takers, the anticipated uses of the modules, and the contexts in which the modules would be used. Our questions are designed to address the increasing complexities associated with designing assessments of complex constructs such as WEC. The article describes considerations for the development of integrated learning and assessment modules for WEC. We begin by reviewing principled assessment design frameworks, which have been used to inform the development of complex tasks across disciplines and fields. Following a description of WEC in terms of domain analysis and design patterns, we illustrate the application of the IDAF to inform the design of the modules. We conclude with an overview of our research questions and how the article addresses them, and we discuss lessons learned with respect to the design of the prototype and the delicate balance of engaging in a principled design process that supports goals that empower students of diverse backgrounds to learn WEC.
  Conclusions: This article illustrates the application of the IDAF to inform the design and development of WEC modules. It contributes to the literature on WEC, and on complex assessments of hard-to-assess constructs more generally, by offering a way of thinking about designing, assessing, and then evaluating the design and assessment of interactive educational modules for teaching complex communication knowledge and approaches.
- Sex, finance, and literacy assessment (Wiley, 2020). Slomp, David H.
  Discussions about literacy assessment can often be polarizing for teachers, school administrators, and other stakeholders. Given the diverse and often charged perspectives on assessment within both the profession and the broader public discourse, it can be difficult to engage in productive dialogue about the role that literacy assessment plays in promoting or inhibiting effective models of literacy education. This department provides perspectives, questions, and research that enable readers to better advocate for themselves and their students as they develop their own assessment programs and respond to assessment programs that are imposed on them.
- The ethical turn in writing assessment: how far have we come, and where do we still need to go? (Cambridge Core, 2023). East, Martin; Slomp, David
  Both of us were drawn into the writing assessment field initially through our lived experiences as schoolteachers. We worked in radically different contexts: Martin was head of a languages department and teacher of French and German in the late 1990s in the UK, and David was a Grade 12 teacher of Academic English in Alberta, Canada, at the turn of the twenty-first century. In both these contexts, the traditional direct test of writing, referred to, for example, as the 'timed impromptu writing test' (Weigle, 2002, p. 59) or the 'snapshot approach' (Hamp-Lyons & Kroll, 1997, p. 18), featured significantly in our practices, albeit in very different ways. This form of writing assessment still holds considerable sway across the globe. For us, however, it provoked early questions and concerns around the consequential and ethical aspects of writing assessment.