Annotation – Educational design research: Grappling with methodological fit

Jacobsen and McKenney (2023) present a conceptual framework for assessing methodological fit in educational design research (EDR), an umbrella term for research approaches that both enhance practice and advance scientific understanding. Jacobsen and McKenney situate the framework within current discussion and debate about educational research methodologies. Researchers pursuing theoretical or practical research must identify problems worth studying that are legitimate, researchable, and research-worthy in theoretical and/or practical terms.

Jacobsen and McKenney further explore three orientations of EDR trajectories, which they define as research for interventions, research on interventions, and research through interventions. Research for interventions adds to theoretical knowledge and design work. Research on interventions aims to provide information about an intervention's characteristics. Research through interventions focuses on the implementation processes of an intervention. These trajectories are usually combined and used for comparative analysis in EDR work. Jacobsen and McKenney use two recent dissertations to examine and illustrate the EDR trajectories they describe.

The presentation of the conceptual framework closes with a discussion of why methodological fit is so challenging for researchers. Jacobsen and McKenney point out that methodological fit depends on a variety of factors, such as the researcher's expertise in the subject area, expertise in methodologies, the researcher's concerns, and other practical considerations. Four specific challenges to achieving methodological fit are identified: asking beginner-level questions, focusing on the state of the art rather than the state of practice, insufficient measures for causal inferences, and the absence of synthesis. Jacobsen and McKenney conclude that the field of educational research needs more EDR examples to show how valuable this type of research can be.

The main strength of Jacobsen and McKenney's conceptual framework lies first in the way the discussion is situated within the current discourse on methodological fit. The analysis of two dissertations to illustrate the orientations of EDR trajectories is also very strong. Elements of research design are shown at various stages of the dissertation process to highlight the iterative nature of creating questions that shift the focus of the EDR trajectory orientation. Jacobsen and McKenney also point out the pitfalls of this type of research for the novice researcher, which is what doctoral students are, and underscore the importance of support and mentorship from faculty if students pursue this avenue of research.

As a doctoral student who knows a dissertation is on the horizon, I find this conceptual framework helpful for thinking about potential topics and approaches. The most interesting sections of the article for me were the determinants of research-worthy problems and the orientations of EDR trajectories. This also connects back to Salomon and Perkins (2005) and their discussion of intellectual amplification with, of, and through technology, though here the focus is on a specific intervention, which may or may not be technological. EDR is a complex but rich way to analyze topics, bringing mixed methodologies to bear on a research topic as the research grows.

References

Jacobsen, M., & McKenney, S. (2023). Educational design research: Grappling with methodological fit. Educational Technology Research and Development. https://doi.org/10.1007/s11423-023-10282-5

Salomon, G., & Perkins, D. (2005). Do technologies make us smarter? Intellectual amplification with, of and through technology. In R. J. Sternberg & D. D. Preiss (Eds.), Intelligence and technology: The impact of tools on the nature and development of human abilities (pp. 71-86). Mahwah, NJ: Lawrence Erlbaum Associates.

Annotation – “The Triple-S framework: ensuring scalable, sustainable, and serviceable practices in educational technology”

Moro et al. (2023) present a new research-based framework, the Triple-S Framework, for educators and institutions to consider before electing to adopt and adapt educational technology in learning spaces. The framework was built in the context of ever-evolving technology: the push for institutions and educators to adopt the latest technology to remain relevant, the financial and practical costs of technology implementation, and students' desire to see more consistent technology use. The Triple-S Framework guides institutions and educators to evaluate the scalability (continued growth of use), sustainability (long-term implementation viability), and serviceability (access to the skills, tools, and resources needed to maintain use) of educational technologies implemented in schools. Moro et al. provide an overview of common and trendy educational technologies, from the most scalable, sustainable, and serviceable (digital texts and images) to the least (VR technology), to illustrate application of the model.

Moro et al. (2023) clearly present the need for a framework that takes into account not just learning outcomes but also the long-term viability of educational technology interventions in classrooms and institutions. Examining common, widely used educational technologies such as digital texts and images, audio, slideshow presentations, and video allows newcomers to the framework to bring their practical experience to bear on the benefits and pitfalls of technology implementation. Progressing to apps, which are accessible to use but not necessarily to create, then extends the application of the framework to less common technologies and shows how the Triple-S Framework is practical and accessible to researchers, educators, and decision makers. Moro et al. also use easy-to-grasp, common language when explaining their framework. A college professor with no formal educational training could pick this up and implement the steps without much additional work. It is very practical.

As a college administrator, I really love the practical examples and explanations, which are grounded in research. There are clear steps, questions, and processes to follow. This would be an easy jumping-off point for discussions with faculty about technologies they would like me to purchase from my budget for use in their classrooms. As a doctoral student, I can only strive for this level of clarity of explanation, connection to the literature, and concise application, so that the research I do is practical for practitioners and the administrators who support them.

Moro, C., Mills, K. A., Phelps, C., & Birt, J. (2023). The Triple-S framework: Ensuring scalable, sustainable, and serviceable practices in educational technology. International Journal of Educational Technology in Higher Education, 20(1). https://doi.org/10.1186/s41239-022-00378-y

Annotated Bibliography – Methodological pragmatism in educational research: From qualitative-quantitative to exploratory-confirmatory distinctions.

References

Foster, C. (2023). Methodological pragmatism in educational research: From qualitative-quantitative to exploratory-confirmatory distinctions. International Journal of Research & Method in Education, 1-16. https://doi.org/10.1080/1743727x.2023.2210063

Foster (2023) argues that the unnecessary divide between qualitative and quantitative research methodologies hinders educational research because the two are more similar than not. Foster further holds that adopting methodological pragmatism, and discussing research in terms of exploratory and confirmatory work, would increase rigor and collaboration. Foster establishes that the qualitative and quantitative labels create division within educational research, which may also cause researchers to miss critical discussions and information relevant to their own research. He argues that although qualitative and quantitative data call for different analysis techniques, both face analogous issues: defining constructs, context, and measurement; reducing data to a basic form, stripping it of content, context, detail, and nuance; and providing analytic interpretation of the data. Foster also points out that overreliance on one method over the other may call educational research into question, because researchers may not read all the literature on their topic or may miss opportunities to further their knowledge. Foster presents methodological pragmatism as the solution to this problem: an approach that helps researchers solve the logical problem of which methodological tools to use, where the end point could be any combination of qualitative, quantitative, or mixed-method approaches. Because methodological pragmatism provides more tools, requires researchers to understand their problem fully, and positions them to better examine their own biases and assumptions, it leads to a greater range of methodologies in educational research and to higher-quality research. Foster then shows that framing research as exploratory (a source of ideas and discussion) versus confirmatory (testing hypotheses and conjectures) is a better way to situate discussions of educational research, because these terms emphasize why a researcher is exploring a problem and allow access to the full array of available research methods. Foster also addresses ethical questions about methodological pragmatism, including the perception that the practice is about getting results at any cost, countering that no avenue which presents a risk of harm should be used. Ultimately, Foster concludes that methodological pragmatism frees researchers to make progress in their research.

Foster's argument is very well organized. The entire focus of the article remains on expanding researchers' horizons and making methods more accessible to them. He uses clear, illustrative examples to establish the similarities between qualitative and quantitative research. Foster is also up front about the perception that pragmatism has less intellectual rigor than other approaches. He counters that argument by underlining the rigor that comes with greater access to all methodologies, which may force researchers to consider approaches they otherwise would not because they are inhibited by ingrained biases toward particular methodological, theoretical, or epistemological stances.

As a doctoral student preparing for the dissertation process, I found this discussion helpful to read. I recognize that I come with biases from my previous learning experiences regarding the nature of qualitative versus quantitative research. Foster's work illuminated the many similarities, positive and negative, between the two methodologies. I do not yet know where I stand in relation to methodological pragmatism, but I do like the idea of exploratory versus confirmatory research and the way it requires a researcher to be aware of their own biases and theoretical, ontological, and epistemological assumptions. The idea that a method of interpretation is available regardless of the problem at hand is exciting. I am interested in further exploring how qualitative and quantitative work can inform each other to provide a better description of a problem in educational research. This article helped me rethink and resituate the ideas of qualitative and quantitative research that came up during our discussion of the assigned readings. I realized I privilege qualitative research in my own mind, first because I am not a fan of statistics, and second because I was biased against quantitative research, not appreciating that even statistics happen in a context and can provide a story.

Annotated Bibliography – “Enhancing the learning effectiveness of ill-structured problem solving with online co-creation”

In this early empirical study of co-creation in learning, Pee (2019) attempts to support the hypothesis that the open-ended nature of ill-structured problem solving (ISPS) can be used to a learner's advantage to increase cognitive and epistemic knowledge. Three concepts were derived from business disciplines, where co-creation is commonly used, to develop a framework of online co-creation and to test whether it increases student learning in ISPS: solution co-creation, decision co-creation, and solution sharing. Pee created an asynchronous, voluntary, and optionally anonymous activity on Blackboard in which students participated in decision co-creation around evaluative criteria and then discussed their solutions to the assigned problem, engaging in solution sharing and solution co-creation. Pee interprets the student survey results as indicating that engaging in online co-creation increases learning. Ultimately, Pee suggests that while this early study cannot yet be generalized, it should be replicated in other areas, and current course instructors can implement this method to increase learning in the context of working with ISPS.

While the article excels in presenting its data visually and adequately acknowledges the limitations of the study, there are areas of concern in the arguments Pee presents. Pee (2019) provides a cogent statistical analysis of the survey deployed to students (n=225), and the survey had an excellent return rate of 70.3%, but the findings are presented as proof that learning increased when the survey actually measured students' perception of learning. Brief follow-up interviews with 13 students are mentioned in the article, but they are not discussed in depth and do not support the hypothesis that learning increased; a single student is quoted as indicating that their perception of learning increased. Finally, the examples illustrating the data collection method are limited to the graduate students, who made up only 32.4% of the sample; the undergraduate student experience shaped most of the survey results, yet it is not described in the methodology or discussion. Pee concludes that the survey results show the model for online co-creation worked in a classroom to "leverage the multiplicity of ISPs" and enhance student learning, without noting that the survey can only measure subjective perception, since student work was not evaluated and there was no comparison between groups who used co-creation and groups who did not.

As a writing teacher, I find the idea of ill-structured problems and co-creation interesting. Writing is often difficult to teach because it's amorphous and doesn't have a "right" answer. The idea of online co-creation, where students work together to contribute to discussions of how a project will be evaluated, is exciting because it shifts the burden of teaching in an ISP context from the instructor alone to the instructor and students. I like this idea in terms of establishing rubrics that are more individualized for learners, helping them grow their writing in ways they find relevant while also meeting course standards and outcomes. As a doctoral student, I am interested in the ways students perceive their own learning versus how instructors perceive student learning based on knowledge acquisition. I find the methodology and framework Pee uses to study perception of learning interesting.

References

Pee, L. G. (2019). Enhancing the learning effectiveness of ill-structured problem solving with online co-creation. Studies in Higher Education, 45(11), 2341-2355. https://doi.org/10.1080/03075079.2019.1609924