Week 13 Annotation – “Understanding creativity in primary English, science, and history”

McLean, N., Georgiou, H., Matruglio, E., Turney, A., Gardiner, P., Jones, P., & Groves, C. E. (2021). Understanding creativity in primary English, science, and history. The Australian Educational Researcher, 50(2), 581-600. https://doi.org/10.1007/s13384-021-00501-4

McLean et al. (2021) conducted exploratory qualitative research with nine Australian primary school teachers to expand understanding of how teaching creativity manifests in the primary classroom and how creative thinking is operationalized within the classroom – a gap the authors perceived in the literature. McLean et al. provided a robust literature review on creativity, emphasizing that the research gap lay in how creativity is manifested in the classroom. This study investigated how “teacher practice can increase students’ creative capacity and creative confidence” (para. 6). Three research questions were investigated: What are primary teachers’ conceptualizations of creativity? What does creativity look like in the classroom, according to teachers? And what, according to teachers, are the creative thinking skills associated with each discipline? The nine participants represented their disciplines evenly: three English teachers, three history teachers, and three science teachers. The data were coded into three themes to answer the research questions: definitions, manifestations, and teaching for creativity. The first finding is that while teachers reported creativity was important to student learning, it was a difficult concept to define. Teachers could articulate creativity for their discipline, but not clearly or in a complete definition, though all agreed creativity was essential for student learning. Manifestations of creativity were specific classroom practices and examples, such as dramatic performances of concepts or text analysis. Teaching for creativity was the final key theme and included skills such as analysis, communication, curiosity, inquiry, open-mindedness, and problem-solving. However, to use these skills, students needed to be able to do so in discipline-specific ways. Additionally, all teachers spoke of creativity in relationship to foundational skills and knowledge in their area, holding that students could not be creative unless and until they understood the basics of the content area. McLean et al. hold there is opportunity for more research into how creativity is conceptualized in the classroom to reduce the ambiguity of the concept as it is operationalized in education.

Overall, this article is very well structured. The literature review leads directly to the research questions and to a clear gap that the research can address. McLean et al. did a great job of clearly articulating the research questions, the methodology used to address them, and the coding process. The results provided clear and specific examples to help the reader fully see the phenomenon at play. The only things I wish had been included were the questions used in the semi-structured interview process and examples of the coding. While the process was very clearly described, and there was member checking and validation of the coding process, only the coding reference from NVivo was provided, and it would have been nice to see more of those details.

As a writing teacher, I think I take for granted that my students are going to be creative. It seems a given that if you’re going to write a paper, for example, there will be an expression of creativity in the process. But from this week’s readings, it’s clear that creativity, or creative thinking, can and should be more granular and specific. In classroom spaces, there are educators who can state that creativity is important – because it’s at the top of Bloom’s Taxonomy and we’ve been taught we want students operating in higher-order thinking activities such as creation – but who are unable to fully articulate what it is to be creative, and that makes it challenging to help students be creative. I think it helps to have a clear definition of creativity for a field that can be operationalized for a task and for assessment. This article affirmed that the ambiguity of creativity can be reduced through more research.

Week 8 Annotation – Characteristics of productive feedback encounters in online learning

Jensen, L. X., Bearman, M., & Boud, D. (2023). Characteristics of productive feedback encounters in online learning. Teaching in Higher Education, 1-15. https://doi.org/10.1080/13562517.2023.2213168

Jensen et al. (2023) take a digital ethnographic approach to studying the perceived usefulness of feedback for students taking online classes at an Australian and a Danish university, in courses that were not emergency remote courses due to COVID-19. Jensen et al. situate their argument in the context of productive feedback – feedback a student finds meaningful – and feedback in higher education, noting feedback can come from interactions with humans, technology, or resources. The dataset for this study was derived from 18 students whose online text-based work was observed and from 27 semi-structured interviews conducted using longitudinal audio diaries. The data were thematically coded, and three major themes for feedback emerged. The first was elicited feedback encounters, where students directly asked for feedback; an example would be asking for peer review or emailing a question. The second was formal feedback encounters, where feedback is structured as part of the course design; an example is any instructor feedback on a submitted assignment. The final theme was incidental feedback encounters, where students get information that causes them to reflect on their understanding; an example is a discussion with peers about the directions for an assignment. Feedback has two types of impact: instrumental and substantive. Instrumental feedback clearly delineates for the student what action they need to take next; this type of feedback often leads to superficial changes based on direct instruction about what to do. Substantive feedback asks a student to critically reflect on their own assumptions; this type of feedback is often ignored because it is too challenging, but if the student engages with it, it reshapes their understanding of the task, the work, or their own approach. Instrumental and substantive feedback are equally valuable to student learning and each serve a purpose. However, all feedback is most valuable when students are open to it and it arrives in time for them to apply it to their current work.

Jensen et al. do a good job of situating their problem in the context of feedback. There was discussion of the framework and approach for feedback, but it was not related back to the discussion or the findings in a clear way. It was also not clear whether the authors collected the dataset themselves or used a dataset that had been collected by someone else and made available for other researchers to use. It was, however, very easy to follow the conceptual framework to the research questions and the methodology used to explore this problem.

Feedback is something I have grappled with for a very long time as an educator. When I teach writing classes, I am always swamped by feedback. When I teach online courses, I log in every single day to read what students post and provide them formative feedback that I hope will shape their work. I’m not always sure that students read and engage with the feedback I give them, except when I require it as part of a reflection assignment (as Jensen et al. pointed out in their literature review). But I do agree that students will happily take surface-feature, lower-order-concern feedback and apply it easily because it is direct and tells them what to do. For example, if I tell them to restate their thesis and give them a template, they almost always do it. But if I ask them to reframe their thinking on a topic – which would lead to a major overhaul of their paper – they often don’t do anything with that feedback. Jensen et al. pointed out that this type of feedback is hard for students to engage with. I mean, it’s a big ask for a first-year composition student to reframe their entire way of thinking about a topic while at the same time they’re learning how to research, evaluate sources, and do all the things that come with learning academic research. It defeats the purpose of giving that kind of feedback, however, for me to tell them how to think about the topic differently. But this kind of feedback is successful if they’re ready and open to the conversation about changing how they think.

In online learning, it’s even harder to give the kind of feedback that leads to substantive learning because you can’t see the student to know how well it’s received or even understood. I also don’t always know the right time to give that feedback so that it’s useful. In 8-week classes, I’m usually one week behind in grading, so students are getting feedback while they’re working two assignments past the one the feedback addresses. It’s not really helpful anymore. I need to think about ways to structure feedback so they get it when it’s useful.

Annotated Bibliography – Methodological pragmatism in educational research: From qualitative-quantitative to exploratory-confirmatory distinctions.

References

Foster, C. (2023). Methodological pragmatism in educational research: From qualitative-quantitative to exploratory-confirmatory distinctions. International Journal of Research & Method in Education, 1-16. https://doi.org/10.1080/1743727x.2023.2210063

Foster (2023) argues that the unnecessary divide between qualitative and quantitative research methodologies hinders educational research because the two are more similar than not. Foster further holds that adopting methodological pragmatism and discussing research in terms of exploratory and confirmatory distinctions would increase rigor and collaboration. Foster establishes that qualitative and quantitative methodologies create division within educational research, which may also cause researchers to miss out on critical discussions and information within their own research. He argues that while qualitative and quantitative data use different analysis techniques, they have analogous issues: both must define constructs, context, and measurement; both reduce data down to a basic form, stripping it of content, context, detail, and nuance; and both must provide analytic interpretation of the data. Foster also points out that researcher overreliance on one method over the other might call educational research into question, because researchers may not read all the literature on their topic or may miss out on opportunities to further their knowledge. Foster presents methodological pragmatism as the solution to this problem: an approach that helps researchers solve the logical problem of which methodological tools to use, where the end point could be any combination of qualitative, quantitative, or mixed-method approaches. Because methodological pragmatism provides more tools, requires researchers to understand their problem fully, and positions them to better examine their own biases and assumptions, it leads to a greater range of methodologies in educational research and to higher quality research. Foster finally shows that framing research as exploratory (a source of ideas and discussions) versus confirmatory (testing hypotheses and conjectures) is a better way to situate discussions of educational research because it emphasizes why a researcher is exploring a problem and allows access to the full array of research methods available. Foster also addresses ethical questions about methodological pragmatism, including the perception that the practice is about getting results at any cost, countering that no avenue which presents a risk of harm should be used. Ultimately, Foster concludes that methodological pragmatism frees researchers to make progress in their research.

Foster’s argument is very well organized. The entire focus of the article remains on expanding researchers’ horizons and making a wider range of methods accessible to them. He uses very clear and illustrative examples to establish the similarities between qualitative and quantitative research. Foster is also up front about the perception of pragmatism as having less intellectual rigor than other approaches. He counters that argument by underlining the rigor that comes with greater access to all methodologies, which may force researchers to consider approaches they otherwise would not because they are inhibited by their own ingrained methodological, theoretical, or epistemological biases.

As a doctoral student preparing for the dissertation process, I found this discussion helpful to read. I recognize that I come with biases from my previous learning experiences regarding the nature of qualitative versus quantitative research. Foster’s work did illuminate the many similarities – positive and negative – between the two methodologies. I don’t know where I stand in relation to methodological pragmatism, but I do like the idea of exploratory versus confirmatory research and the way a researcher really needs to be aware of their own biases and theoretical, ontological, and epistemological assumptions. The idea that a method of interpretation is available regardless of the problem at hand is exciting. I am interested in further exploring how qualitative and quantitative work can inform each other to provide a better description of a problem in educational research. This article helped me rethink and resituate the ideas about qualitative and quantitative research that came up during the discussion of the assigned readings. I realized I privilege qualitative research in my own mind, first because I am not a fan of statistics, and second because I was biased against quantitative research, not appreciating that even statistics happen in a context and can provide a story.