Where Did He Come From? Where Did He Go?

Book cover for the reading – MĂ©tissage: A Research Praxis. Retrieved from https://www.worldcat.org/title/handbook-of-the-arts-in-qualitative-research-perspectives-methodologies-examples-and-issues/oclc/141187869

 

Is this a real study or is this just fantasy?

I’m caught in a landslide of readings, with no escape from reality.

But it helped me open my eyes, and look up to the future and see.

I’m just a poor science teacher, I need some sympathy.

Because summer came and is about to go, the stress pretty high, not very low.

Any way this blog goes doesn’t really matter to me, because it’s about me.

 

 

As the end of the semester is in sight, I feel this image is an accurate representation of how I felt before and after this journey.

Retrieved from https://andertoons.com/science/cartoon/8299/back-in-my-day-theyd-build-an-actual-physical-maze

Appreciating Research – as a Scholar

Looking at the whole cartoon, it symbolizes the need to examine research (the characters) in detail to understand the findings (the joke). My background in the sciences gave me a biased view that published research articles have reliable findings because they made it through peer review, which would have pointed out flaws in their design. Yet McAteer (2013) pointed out examples where researchers deliberately selected data that supported their hypotheses or manipulated their data to produce a favorable result. Combined with the infamous Wakefield (1998) study that ‘linked’ measles, mumps, and rubella vaccinations to autism, I had to re-evaluate my belief in published articles.

Retrieved from https://beijingimmj.wordpress.com/research-guide/

O’Cathain’s (2010) proposed framework for assessing the quality of mixed methods research is perhaps the most detailed criteria list I’ve encountered in this course. However, I favored the Mixed Methods Appraisal Tool for being more concise and easier to apply. Another useful tool is Boote and Beile’s (2005) literature review rubric, which helps readers assess whether researchers have a full understanding of the terminology and current thinking in their field of research, as opposed to following standard methodology without comprehending its necessity. In previous blog posts (here and here), I applied those tools to articles after an initial reading and found that doing so changed my perception of the research. In the former case I questioned the validity of the findings due to vagueness in the report, and in the latter the researchers lacked a broader inspection in their literature review. Going forward, I have a better appreciation for reading research articles, which translates directly into a better understanding of how to teach the scientific method as a science teacher.

Appreciating Research – as a Teacher

As educators, we receive a lot of resources and workshops on how to improve our practices. Caught up in the energy of a presentation and its potential to improve our classes, it is tempting to implement innovations immediately. However, I believe we need to examine those ideas much like how we examine research and its findings. For example, we discussed how classes can incorporate social media such as Twitter or blogs to foster student relations and interactions, and saw how successfully it was implemented in our own Masters’ cohort. However, it is important to have reservations about re-structuring our own classes to model this without prior research, such as seeking administrative or parental approval. Both those parties, and teachers themselves, need to put the safety of the students first, which requires examining whether those services adhere to the Freedom of Information and Protection of Privacy Act (FOIPPA). After looking into those aspects, there is still another important party to consider: the students. In the last school year, I experimented with incorporating the school district’s Idea-X challenge into my Applied Design, Skills, and Technologies (ADST) class. The challenge aligned with the course’s curricular goals, so I let the class decide whether to pursue it at the start of the second semester. The school approved it, my students were interested, and parent consent forms were submitted; everything was on track until the first information session. The organizers asked each team to create a group Instagram account, a social media service I know most of my students use. Their motivation dropped faster than Facebook share prices in 2018, and some even avoided class.

Retrieved from https://tenor.com/view/hiding-office-work-gif-5391795

Given that I only saw the students twice a week and had taught them for roughly three months at that point, I sorely overestimated how well I knew my class. Their unwillingness to have a digital presence outside of their closed circles, combined with some unclear expectations around the group account, made the whole exercise flop despite my attempts to offer social media support. Looking back now, rather than treating that experience as a one-off occurrence or giving up, the error of not researching my students’ comfort with social media more deeply will be a key consideration for future activities. I also feel better about my decision now, framing my willingness to drop the project as exemplifying learner-centered pedagogy, as opposed to insisting on it because it meets curricular competencies.

Open Mindset – as a Researcher

Looking back at the first comic, I see myself as the mouse in the corner making fun of non-traditional (constructivist) approaches to research. This meant that I placed greater significance on studies that quantify and directly prove causation than on those describing social observations and analysis, which might be open to interpretation. This mindset changed with two realizations: (i) a quantitative approach can be unsuitable for social research, and (ii) other methodologies can be just as rigorous as quantitative studies. The second realization was heavily influenced by Onwu and Mosimege (2004), who clearly answered how oral practices in traditional medicine are subject to the same replicability expected in the scientific method; a lack of empirical documentation should not make them less valid than Western science (discussed in a previous blog). To wean myself off the superfood that is quantitative research, I began exploring a more balanced diet, such as mixed methods (combining the familiarity of traditional positivism with the feasibility of constructivism in social research) and action research (blending education theory and practice). As of now, action research seems to be the ideal path forward as it focuses on improvement of practice, which is essential to me as an educator and one of the two main reasons that I enrolled in the Master’s program.

Reason #2. Retrieved from https://pixabay.com/illustrations/packs-pile-money-finance-currency-163497/

Open Mindset – as a Teacher

I have always viewed myself as a flexible teacher, open to new technology, pedagogy, and student suggestions, but I still find moments, like the researchers in the first cartoon, where I simply use modern technology to do the same thing as before. It struck me like thunderbolt and lightning (very, very frightening) that I had tossed a pile of notes at the students and expected them to regurgitate it on an exam before the end of the year. My main goal over the summer break was to spare them their lives from this monstrosity and look for more engaging ways to connect the content to conversations or critical reflections. I also had to figure out what Aboriginal education is and how to include it in my classes as part of BC’s new curriculum. Luckily, the course readings have shown me that these seemingly separate ideas are interconnected; it’s not as if Beelzebub has these tasks put aside for me, and just for me.

Firstly, the availability of resources online is not exclusive to online and blended courses, but rather part of a movement towards open pedagogy in education. Students in face-to-face classes can also access these resources, such as online textbooks and video sites, providing a breadth of available material. Open access allows teachers and students to engage in learner-centered pedagogy without being restricted by the availability of resources or expertise. The latter point involves students (directly, or through teacher moderation) connecting with specialists in the field via social media or a simple email to address their curiosity. One example of this being done effectively is at the Pacific School of Innovation and Inquiry, as mentioned by Jeff Hopkins in our meetings. These interactions may also serve as a learning opportunity for students regarding professional digital citizenship, such as curating an online presence and communicating with individuals online; a necessary skill that I might be able to model in my quest to write a research project. Turning to course design, a correctly scaffolded model of this personalized inquiry would circumvent the issue of online education being “low context so that it can be consumed by any user, anywhere” (Tessaro, Restoule, Gaviria, Flessa, Lindeman, & Scully-Stewart, 2018). Finally, all these parts can be woven under the banner of Indigenous pedagogy, whereby the focus of learning is conversation (be it synchronous or asynchronous), as opposed to assembly-line construction without seeing what the final product will be. It turns out my summer homework will be less strenuous under a natural, holistic lens rather than a compartmental one; something students may notice when examining their own learning under this model.

But before tackling these ideas for the upcoming school year, I need to look after my own health and wellness. So for now:

Nothing really matters, everyone can see my blog.

Nothing really matters

Nothing really matters to me~

“ART FLYER” by Andrecio Alves is licensed under CC BY-NC 4.0

Where’s my Knowledge at?

I used the film “Kitchen Stories” as an assessment of my understanding of research methodologies to date, focusing on the Researched and the Research.

Also, this might be a fun way to teach or assess units in the sciences: pointing out related concepts in films and movies (connecting them to abstract ideas).

Regarding Researched:

  • The film did not mention why Norwegian volunteers were chosen (poor-quality research! no explanation of the sampling rationale), but it’s clear the researchers did not understand how their work might be perceived by the participants (resentment toward the Swedish observers). Applied to a modern study on, say, Indigenous Knowledge Systems, a “settler” researcher may present an uncomfortable presence when observing Indigenous healers at work. I feel researchers need to raise their awareness of different contexts and perspectives when conducting social research, recognizing that their methodologies may not be as neutral as they appear.
  • The researched (the observed) in the film can also be a metaphor advocating for Indigenous pedagogy: learning occurs as a collaborative effort between the learner and the teacher. The defining phrase for me was when Isak said, “How can we understand each other without communicating?” In traditional “factory process” content-driven pedagogy (traced back to industrialization and mandatory education), this would never occur, as the teacher disseminates knowledge into the student vessels. Issues with this model have prompted the paradigm to shift toward inquiry-based approaches, which are already present in Indigenous pedagogy. I believe a learner-focused approach would benefit both members. Students can self-advocate concerns and learning intentions while being open to suggestions in the form of conversations. Teachers would no longer worry over engagement or management, and may gain knowledge from the students as well. This applies to research too: researchers could gain far more information by asking participants than through simple observation (seeing Isak not answer the phone versus asking about it and learning it was due to the cost).

Changes regarding Research

  • Coming from a Western science background, I admit I was a believer in positivism because of its detailed observations, interpretation, replicability, and use in predictive modeling. After reading Onwu & Mosimege (2004), that has changed a bit: I’ve recognized my bias in viewing Indigenous Knowledge Systems (IKS) poorly because they have not been empirically studied and validated. Traditional medicine being passed down through oral history (ethnography) and in practice makes it no less rigorous than documenting the same process in print (if it didn’t work, it wouldn’t be passed down). A personal example would be my unfavorable view of Traditional Chinese Medicine (TCM). I held Western medicine to be more ‘valid’ because the language used is clear, it uses trial and error in controlled studies, and it provides cause and effect. Whereas TCM says things like you have too much ‘fire’ in your body? What the heck? However, TCM is an accepted practice today, and parts of it have been tested and explained in detail using Western science. This goes to prove Onwu & Mosimege’s point that a combination of both systems would be ideal. I believe it’s possible that traditional knowledge or IKS can help guide Western science in novel directions, while the latter can help explain specific interactions or improve the former’s practices. Going forward, I feel more comfortable with the inclusion of IKS in courses (specifically in the sciences) because I have an authentic understanding of how it connects to the content, as opposed to it being a checklist item. Specifically, methods of assessment and validation may look different, but one is not more rigorous than the other, and a combination of the two would provide a more holistic picture. Consider two students: one consistently scores 90% on their own, whereas the other worked intensively with the teacher to understand their misconceptions while overcoming crippling anxiety to reach 86%. Reporting solely on Proficiency (percentage / letter grade) would overlook the dedication and resilience of the latter, while reporting solely on Progress (anecdotes / comments) would disregard the former’s demonstrated mastery. Teachers generally include both when reporting out, so why couldn’t social researchers do the same? Bringing it back to my own potential research direction: using a mixed methods model to provide a more meaningful summary of student learning achievement and experiences in distance-education courses.

Where’s the Lit. Review?

This week, I decided to focus on reviewing literature reviews, using Boote and Beile’s scoring rubric (p. 8) to familiarize myself with what a quality literature review should encompass. I chose George Veletsianos’ article simply because it clearly marks out its literature review section. (I tally my scores in a short sketch after the rubric below.)

  1. Coverage
    • Justification of reviews: 1/4
      • No statements regarding the inclusion or exclusion criteria for the articles reviewed; instead he mentions that “little is known about faculty harassment online” (Veletsianos et al., 2018). I interpreted that as: he’s aware there’s probably some research on it, but didn’t try very hard to look for it. For instance, he notes several other studies looking at women’s experiences online, including Duggan (2014) finding that “women who are in the public eye or who use technology to promote their work—such as scholars—are placed at even greater risk”. Examining Duggan’s study in detail, I noticed it sampled people between 18-24 years old, probably lacking faculty members. But wait, wasn’t George focused on scholars? Wouldn’t this age group include some novice scholars still in their post-secondary studies? This further makes me question what he defines as a “scholar”, which he doesn’t explain until later. I strongly believe he should have clarified his terminology sooner, as it would support his claim that little existing research covers this group. This is why I gave him a mark of 1 out of 4.
      • On a side note, it’s hilarious reading his assumption that the online world is egalitarian. It makes one worry about how out-of-touch researchers can be with the rest of the world (to all tenured profs teaching first-year undergraduate courses: we have no idea what language you’re speaking).
  2. Synthesis
    • What has been/needs-to-be done: 3/4
      • Critically examined how existing research found women experiencing more online harassment than men, while his target group (“scholars”) has yet to be studied. He does not introduce new methodology, just a need to apply the same methods to a different group.
    • Topic in broad scholarly literature: 3/4
      • Raised the issue of online harassment curtailing women’s participation, leading to a lack of diversity in future literature. Doesn’t offer any methods, such as examining publication ratios by gender, to assess whether this has occurred.
    • History of topic: 1/4
      • Does not discuss the history of online harassment or of coping strategies. For instance, has this issue persisted or increased since the introduction of the internet? Is there a favored coping strategy, or is it changing?
      • Honorable mentions to Alexandria Ocasio-Cortez, who is epitomizing the coping strategy of “clapback“.
    • Acquired & enhanced jargon: 2/4
      • Defined harassment, scholars, and the categories of coping strategies employed by female scholars. His definition of scholars perhaps differs from general usage (perhaps my bias in extending scholars to graduate students, given that Boote & Beile found dissertations that were akin to high school essays). Did not discuss or resolve ambiguities in definitions.
    • Important variables relevant to topic:  2/4
      • Suggested that internet anonymity helps foster toxic behavior, as does lacking or inefficient moderation. In addition, identified the different strategies used to cope with harassment.
    • Gained new perspective: 1/4
      • Generally accepted current literature on the prevalence of disproportionate harassment faced by women online; a welcome change from his 2013 article viewing the world (wide web) through rosy glasses.
  3. Methodology
    • Pros/cons of methodologies: 2/4
      • Mainly describes findings from other literature and sometimes provides the method employed in those studies (e.g., surveys).
      • (Regarding his own study) Does not elaborate on why the methodologies were acceptable, such as iterative interviews (merely a common standard) or a sample size of 14 (because they “felt” the answer was found).
    • Connecting ideas to methodology: 2/4
      • Described research methods, but did not critique their strengths or weaknesses.
  4. Significance
    • Practical benefit: 2/4
      • Adds to existing knowledge – suggests methods to prepare for or cope with online abuse.
    • Theoretical benefit: 2/4
      • Research would add to existing knowledge – showing evidence of online harassment and perhaps prompting the development of new coping strategies or policies.
  5. Rhetoric
    • Eloquence: 3/4
      • Article was fairly well written and used language that, for the most part, would be understood by the general population.
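
To pull those marks together, here is a small Python tally of my scores from the rubric above (each criterion out of 4, following how I scored them; criterion names are abbreviated from my headings). This is just my own bookkeeping, not part of Boote and Beile’s tool:

```python
# Tally of my rubric scores from above; each criterion is out of 4.
scores = {
    "Coverage": {"Justification of reviews": 1},
    "Synthesis": {
        "What has been/needs-to-be done": 3,
        "Topic in broad scholarly literature": 3,
        "History of topic": 1,
        "Acquired & enhanced jargon": 2,
        "Important variables relevant to topic": 2,
        "Gained new perspective": 1,
    },
    "Methodology": {
        "Pros/cons of methodologies": 2,
        "Connecting ideas to methodology": 2,
    },
    "Significance": {"Practical benefit": 2, "Theoretical benefit": 2},
    "Rhetoric": {"Eloquence": 3},
}

# Per-category subtotals and an overall score out of the maximum possible.
total = sum(sum(criteria.values()) for criteria in scores.values())
maximum = 4 * sum(len(criteria) for criteria in scores.values())
for category, criteria in scores.items():
    print(f"{category}: {sum(criteria.values())}/{4 * len(criteria)}")
print(f"Overall: {total}/{maximum}")  # 24/48 for this review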

~Quick assessment of the overall article quality~

Researcher: Covered by George introducing himself to our class.
Researched: Covered by literature review, assessed above.
Readers: I honestly have trouble with the significance of the research itself, as it focuses on the experiences and coping strategies of female scholars, as opposed to practical solutions to deal with harassment in general. From past research on online harassment in general, one could safely assume it would extend to scholars as well. George’s findings suggest institutions provide training to help navigate social media, which seems very counter-intuitive. Why not create safeguards to prevent harassment instead of just preparing for it? A more useful study would have examined the implementation of safeguards to prevent or reduce harassment and their effectiveness.
Research: we’ve all read the article… right? 🙂

Where’s My Comparison? Assignment 2 – EDCI 515

Background Information about the Article

Ever since British Columbia (BC) modernized its K-12 curriculum and began implementing it in 2016, teachers have been re-visiting their pedagogies to shift from content-focused teaching to inquiry-based or problem-solving approaches. As a high school educator who teaches both blended- and distance-learning classes, I have been looking for models of online learning spaces where students work collaboratively. The article I chose, “Wikis for a Collaborative Problem-Solving (CPS) Module for Secondary School Science” (DeWitt, Alias, Siraj, & Spector, 2017), used wiki pages as the platform for students’ online collaboration. Their research looked at three aspects: (1) what types of interactions occur on the wiki in the context of learning science, (2) to what extent CPS in a wiki encourages social and cognitive processes, and (3) to what extent the CPS module improves achievement (DeWitt et al., 2017). Delving into the background of this study, I found that DeWitt, Alias, and Siraj are associate professors at the University of Malaya, Malaysia. In addition, DeWitt herself had previously worked for the Educational Technology Division of Malaysia’s Ministry of Education, according to her biography. The literature supporting this study stems from a report by the Ministry of Education of Malaysia (2013) finding that mathematics and science achievement in the country had declined over the previous few years, and that “few studies have examined online collaboration and problem-solving in science” (DeWitt et al., 2017). The sample consisted of 31 volunteers from an urban high school in Malaysia, including students of varying academic strengths reflective of the multiracial community, chosen partly for the researchers’ convenience. Their findings could be useful for fellow researchers, teachers, and policy makers in Malaysia when considering future directions for research, classroom implementation, or curricular development.

How the Research was Carried Out

The researchers grouped their volunteers into teams of 7 or 8 students and assigned each group a type of meal. The goal was that, in three weeks, each team needed to have analyzed the food classes present in that meal. A CPS module had already been developed for this activity, and students received a laptop as well as an orientation on how to use the wiki, learning resources, and problem tasks available in the module. The researchers collected all the discussions on the wiki, student journals, and individual student interviews, then manually coded each as one of the following interaction types:

Learner – Content: learners engaging with the content.

Learner – Learner: interactions between the students.

Learner – Instructor: interactions with the teacher.

Learner – Interface: interaction with the technology medium.

The first three categories were described by Moore & Kearsley (2005), while Learner-Interface was described by Hillman, Willis, & Gunawardena (1994). DeWitt et al. also looked at the frequency of each learner interaction and further categorized them into the social and cognitive processes shown in Table 1. Lastly, the researchers employed a pre- and post-test containing simple, open-ended questions to assess the students’ proficiency with food classes as a result of the CPS module.
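
To make the tabulation concrete, here is a minimal Python sketch of how coded wiki communications could be tallied into the four interaction types with relative frequencies, in the spirit of the article’s Tables 2 and 3. The entries and their codes below are invented for illustration; in the actual study the coding was done manually by the researchers:

```python
from collections import Counter

# Hypothetical manually coded wiki entries; the real study coded actual
# student discussions, journals, and interviews into these four types.
coded_entries = [
    ("Carbohydrates are found in rice.", "Learner-Content"),
    ("I agree with your list for our meal.", "Learner-Learner"),
    ("Teacher, is coconut milk a fat?", "Learner-Instructor"),
    ("How do I edit this wiki page?", "Learner-Interface"),
    ("Proteins: chicken and tofu.", "Learner-Content"),
]

counts = Counter(code for _, code in coded_entries)
total = sum(counts.values())

# Frequency table with percentages, analogous to the article's summaries.
for interaction_type, n in counts.most_common():
    print(f"{interaction_type}: {n} ({100 * n / total:.1f}%)")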

Research Findings and Discussions

After collecting and coding all communications, DeWitt et al. quantified the frequency of each type of interaction and summarized it in Table 2. They noted the low amounts of Learner – Learner and Learner – Interface communication, which they believed was due to discussions occurring outside of the wiki, such as in face-to-face meetings between the students. They also pointed to studies by Ertmer et al. (2011) and Huang (2010), who both carried out research on online interactions on wikis and saw a similar lack of interaction.

To examine the extent to which CPS encouraged social and cognitive processes, the researchers further divided the communications into the categories seen in Table 1 and reported the frequency of those interactions in Table 3. They noted that most interactions were cognitive processes at a total of 69.3%, followed by teaching processes at 12.9% and social processes at 4%. DeWitt et al. attributed the scarcity of Triggering and Exploration cognitive processes to students believing the wiki should contain solutions to the problem, or to those processes occurring in face-to-face discussions that were not captured by the researchers.


The third aspect of the research looked at the effectiveness of the CPS module for learning about food classes, which was assessed by examining students’ pre-test and post-test scores. They found a significant difference between the means of the paired tests, as seen in Table 4. This led DeWitt’s team to conclude that their study showed that applying the CPS method on a wiki allowed for varying types of interactions, promoted social and cognitive processes in learners, and resulted in an improvement of students’ knowledge.
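
For readers unfamiliar with paired testing: a paired t-test is the standard analysis for this pre/post design. Assuming that is what the team used (the article’s exact procedure isn’t reproduced here) and using purely hypothetical scores, the calculation would look roughly like this:

```python
from scipy import stats

# Hypothetical pre/post scores for the same students (paired by position);
# the study's actual raw data are not published.
pre  = [45, 52, 38, 60, 41, 55, 48, 50]
post = [58, 61, 50, 72, 49, 70, 55, 63]

t_stat, p_value = stats.ttest_rel(pre, post)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
# A p-value below 0.05 would support the claim that post-test scores
# differ significantly from pre-test scores for the same students.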

Applying O’Cathain’s Proposed Framework of Quality

After reading my selected article, I found myself questioning portions of the research process. Thus, I re-examined DeWitt’s study under the lens of the Comprehensive Framework proposed by O’Cathain (2010). I applied each domain to the whole study and considered whether I could clearly state the presence of each element within the article. Bolded elements indicate where the quality of the research was in question.

  1. Planning Quality
  • Foundational element: clearly stated in introduction.
  • Rationale Transparency: doesn’t explicitly state the type of research being conducted, or why the mixed methods approach was taken.
  • Planning Transparency: purpose of study clearly outlined.
  • Feasibility: study could be completed within a short timeframe.
  2. Design Quality
  • Design Transparency: design type is known and process was described.
  • Design Suitability: the mixed methods approach may be most convenient, but the qualitative and quantitative elements feel unsuitable. For instance, comparing pretest and posttest scores of the online CPS module against traditional in-person teaching would provide a far stronger argument about differences in effectiveness. Similarly, the lack of certain interactions, such as learner-learner, doesn’t necessarily mean students were interacting face-to-face; an actual absence of interaction, or interaction with people outside the student group, is a strong possibility.
  • Design Strength: the study was not optimized for breadth, as the test scores were analyzed according to standards set by just 2 Biology teachers. The depth of the study, in terms of coding interactions into different categories, was also prone to bias, as it was done by just 2 researchers.
  • Design Rigor: rigor is questionable, as the researchers included a “noise” category for interactions. I strongly believe those student comments were not completely without rationale and should be considered affective attitudes instead. Furthermore, 3 weeks of collaboration between 31 students without any mention of team-building or apparent scaffolding to facilitate collaboration could account for the lack of interactions between students.
  3. Data Quality
  • Data Transparency: collection method and data were available.
  • Data Rigor: the collection of student interactions was not conducted with rigor, particularly given the possibility of face-to-face discussions that could not be collected.
  • Sample Adequacy: 31 student volunteers were not an adequate sample size, nor representative of usual classroom dynamics, which include disengaged students.
  • Analytic Adequacy: Qualitative aspects of describing student interactions relied on interpretations of just 2 researchers.
  • Analytic Integration Rigor: not implemented with rigor, as the transformation of qualitative data (categorization of comments) into quantitative data (frequencies) was conducted and checked by just 2 researchers.
  4. Interpretive Rigor
  • Interpretive Transparency: clear which findings came from which method.
  • Inference Consistency: some consistency between inferences and findings, although the explanation for the lack of interactions such as learner-learner or learner-interface was not completely adequate. Students may actually have been working individually without collaborating, or may not have remembered how to contribute to the wiki page.
  • Theoretical Consistency: findings consistent with current knowledge.
  • Interpretive Agreement: others are likely to reach similar conclusions based on the findings.
  • Interpretive Distinctiveness: conclusions are more credible than other possibilities.
  • Interpretive Efficacy: meta-inferences appropriately incorporate both the qualitative and quantitative findings.
  • Interpretive Bias Reduction: bias-reduction steps were not taken, as the research team comprised staff at the same university.
  • Interpretive Correspondence: Inferences correspond to purpose of research study.
  5. Inference Transferability
  • Ecological Transferability: difficult to apply the findings to other contexts, such as different subjects, or other settings like schools outside of Malaysia.
  • Population Transferability: difficult to apply the findings to other populations, such as rural communities lacking internet access, or to regular classroom dynamics that include students who require learning assistance or have Individual Education Plans (IEPs).
  • Temporal Transferability: Has potential for further research or future policies.
  • Theoretical Transferability: Has potential to be re-assessed using different research method or different tools for analyzing findings.
  6. Reporting Quality
  • Reporting Availability: report assumed to be successfully completed within time and budget.
  • Reporting Transparency: report assumed to adhere to Good Reporting of A Mixed Methods Study (GRAMMS).
  • Yield: report provides worthwhile result compared to two individual studies.
  7. Synthesizability
  • I applied the Mixed Methods Appraisal Tool by Hong et al. (2018) to this article as well.

  8. Utility Quality
  • Findings from the article hold potential value for researchers, educators, and curriculum designers.

 

To summarize, I had various questions and concerns regarding the chosen article, which were clearly highlighted and described in detail once the framework was applied. Had those markers been more noticeable or considered, I believe the overall quality of the research would have increased. From my perspective, the research by DeWitt et al. is worth re-examining under more stringent conditions. One example was suggested by the article itself: reduce the possibility of face-to-face discussions by separating students geographically. Another consideration is to run concurrent classes comparing an in-person versus a purely online CPS module. This would still be feasible within the same time frame and would reduce confounding variables, such as differences in content, to allow an accurate assessment of the delivery method. The researchers could also provide more transparency in how they categorized students’ communications, as well as seek agreement from more than 2 researchers to reduce bias in interpretation.
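
If that concurrent-classes comparison were run, the analysis might look roughly like the sketch below, which compares hypothetical pre-to-post gain scores between the two conditions; the independent-samples (Welch’s) t-test and all numbers are my own assumptions, not the article’s method:

```python
from scipy import stats

# Hypothetical gain scores (post minus pre) for two concurrent classes,
# as in the comparison design suggested above; all numbers are invented.
gains_online    = [13, 9, 12, 12, 8, 15, 7, 13]
gains_in_person = [10, 11, 7, 9, 12, 8, 10, 6]

# Welch's t-test avoids assuming equal variances between the two classes.
t_stat, p_value = stats.ttest_ind(gains_online, gains_in_person, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
# A significant result here would speak to the delivery method itself,
# since both classes would cover the same content in the same time frame.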

 

 

 

References

DeWitt, D., Alias, N., Siraj, S., & Spector, J. M. (2017). Wikis for a collaborative problem-solving (CPS) module for secondary school science. Educational Technology & Society.

Ertmer, P. A., Newby, T. J., Liu, W., Tomory, A., Yu, J. H., & Lee, Y. M. (2011). Students’ Confidence and Perceived Value for Participating in Cross-Cultural Wiki-Based Collaborations. Educational Technology Research and Development, 59(2), 213–228.

Hillman, D. C. A., Willis, D. J., & Gunawardena, C. N. (1994). Learner-interface interaction in distance education: An extension of contemporary models and strategies for practitioners. The American Journal of Distance Education, 8(2), 30-42.

Hong, Q.N., Pluye, P., Fabregues, S., Bartlett, G., Boardman, F., Cargo, M., Dagenais, P., Gagnon, M-P., Griffiths, F., Nicolau, B., O’Cathain, A., Rousseau, M-C., & Vedel, I., (2018). Mixed Methods Appraisal Tool (MMAT) Version 2018. McGill University, Department of Family Medicine. Retrieved from http://mixedmethodsappraisaltoolpublic.pbworks.com/w/file/fetch/127916259/MMAT_2018_criteria-manual_2018-08-01_ENG.pdf

Huang, W.-H. D. (2010). A Case Study of Wikis’ Effects on Online Transactional Interactions.

Ministry of Education Malaysia (MOE). (2013). The Malaysia education blueprint 2013 –2025: Preschool to post secondary education. Putrajaya, Malaysia: Ministry of Education Malaysia. Retrieved from https://www.ilo.org/dyn/youthpol/en/equest.fileutils.dochandle?p_uploaded_file_id=406

Moore, M., & Kearsley, G. (2005). Distance education: A systems view (2nd ed.). Ontario, Canada: Thomson Wadsworth.

O’Cathain, A. (2010). Assessing the Quality of Mixed Methods Research: Toward a Comprehensive Framework. In A. Tashakkori & C. Teddlie, SAGE Handbook of Mixed Methods in Social & Behavioral Research (pp. 531–556). https://doi.org/10.4135/9781506335193.n21

 

Where did Guidance go?

As a surviving Online / Distance Learning (DL) teacher of the 2018/2019 school year, I am ready and eager to burn through the collective garbage from the past year and hopefully plant some meaningful scaffolds to build my future courses upon its charred ashes.
Gone will be 30-page “learning guides” that students must complete (I’m glad you figured out there’s an answer key attached at the back)
Gone will be irrelevant “projects” that are absent of logic and purpose (make a powerpoint of your progress through an interactive game? really?)
Gone will be “unit tests” of 30 multiple choice questions (your answer isn’t listed? Gee, I wonder why?)
The reigning paradigm of Problem-Based Learning (PBL) stands tall and all shall follow suit.

“Book Burning” by Jason Verwey is licensed under CC BY-NC-SA 2.0
“Superficial Learning Engagement” by ransomtech is licensed under CC BY-NC-SA 2.0

Teaching for Meaningful Learning: A Review of Research on Inquiry-Based and Cooperative Learning.

Barron and Darling-Hammond summarize the shift from the traditional transmission of knowledge towards obtaining knowledge through experience, namely problem-solving. Much of their reasoning will be familiar to anyone who has gone through teacher education or teacher training in the last decade or two.

  • Current and future demands for employment are more complex, requiring problem-solving & collaboration skills.
  • Traditional instruction does not prepare students for those challenges.
  • Various research showing benefits of PBL methodology.
    • Higher (Boaler, 1997, 1998) or comparable standardized test scores (Penuel, Means, & Simkins, 2000).
    • Better mastery of transferable skills, i.e., defining problems, hypothesizing, (re)testing, supporting & arguing with rational logic
      (Gallaghers, Stepien, & Rosenthal, 1992; Gallaghers, Stepien, & Workman, 1993; Lundeberg, Levin, & Harrington, 1999; Savery & Duffy, 1996; Williams, 1992).
    • Improved social interactions & collaboration (Cohen et al., 1982; Cook et al., 1985; Hartley, 1977; Ginsburg-Block, Rohrbeck, & Fantuzzo, 2006; Johnson & Johnson, 1989).

The mounting evidence against guidance-centered learning should be enough for most educators to re-think their delivery approach and consider trading in for the new vehicle of learning. Hush, nagging doubts; the numerous researchers on this topic cannot be wrong when they all reach the same consensus.

Why Minimal Guidance During Instruction Does Not Work: An Analysis of the Failure of Constructivist, Discovery, Problem-Based, Experiential, and Inquiry-Based Teaching.

Mistakes were made.

“The major fallacy of this [minimal guidance] is that it makes no distinction between the behaviors and methods of a researcher who is an expert practicing a profession and those students who are new to the discipline and who are, thus, essentially novices.” (Kirschner, Sweller, & Clark, 2006).

I often butted heads with my Faculty Associate during my teaching Professional Development Program / Post-Degree Program (PDP). While I’m happy to acquiesce to the current PBL trend, something always felt off about disregarding prescribed knowledge and content in favor of students constructing their own learning. After reading the article, now I know why.

  • If I were a researcher with no experience as an educator, I would obviously use standardized exams as a benchmark for scoring improvements in students’ learning based on the method of instruction. After all, these marks should be free of an individual educator’s bias toward appearing more competent, and should be more applicable across the board. Herein lies my first concern: what exactly do the standardized exams assess, and how does that relate to the teaching style?
    Past Provincial and AP exams had more questions analogous to worked examples and fewer problem-based varieties, whereas the current Numeracy Assessment is the opposite. If PBL students scored comparably or higher in the former scenario, one could conclude PBL to be more effective. Conversely, if traditionally guided students scored more favorably on the latter type, one could argue that the PBL approach may not be as beneficial as it appears. What would potentially invalidate such research is each group simply scoring favorably on the exams that reflect the type of learning they received. Readers, especially educators, need to examine findings from research on instructional approaches more closely to see whether the appropriate experiment and analysis have been carried out.
  • Students being researched can provide valuable feedback to support findings about their learning under different teaching styles. However, we know the teenage brain is not fully mature until the mid-twenties (for some guys, even later). Therefore, student self-reporting carries the issue of: do they know what they know?
    Clark (1982) noted that “less able learners who choose less guided approaches tend to like the experience even though they learn less from it” (Kirschner, Sweller, & Clark, 2006). It’s not surprising that students who dislike traditional guidance would view PBL more favorably; it’s designed to be more engaging. This is not to say we should ignore all student input, but what they perceive as success may not be the same for teachers, parents, administrators, or researchers. It would be noteworthy to find more recent data that properly compares students’ perceived success against their academic standing.
  • More about the brain (B.Sc. with a Bio major here): I absolutely love how this article goes into detail about how problem solving cannot occur effectively without a large pool of resources from experience. As a senior high school science teacher, I’ve had to extensively grapple with the tension between content-heavy instruction that gives students the tools to solve problems (worked examples), versus a scaffolded problem scenario for them to slowly work their way toward the answer. As evidenced by today’s MEd orientation / information overload, working memory can process less than a handful of novel items at once, and that information is quickly lost if not re-visited promptly. If grown, working adults struggle to accomplish this, how do we expect our students to do the same in a limited 60-80 minutes of instruction time? This is why I’ve begun leaning back towards more instruction-centered designs, where students are shown at least one method or example and demonstrate their mastery of it before being allowed to challenge the higher-difficulty, open-ended scenarios; the same conclusion Kirschner, Sweller, and Clark (2006) came to.
  • Most educators would agree that expecting a class to hold a meaningful debate is nigh-impossible without participants having some background knowledge to anchor their logic and reasoning and to draw supporting arguments from. Here is where guidance shines, and where minimal guidance waits its turn.