Where are my Tweets?

#notallteachers

It’s 2019 and although I do not live in/under a rock, I still have no idea what this means.

Throughout the week, our class has been discussing the use of social media for professional purposes such as promoting an individual’s digital presence/citizenship, creating a professional (learning) network, or connecting with colleagues and like-minded individuals. We even had a guest lecturer, Alec Couros, come and discuss how social media and technology are modernizing the way individuals learn and providing open access to learning opportunities. YouTube is a popular social platform with hundreds of thousands of educational videos on academic subjects or just “simple” life skills, sparing learners from costly in-person lessons. Sites like Khan Academy, Wolfram Alpha, PhotoMath, or Desmos became known to me through mentions on social media networks, web advertisements, or Professional-Development (Pro-D) workshops. Furthermore, in the article “Twitter Use and its Effects on Student Perception of Instructor Credibility”, the researchers concluded that students preferred instructors having a professional Twitter presence, as it helps support their credibility and makes them appear more caring and more human, as opposed to an instrument of knowledge. They also cited several other studies with similar findings: Martin, Mottet, & Chesebro (2009), Brookfield (2006), and McArthur & Bostedo-Condway (2012). As these aspects directly and indirectly improve student engagement and motivation in class, it’s in the best interest of fledgling teachers like myself (who’ve yet to lose faith in students of today) to implement this. While I’m no stranger to exploring new media or technology, when it comes to my own transition from spectator to active participant in the digital world, I have some reservations.

One of the passages that stood out to me was: “student/teacher relationship should be left inside of school, not social life or social media” (DeGroot et al., 2015). While this is next to impossible to accomplish in our modern society, unless one somehow lives completely “off the grid”, aspects of this idea are still ingrained in me. As a teacher certified by the Ministry of Education of BC, have I not already demonstrated my credibility as a competent teacher? There is also the question of how authentic a Twitter profile has to be to justify its credibility. Having numerous tweets, followers, or likes is one indicator of the breadth of an online presence, but if it is built on shallow, trendy comments, it tells us nothing about a person’s (or egg’s) actual credibility. There is also the issue of individuals or companies paying to artificially boost their followings or retweets (side blog: Hashtags Are Worthless); not the healthiest foundation to build any relationship on. This is not to say teachers’ social media presence can’t ever be trusted (Need help – appropriateness around hyperlink search results on public websites like RateMyProf). I simply believe that teachers should not rush to establish themselves online just because research findings support its benefits; DeGroot et al. also mentioned how college students’ perspectives on the use of social media have likely changed over time, as prior research found results contradicting theirs. “If the history of educational technology teaches researchers anything then it is this: what begins as fresh, innovative and edgy quickly evolves to tired, redundant and banal” (Brabazon, 2012). This statement can be read as a caution against expecting to instantly boost student motivation simply by creating an authentic social media profile. Students “before” may have seen instructors with Twitter and Instagram accounts in a positive light, but current trends on Twitter have seen it become a toxic echo chamber and a source of misinformation. If we look further back, ye olde websites have certainly taken on more negative reputations over the years: Alec described Facebook groups as gathering grounds for middle-aged housewives, and Nexopia was known for child predators (what happened to LinkedIn?).

Another potential negative consequence of instructors on social media is how it could become a distraction or a newfound source of stress. In terms of distraction, (ideally) teachers would be less likely to pull out a phone during class to check who followed or liked their tweets; the opposite is more likely for students. And a teacher wanting to show support or build a positive relationship with a student by following or “Like”-ing them would generate all kinds of inferences from students or student groups. There’s also the possibility of additional stress and depression over a lack of replies or poor ratios on their social media accounts (Social Media Usage & Well-Being).

And so I believe my resistance to joining the social media trend is, at minimum, not rooted in irrational fear (I just spent over an hour searching for readings). Perhaps I will continue to be the observer of my cohort’s experiences with it in the meantime. After all, I’m having enough trouble keeping up with the course readings to analyze 140 more characters of wisdom.

#aintnobodygottime

Where’s My Comparison? Assignment 2 – EDCI 515

Background Information about the Article

Ever since British Columbia (BC) modernized its K-12 curriculum and began implementing it in 2016, teachers have been re-visiting their pedagogies to shift from content-focused teaching to inquiry-based or problem-solving approaches. As a high school educator who teaches both blended- and distance-learning classes, I have been looking for models of online learning spaces where students work collaboratively. The article I chose, “Wikis for a Collaborative Problem-Solving (CPS) Module for Secondary School Science” (DeWitt, Alias, Siraj, & Spector, 2017), used Wiki pages as the platform for students’ online collaboration. Their research looked at three aspects: (1) what types of interactions occur on the Wiki in the context of learning science, (2) to what extent CPS in a wiki encourages social and cognitive processes, and (3) to what extent the CPS module improves achievement (DeWitt et al., 2017). Delving into the background of this study, I found that DeWitt, Alias, and Siraj are associate professors at the University of Malaya, Malaysia. In addition, according to her biography, DeWitt had previously worked for Malaysia’s Ministry of Education’s Educational Technology Division. The literature supporting this study stems from a report by the Ministry of Education of Malaysia (2013) finding that mathematics and science achievement in the country has declined in the past few years, and that “few studies have examined online collaboration and problem-solving in science” (DeWitt et al., 2017). The sample consists of 31 volunteers from an urban high school in Malaysia, including students of varying academic strengths, reflective of the multiracial community, and chosen partly for the researchers’ convenience. Their findings could be useful for fellow researchers, teachers, and policy makers in Malaysia considering future directions for research, classroom implementation, or curricular development.

How the Research was Carried Out

The researchers grouped their volunteers into teams of 7 or 8 students and assigned each group a type of meal. The goal was that within three weeks, each group needed to have analyzed the food classes present in that meal. A CPS module was already developed for this activity, and students received a laptop as well as an orientation on how to use the Wiki, learning resources, and problem tasks available in the module. The researchers collected all the discussions on the wiki, student journals, and individual student interviews, then manually coded each communication as one of the following interactions:

Learner – Content: learners engaging with the content.

Learner – Learner: interactions between the students.

Learner – Instructor: interactions with the teacher.

Learner – Interface: interaction with the technology medium.

The first three categories were described by Moore & Kearsley (2005), while Learner–Interface was described by Hillman, Willis, & Gunawardena (1994). DeWitt et al. also looked at the frequency of each learner interaction and further categorized them into the social and cognitive processes shown in Table 1. Lastly, the researchers employed a pre- and post-test containing simple, open-ended questions to assess the students’ proficiency with food classes as a result of the CPS module.
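To make the tallying step concrete, here is a minimal sketch of how manually coded communications could be turned into the kind of frequency summary the article reports. This is my own illustration, not the authors’ actual procedure; the labels and sample data are hypothetical.

```python
from collections import Counter

# Hypothetical coded communications: each wiki post, journal entry,
# or interview excerpt has been manually assigned one interaction type.
coded_items = [
    "Learner-Content", "Learner-Learner", "Learner-Content",
    "Learner-Instructor", "Learner-Interface", "Learner-Content",
]

# Tally how often each interaction type occurs.
counts = Counter(coded_items)
total = sum(counts.values())

# Report counts and percentages, analogous to the article's frequency tables.
for category, n in counts.most_common():
    print(f"{category}: {n} ({100 * n / total:.1f}%)")
```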

Research Findings and Discussions

After collecting and coding all communications, DeWitt et al. quantified the frequency of each type of interaction and summarized it in Table 2. They noted the low amounts of Learner–Learner and Learner–Interface communication, which they believed was due to discussions occurring outside of the wiki, such as in face-to-face meetings between the students. They also pointed to studies by Ertmer et al. (2011) and Huang (2010), who both carried out research on online interactions on wikis and saw a similar lack of interactions.

To examine the extent to which CPS encourages social and cognitive processes, the researchers further divided the communications into the categories seen in Table 1 and posted the frequency of those interactions in Table 3. They noted most interactions were cognitive processes, with a total of 69.3%, followed by teaching processes at 12.9% and social processes at 4%. DeWitt et al. attributed the lack of Triggering and Exploration cognitive processes to students believing the Wiki should contain solutions to the problem, or to their occurring in face-to-face discussions that were not captured by the researchers.


The third aspect of the research looked at the effectiveness of the CPS module in learning about food classes, which was assessed by examining students’ pre-test and post-test scores. They found a significant difference between the means of the paired tests, as seen in Table 4. This led DeWitt’s team to conclude that their study showed that applying the CPS method on a Wiki allowed for varying types of interactions, promoted social and cognitive processes in learners, and resulted in an improvement of students’ knowledge.
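For readers unfamiliar with this kind of pre/post comparison, here is a minimal sketch of a paired t-test, the standard test for matched before-and-after scores. The scores below are invented for illustration; the article reports its own statistics in Table 4.

```python
from scipy import stats

# Hypothetical pre- and post-test scores for the same students;
# a paired test compares each student against themselves.
pre  = [45, 52, 38, 60, 47, 55, 41, 50]
post = [58, 61, 50, 72, 55, 63, 49, 64]

# Paired (dependent-samples) t-test on the score differences.
t_stat, p_value = stats.ttest_rel(post, pre)

print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
# A small p-value (e.g. < .05) indicates a significant mean difference,
# the kind of result DeWitt et al. report for their CPS module.
```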

Applying O’Cathain’s Proposed Framework of Quality

After reading my selected article, I found myself questioning portions of the research process. Thus, I re-examined DeWitt’s study under the lens of the proposed Comprehensive Framework by O’Cathain (2010). I applied each domain to the whole study and considered whether I could clearly state the presence of each element within the article. Bolded elements indicate where the quality of the research was in question.

  1. Planning Quality
  • Foundational element: clearly stated in the introduction.
  • Rationale Transparency: doesn’t explicitly state the type of research being conducted, or why the mixed-methods approach was taken.
  • Planning Transparency: purpose of the study clearly outlined.
  • Feasibility: study could be completed within a short timeframe.
  2. Design Quality
  • Design Transparency: design type is known and the process was described.
  • Design Suitability: a mixed-methods approach may be most convenient, but the pairing of qualitative and quantitative elements feels unsuitable. For instance, comparing pretest and posttest scores of the online CPS module against traditional in-person teaching would provide a far stronger argument about differences in effectiveness. Similarly, the lack of certain interactions such as learner–learner doesn’t necessarily mean students interacted face-to-face; an actual absence of interaction, or interaction with people outside the student group, is a strong possibility.
  • Design Strength: the study was not optimized for breadth, as the test scores were analyzed according to standards set by just 2 Biology teachers. The depth of the study, in terms of coding interactions into different categories, was also prone to bias, as it was done by just 2 researchers.
  • Design Rigor: rigor is questionable, as the researchers included a “noise” category in interactions. I strongly believe that student comments were not completely without rationale and should be considered affective attitudes instead. Furthermore, 3 weeks of collaboration between 31 students without any mention of team-building or apparent scaffolding to facilitate collaboration could account for the lack of interactions between students.
  3. Data Quality
  • Data Transparency: collection method and data were available.
  • Data Rigor: collection of student interactions was not conducted with rigor, particularly given the acknowledged possibility of face-to-face discussions that could not be collected.
  • Sample Adequacy: 31 student volunteers were not an adequate sample size, nor representative of a usual classroom dynamic, which would include disengaged students.
  • Analytic Adequacy: qualitative aspects of describing student interactions relied on the interpretations of just 2 researchers.
  • Analytic Integration Rigor: not implemented with rigor, as the transformation of qualitative data (categorization of comments) into quantitative data (frequency) was conducted and checked by just 2 researchers.
  4. Interpretive Rigor
  • Interpretive Transparency: clear which findings came from which method.
  • Inference Consistency: some consistency between inferences and findings, although the explanation for the lack of interactions such as learner–learner or learner–interface was not completely adequate. Students may actually have been working individually without collaborating, or may not have remembered how to contribute to the Wiki page.
  • Theoretical Consistency: findings consistent with current knowledge.
  • Interpretive Agreement: others are likely to reach similar conclusions based on the findings.
  • Interpretive Distinctiveness: conclusions are more credible than other possibilities.
  • Interpretive Efficacy: meta-inferences appropriately incorporate both the qualitative and quantitative findings.
  • Interpretive Bias Reduction: bias reduction was not undertaken, as the research team comprised staff at the same university.
  • Interpretive Correspondence: inferences correspond to the purpose of the research study.
  5. Inference Transferability
  • Ecological Transferability: difficult to apply findings to other contexts such as different subjects, or other settings like schools outside of Malaysia.
  • Population Transferability: difficult to apply findings to other population dynamics such as rural populations lacking internet access, or to regular classroom dynamics that include students who require learning assistance with Individual Education Plans (IEPs).
  • Temporal Transferability: has potential for further research or future policies.
  • Theoretical Transferability: has potential to be re-assessed using different research methods or different tools for analyzing findings.
  6. Reporting Quality
  • Reporting Availability: report assumed to be successfully completed within time and budget.
  • Reporting Transparency: report assumed to adhere to Good Reporting of A Mixed Methods Study (GRAMMS).
  • Yield: report provides worthwhile results compared to two individual studies.
  7. Synthesizability
  • I applied the Mixed Methods Appraisal Tool by Hong et al. (2018) to this article as well.
  8. Utility Quality
  • Findings from the article hold potential value for researchers, educators, and curriculum designers.


To summarize, I had various questions and concerns regarding the chosen article, which were clearly highlighted and described in detail when the framework was applied. Had those markers been more noticeable or considered, I believe it would have increased the overall quality of the research. From my perspective, the research by DeWitt et al. is worth re-examining under more stringent conditions. One example was suggested by the article itself: reduce the possibility of face-to-face discussions by separating students geographically. Another consideration is to offer concurrent classes comparing in-person teaching with the purely online CPS module. This would still be feasible to conduct in the same time frame and would reduce confounding variables, such as differences in content, allowing a more accurate assessment of the delivery method. The researchers could also provide more transparency in how they categorized students’ communications, as well as seek agreement from more than 2 researchers to reduce bias in interpretations.
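On that last point, inter-rater agreement can be measured rather than assumed. Cohen’s kappa is a common statistic for two coders; here is a minimal sketch using scikit-learn, with hypothetical category labels and ratings of my own invention (the article does not report such a statistic).

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical codes assigned to the same ten communications
# by two independent raters.
rater_a = ["cognitive", "social", "cognitive", "teaching", "noise",
           "cognitive", "social", "cognitive", "teaching", "cognitive"]
rater_b = ["cognitive", "social", "teaching", "teaching", "social",
           "cognitive", "social", "cognitive", "teaching", "cognitive"]

# Cohen's kappa corrects raw percent agreement for agreement
# expected by chance; 1.0 is perfect, 0 is chance-level.
kappa = cohen_kappa_score(rater_a, rater_b)
print(f"Cohen's kappa: {kappa:.2f}")
```

With more than two coders, as suggested above, an analogous measure such as Fleiss’ kappa would be used instead.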


References

DeWitt, D., Alias, N., Siraj, S., & Spector, J. M. (2017). Wikis for a collaborative problem-solving (CPS) module for secondary school science. Educational Technology & Society.

Ertmer, P. A., Newby, T. J., Liu, W., Tomory, A., Yu, J. H., & Lee, Y. M. (2011). Students’ Confidence and Perceived Value for Participating in Cross-Cultural Wiki-Based Collaborations. Educational Technology Research and Development, 59(2), 213–228.

Hillman, D. C. A., Willis, D. J., & Gunawardena, C. N. (1994). Learner interface interaction in distance education: An Extension of contemporary models and strategies for practitioners. The American Journal of Distance Education, 8(2), 30-42.

Hong, Q.N., Pluye, P., Fabregues, S., Bartlett, G., Boardman, F., Cargo, M., Dagenais, P., Gagnon, M-P., Griffiths, F., Nicolau, B., O’Cathain, A., Rousseau, M-C., & Vedel, I., (2018). Mixed Methods Appraisal Tool (MMAT) Version 2018. McGill University, Department of Family Medicine. Retrieved from http://mixedmethodsappraisaltoolpublic.pbworks.com/w/file/fetch/127916259/MMAT_2018_criteria-manual_2018-08-01_ENG.pdf

Huang, W.-H. D. (2010). A Case Study of Wikis’ Effects on Online Transactional Interactions.

Ministry of Education Malaysia (MOE). (2013). The Malaysia education blueprint 2013–2025: Preschool to post-secondary education. Putrajaya, Malaysia: Ministry of Education Malaysia. Retrieved from https://www.ilo.org/dyn/youthpol/en/equest.fileutils.dochandle?p_uploaded_file_id=406

Moore, M., & Kearsley, G. (2005). Distance education: A systems view (2nd ed.). Ontario, Canada: Thomson Wadsworth.

O’Cathain, A. (2010). Assessing the Quality of Mixed Methods Research: Toward a Comprehensive Framework. In A. Tashakkori & C. Teddlie, SAGE Handbook of Mixed Methods in Social & Behavioral Research (pp. 531–556). https://doi.org/10.4135/9781506335193.n21


Where did the Classroom go?

Sharing a Pragmatic Model for Open Pedagogy

Well, I see where they got the idea to build this course.

This article presents the benefits of Open Learning plus a concrete, applicable approach to implementing it. Our current pedagogical paradigm has shifted away from the Sage-on-Stage idea of traditional teaching, but current teachers are still struggling with how to implement and assess Problem-Based Learning. I believe a portion of this issue lies in the fact that some of us (i.e., myself) are still struggling to assess students based on Proficiency (one’s mastery) versus Progress (one’s improvement). While I’d love to discuss the issues with favoring one or the other, let us return to the discussion of practical Open-Learning strategies, as well as some concerns.

  1. “If the history of educational technology teaches researchers anything then it is this: what begins as fresh, innovative and edgy quickly evolves to tired, redundant and banal” (Brabazon, 2012).
    • I feel this statement over-generalizes the advancement and retirement of technology. While the basic idea is undoubtedly true (open Wi-Fi and personal devices are the norm), that doesn’t mean “old” technology becomes moot. Michaels and Opus would’ve gone out of business by now if all art teachers had adopted drawing software and 3D imaging technologies. Some tech never becomes redundant despite being ancient. People may enjoy the smell of freshly mixed paint, the motion of brushing across a canvas, or simply not having to look at a screen all day.
  2. Cognitive Presence, Teaching Presence, Social Presence are correlated to student satisfaction and perceived learning.
    • Cognitive presence is important? Shocking. Terrible news for my students who believe they can pass by vegetating in their chairs. However, they have succinctly advocated for their supposed learning and, in particular, their satisfaction with being cognitively absent.

      “Never sleeping in class had felt so good…” by ogaudemar is licensed under CC BY-NC-SA 2.0
    • Teaching presence – if the way I teach has absolutely no effect on students and learning, then I should be awarded my Master’s and a Ph.D. now. Thank you.
    • Social Presence is a thorny issue. Working at a Distance Learning school, my students are usually a) busy with work / sport, or b) somewhere on the spectrum of anxiety. That is why they chose this program in the first place. Excluding the former (they can find the time to post if they truly try), the latter may be uncomfortable with, or completely opposed to, sharing in any format. In this scenario, Open Learning would be devastating to them: the constant stress and fear of having to participate or show their learning would overwhelm them, or cause them to shut down and isolate themselves. Hence, a traditional approach would be more logical in this case.
      “Shoes on the window ledge” by therealhussy is licensed under CC BY-NC 2.0

      Here’s another issue: cultural difference (especially in Asian cultures). As someone who’s been both a student and a teacher on both sides of the globe, I can say there is a huge cultural difference in the aversion towards any show of ignorance or inability. People are judged severely by their actions, so the idea of presenting one’s failures and learning from them could cut off future opportunities. Those learners are more comfortable with individual learning and mastering before making a public display. It will be interesting to see whether/when their pedagogical paradigm will shift away from the current closed-learning, content-driven model.

  3. Structure/Scaffold for Open Learning.
    • Some solid frameworks on how to set up this process, as well as caveats.
      • Open Learning Design Interventions – build relationships between teacher and students -> build digital literacy -> intentional collaboration/connection/interaction -> personal learning network. Great analogy using a beehive.
      • Graham (2018) suggests this:
      • I’m glad the article mentions the absolute need to create and promote a safe learning environment, including digital literacy. The issue is less likely to occur in higher education, but students in K-12 have very little awareness of having an online presence (see: ice-cream licking suspect; why would you post that?)
      • Lacking standards and expectations around individual sharing, commenting, and constructive criticism, the model of Open Learning would quickly fall apart as students fear others’ judgement. Another possibility would be the environment becoming an echo chamber of toxicity or misinformation.
      • Another obstacle that impedes success would be parental consent and FOIPPA concerns. Successful implementation at K-12 levels requires teachers and admins to ensure protocols are followed and respected, including alternatives for those who refuse to provide consent.

Closing thoughts:

Open Pedagogy is an interesting avenue to explore going forward and fits perfectly with our teacher autonomy, but how do we implement it in more content-driven courses like Math & Sciences (especially Biology)? And what about the standardized exams?

Where did Guidance go?

As a surviving Online / Distance Learning (DL) teacher of the 2018/2019 school year, I am ready and eager to burn through the collective garbage from the past year and hopefully plant some meaningful scaffolds to build my future courses upon its charred ashes.
Gone will be 30-page “learning guides” that students must complete (I’m glad you figured out there’s an answer key attached at the back)
Gone will be irrelevant “projects” that are absent of logic and purpose (make a powerpoint of your progress through an interactive game? really?)
Gone will be “unit tests” of 30 multiple choice questions (your answer isn’t listed? Gee, I wonder why?)
The reigning paradigm of Problem-Based Learning (PBL) stands tall and all shall follow suit.

“Book Burning” by Jason Verwey is licensed under CC BY-NC-SA 2.0
“Superficial Learning Engagement” by ransomtech is licensed under CC BY-NC-SA 2.0

Teaching for Meaningful Learning: A Review of Research on Inquiry-Based and Cooperative Learning.

Barron and Darling-Hammond summarize the shift from the traditional transmission of knowledge towards obtaining knowledge through experience, namely problem-solving. Much of their reasoning will be familiar to those who have gone through teacher education or teacher training in the last decade or two.

  • Current / future demands of employment are more complex and require problem-solving & collaboration skills.
  • Traditional instruction does not prepare students for those challenges.
  • Various research showing benefits of PBL methodology.
    • Higher (Boaler, 1997, 1998) or comparable standardized test scores (Penuel, Means, & Simkins, 2000).
    • Better mastery of transferable skills, i.e. defining problems, hypothesizing, (re)testing, supporting & arguing with rational logic
      (Gallaghers, Stepien, & Rosenthal, 1992; Gallaghers, Stepien, & Workman, 1993; Lundeberg, Levin, & Harrington, 1999; Savery & Duffy, 1996; Williams, 1992).
    • Improve social interactions & collaboration (Cohen et al., 1982; Cook et al., 1985; Hartley, 1977; Ginsburg-Block, Rohrbeck, & Fantuzzo, 2006; Johnson & Johnson, 1989).

The mounting evidence against guidance-centered learning should be enough for most educators to re-think their delivery approach and consider trading in for the new vehicle of learning. Hush, nagging doubts; the numerous researchers on this topic cannot be wrong when they all reach the same consensus.

Why Minimal Guidance During Instruction Does Not Work: An Analysis of the Failure of Constructivist, Discovery, Problem-Based, Experiential, and Inquiry-Based Teaching.

Mistakes were made.

“The major fallacy of this [minimal guidance] is that it makes no distinction between the behaviors and methods of a researcher who is an expert practicing a profession and those students who are new to the discipline and who are, thus, essentially novices.” (Kirschner, Sweller, & Clark, 2006).

I often butted heads with my Faculty Associate during my teaching Professional Development Program / Post-Degree Program (PDP). While I’m happy to acquiesce to the current PBL trend, something always felt off about disregarding prescribed knowledge and content in favor of students making their own learning. After reading this article, now I know why.

  • If I were a researcher with no experience as an educator, I would obviously use standardized exams as a benchmark for scoring improvements in students’ learning based on the method of instruction. After all, these marks should be free of an educator’s individual bias to appear more competent, and should be more applicable across the board. Herein lies my first concern: what exactly do the standardized exams assess, and how does that relate to the teaching style?
    Past Provincials and AP Exams had more questions analogous to worked examples and fewer problem-based varieties, whereas the current Numeracy Assessment exam is the opposite. If PBL students scored comparably or higher in the former scenario, then one could conclude that PBL is more effective. Conversely, if traditional-guidance students scored more favorably on the latter type, then one can argue that the PBL approach may not be as beneficial as it appears. What would potentially invalidate such research is each group scoring favorably only on the exams that reflect the type of learning they received (a toy illustration of this confound appears after this list). Readers, especially educators, need to examine findings from research on learning styles more closely to see whether the appropriate experiment and analysis have been carried out.
  • Students being researched can provide valuable feedback on their learning from experiencing different teaching styles, which helps support a study’s findings. However, we know the teenage brain is not fully mature until the mid-twenties (for some guys, even later). Therefore, student self-reporting raises the issue: do they know what they know?
    Clark (1982) noted that “less able learners who choose less guided approaches tend to like the experience even though they learn less from it” (Kirschner, Sweller, & Clark, 2006). It’s not surprising that students who dislike traditional guidance would view PBL more favorably; it’s designed to be more engaging. This is not to ignore all students’ input, but what they perceive as success may not be the same to teachers, to parents, to administrators, or to researchers. It would be noteworthy to find more recent data that properly analyzes students’ perceived success in comparison to their academic standing.
  • More about the brain (B.Sc. with a Bio major here): I absolutely love how this article goes into detail about how problem solving cannot occur effectively without a large pool of resources from experience. As a senior science teacher in high school, I’ve had to extensively grapple with the issue of content-heavy instruction that provides students the tools to solve problems (worked examples), versus a scaffolded problem scenario for them to slowly work their way toward the answer. As evidenced by today’s MEd orientation / information overload, working memory can process only a handful of novel pieces of information at once, and that information is quickly lost if not re-visited promptly. If grown, working adults struggle to accomplish this, how do we expect our students to do the same in a limited 60-80 minute instruction time? This is why I’ve begun leaning back towards more instruction-centered designs where students need to be shown at least one method or example and demonstrate their mastery of it, before being allowed to challenge higher-difficulty, open-ended scenarios; the same conclusion Kirschner, Sweller, and Clark (2006) came to.
  • Most educators would agree that expecting a class to hold a meaningful debate is nigh-impossible without participants having some background knowledge from which to anchor their reasoning and draw supporting arguments. Here is where guidance shines, and where minimal guidance waits its turn.
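As promised above, here is a toy illustration of the exam-format confound. All numbers are fabricated for illustration; no real study data is used.

```python
from scipy import stats

# Hypothetical exam scores (out of 100) for two instruction styles,
# each assessed on two exam formats.
scores = {
    ("PBL", "worked-example exam"):         [62, 70, 58, 66, 73, 61],
    ("PBL", "problem-based exam"):          [78, 82, 75, 80, 85, 77],
    ("traditional", "worked-example exam"): [80, 84, 77, 82, 88, 79],
    ("traditional", "problem-based exam"):  [60, 65, 58, 63, 70, 61],
}

# Compare the two instruction styles on each exam format separately.
for exam in ("worked-example exam", "problem-based exam"):
    pbl, trad = scores[("PBL", exam)], scores[("traditional", exam)]
    t, p = stats.ttest_ind(pbl, trad)
    print(f"{exam}: PBL mean={sum(pbl)/len(pbl):.1f}, "
          f"traditional mean={sum(trad)/len(trad):.1f}, p={p:.3f}")

# In this fabricated data, each group wins only on the exam format that
# matches its own instruction style, so neither comparison alone can
# tell us which approach is "better" overall.
```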
