All posts by Isabeau Iqbal

Small Group Instructional Feedback

I conducted a small group instructional feedback (SGIF) session last week. In this post, I share the process I used for the in-class portion.


SGIF is a formative, mid-course check-in process for gathering information from students about their learning experience. As with all mid-course evaluations, the advantage is that the instructor can respond to the feedback during the course itself (whereas information gathered from end-of-course evaluations can only be applied to a future offering of the course). SGIF is initiated by the instructor and helps foster dialogue between the instructor and students.

There are many ways to conduct a SGIF; an internet search will turn up several variations. Here is what I opted for, once the instructor and I had met to discuss aspects of her teaching and arranged a date and time for the SGIF.

1. The instructor introduces me and leaves the room (during the previous class, she had told students this process would take place).

2. I thank the students and let them know a bit more about me and what this is about. Things I say include:

  • I work with faculty members across campus on enhancing teaching and learning.
  • Your instructor has requested this process, which will give her feedback on her teaching in this course.

3. I outline the overall process. Points covered include:

  • You are going to answer some questions individually, then in small groups. Within the next few days, I will share your comments with the instructor anonymously [she will not see your writing or original papers]. Your instructor will report back to you on your feedback and her reflections/decisions within the next week or so.
  • Unlike end-of-course student evaluations of teaching, this process allows the instructor to respond right away, so you all benefit directly from it.

4. I encourage students to be constructive in their feedback. I mention:

  • Inviting me to class to do this takes a lot of courage on your instructor’s part. As you’re answering these questions, please be constructive and specific. This is not an opportunity to lash out in frustration, but rather to be professional and helpful in giving feedback that will help make your experience in this course even better.

[all the above takes approximately 5 minutes]

5. Students individually respond to the following three questions, which I have copied onto a half-page of paper and distributed to each student. [5 minutes]

  • In what ways has your instructor been supporting your learning in this course? Please give examples.
  • How could your instructor support your learning more effectively in this course? Please give examples.
  • Other comments you would like to make about the course and/or instructor that might strengthen your learning in this course.

6. Students get into groups of 3-5 and individually share their responses to the first question only. Then, they find at least 2 points on which they all agree (for the first question). They write these down on the group sheet. [5 minutes]

7. They repeat the above process for Question 2. [5 minutes]

8. As a whole class, each group shares aloud one of their consensus points for Question 1. They do the same for Question 2. [5 minutes]

9. If time allows, and in their small groups only, they find consensus points for Question 3.

10. I thank the class and gather all the papers.

I am done within half an hour, and the instructor returns.

The SGIF process involves several more steps, but this post looks only at the in-class portion. If you’d like to find out more, I encourage you to visit:

 

Photo by Caleb Roenigk: https://flic.kr/p/brNqFE

Calibrated Peer Review: An introduction


I am enthusiastically involved in a project in which I am helping instructors implement, in their courses, writing assignments that use student peer feedback (see note 1). I am loving this initiative and the learning; plus, it is a neat extension of the work I have been doing on peer review of teaching.

Today’s post is a brief introduction to Calibrated Peer Review (CPR), a web-based writing and peer review tool that is being used in one of the re-designed courses.

Calibrated Peer Review (CPR) automates the distribution of writing assignments to students and then manages a four-step peer review process in which students:

  1. Submit a writing assignment
  2. Undergo a process whereby their review skills are calibrated
  3. Review peers’ writing, and
  4. Assess their own writing assignment

The instructor need not grade the assignments; the CPR system compiles grades automatically (Likkel, 2012; Schneider, 2015).

According to the CPR website, compelling reasons to use CPR include that it:

  • Allows students to hone their writing skills
  • Helps students learn to use higher-order thinking skills (in the writing and review process)
  • Promotes students’ critical thinking abilities
  • Encourages students to gain a deeper understanding of the topic and discipline
  • Reduces time needed by instructors to grade

With the exception of the last point, research on CPR has drawn varied conclusions about its effectiveness for the above.

Some questions on which the literature remains inconclusive are:

  • Do the students’ writing skills improve?
  • Does engaging in the process promote students’ conceptual understanding of X?
  • Do students feel more confident as writers?
  • Do students believe the CPR process helped them augment their conceptual understanding of X and/or become better writers?

I cannot yet comment on the above from personal experience because the CPR assignment I have been working on launches next week. I can attest to the fact that, though instructor load may be lightened overall (i.e., when CPR is used in multiple assignments and/or in subsequent iterations of the same course), the time involved in getting to know CPR and setting up the assignment has been significant.

Note 1: The project I am working on is a Teaching and Learning Enhancement Fund granted initiative. See here and search for “Bradley” (the principal investigator) for brief information about that TLEF.

References:

Likkel, L. (2012). Calibrated Peer Review™ essays increase student confidence in assessing their own writing. Journal of College Science Teaching, 41(3), 42-47.

Schneider, S. C. (2015). Work in progress: Use of Calibrated Peer Review to improve report quality in an electrical engineering laboratory. Paper presented at the 2015 American Society for Engineering Education Zone III Conference, Springfield, MO.

Overview of Calibrated Peer Review (2016). Retrieved from http://cpr.molsci.ucla.edu/Overview.aspx

Photo by CollegeDegrees360: https://flic.kr/p/cEJnWs, CC BY-SA 2.0

 

Research Projects – Past


Social Network Analysis in Teaching and Learning

Manuscript: Poole, G., Iqbal, I., & Verwoord, R. (2018). Small significant networks as birds of a feather. International Journal for Academic Development, 1-12. https://doi.org/10.1080/1360144X.2018.1492924

Project: With Dr. Gary Poole and Roselynn Verwoord, I examined how post-secondary instructors use significant networks to support their professional growth as teachers and SoTL scholars.

Based on our analysis of existing research on small significant networks pertaining to teaching and learning, our research posed the following questions:

(1) How are educators using networks in their own contexts to expand, refute or build their stories of teaching and learning and of SoTL?
(2) Do instructors perceive greater similarity among network members than among randomly chosen colleagues?
(3) Are there relationships among perceived similarity, value of interactions, and impact of the network on one’s teaching and research on teaching?
(4) What strategies can be employed to enhance the value of one’s networks?

Student Peer Feedback: Between 2016 and 2017, I worked with two UBC instructors who implemented student peer feedback approaches in their courses. Together, we carefully considered the course design, researched and selected student peer feedback approaches, implemented and evaluated them (in terms of student learning and other outcomes), and then made further modifications to the courses based on the evaluation data.

Evaluation of the Course Design Intensive: Beginning in April 2015, I led a program evaluation of the Course Design Intensive, a workshop offered through the Centre for Teaching, Learning and Technology.  (Link to more information).

The Educational Developer’s Portfolio (February 2014-February 2016). I was part of a collaborative research project inquiring into the Educational Developer’s Portfolio. Collectively, we gathered data from educational developers through World Cafes and inquired into the possibilities for the Educational Developer’s portfolio and what is needed to support a culture of portfolios.  We produced two publications from this project: (1) The Educational Developer’s Portfolio Guide (2015) and (2) a journal article titled “Exploring the potential of educational developer portfolios” in To Improve the Academy.

Team members on this project were: Jeanette McDonald (lead), Natasha Kenny, Erika Kustra, Judy Chan, Debra Dawson, and Paola Borin (and, of course, me).

Planning the Integrated Respirology Module (November-December 2014). With colleagues from Pharmaceutical Science, I conducted a program evaluation study related to an integrated Respirology Module in the Faculty of Pharmaceutical Sciences. I interviewed members of the planning team to determine what they perceived were the successes and challenges of planning an integrated module. The intention of this research was to help with future module development.

Student and Faculty Member Perceptions of the Student Evaluations of Teaching: A Qualitative Study. (July 2013-June 2014). This was a collaborative project conducted with John Lee (who was an undergraduate student in the Pharmacy program), Marion Pearson, and Simon Albon. We have had our paper accepted to Currents in Pharmacy Teaching and Learning (scheduled to be published in Spring 2016).

How do Physicians Learn to be Good Patient Educators? (January-June, 2013). This was a study conducted by Dr. Terese Stenfors-Hayes when she was a Post-Doctoral Fellow at the Centre for Health Education Scholarship. In my role as research assistant, I conducted literature reviews and carried out data analysis.

Faculty Members’ Professional Growth in Teaching Through the Summative Peer Review of Teaching and Other Departmental Practices. In 2012, I completed a doctoral research study. Please see here for information on my doctoral research.

 

Significant conversations


As I prepare for a pre-conference workshop and conference workshop at the International Society for the Scholarship of Teaching and Learning 2016, I am reading and blogging on how university instructors learn about teaching through personal networks (my four previous posts on the topic can be found here). The ISSoTL workshops are part of a research project I am collaborating on with Gary Poole and Roselynn Verwoord.

Today’s post looks at “significant conversations” and consists of my notes from a paper by Roxå and Mårtensson, two lovely people and terrific scholars I had the pleasure of meeting in 2008.

Reference: Roxå, T. & Mårtensson, K. (2009). Significant conversations and significant networks: Exploring the backstage of the teaching arena. Studies in Higher Education, 34(5), 547-559.

**************

This paper presents results from a study in which Roxå & Mårtensson surveyed 109 university instructors to learn more about their teaching and learning conversation partners. Participants were from the following disciplines: engineering studies, social sciences, humanities.

Participants responded to these questions:

  1. With how many people do you have engaging conversations about teaching and learning?
  2. Where are these conversational partners found?
  3. What characterizes your conversations? (Please describe them.)
  4. Do you consider your local professional culture to be supportive or nonsupportive of such conversations about teaching and learning? (this question was only included in the later questionnaires)

The researchers drew on Handal’s (1999) concept of critical friends to “focus the respondents on individuals with whom they had sincere and serious discussions about teaching and learning” (p.550).

Summary of results

With how many people do you have engaging conversations about teaching and learning?

  • 83% of respondents had up to 10 conversational partners (there were differences among the disciplines)
  • Roxå & Mårtensson found that “university teachers rely on a limited number of individuals to test ideas or solve problems related to teaching and learning” (p.556)

Where are these conversational partners found?

The majority of participants discussed teaching with colleagues within their own discipline. Some conversational partners, however, were located outside the individual’s institution or discipline, leading the authors to conclude that “significant networks” have no boundaries surrounding them.

What characterizes your conversations?

Private conversations:  

  • conversations rarely took place in formal meetings (they took place “backstage”1)
  • though many were backstage, the conversations were not isolated from the surrounding culture

Trustful conversations:  

  • conversations were about a range of topics (intellectual and emotional)
  • there was mutual trust among partners and partners often shared similar interests and values
  • at times, the conversations did not align with the official discourse within the participant’s culture/context

Intellectually intriguing conversations:  

  • conversations dealt with important disciplinary content and challenges about how to support student learning
  • participants used these conversations to make sense of experiences, deal with problems, and plan/evaluate actions.
  • Roxå & Mårtensson found that most participants were not drawing on pedagogical literature and theory as they were having these conversations; nor were they making public the results of their inquiry. Rather, they were using “personal theories” (p.556)

“Do you consider your local professional culture to be supportive or non-supportive of such conversations about teaching and learning?” (this question was posed to only 50 of the participants)

  • There is a clear link between how encouraging a culture is and number of conversational partners (i.e., in a supportive culture, individuals have more conversational partners)

Implications

Significant conversations have the potential to help university teachers see things from someone else’s perspective. They may shape and/or expand an individual’s identity as a teacher.

In the words of the authors: “It is likely that these conversations open up the possibility of constructing and maintaining–and perhaps partly changing–an understanding about the realities of teaching” (p.555).

**************

Footnote 1: Erving Goffman wrote about the concepts of “front stage” and “back stage” behavior in his book The Presentation of Self in Everyday Life (1959). Goffman proposes that we have two different modes of presenting ourselves: one when we are ‘on’ for others (front stage) and another when we let down our guard (back stage). For a succinct introduction to these concepts, see here at “Everyday Sociology”.