The benefits of classroom visits (and how to make them effective)
By Tommaso Magrini, PhD student, Department of Materials
Visualisation of classroom visits in an online teaching scenario (Image: Tommaso Magrini)
Delivering a good lecture, properly planning a class, or assessing students' performance takes effort, time and theoretical preparation. As I write this, I am still in the process of learning. Nevertheless, the more I am involved in teaching, the more I can experiment with new techniques and approaches, keeping the structures that worked and adjusting those that didn't. One technique I plan to keep is classroom visits with peers.
Over the course of my doctoral studies I enrolled in the programme "Learning to Teach". One of the most interesting things I have learnt is the concept of "learning by doing". Not only is "learning by doing" more fun and engaging for the students, it has also proven to be the lecturer's best ally in reaching the learning objectives. Step by step, I introduced different activities in my classes, ranging from simple short discussions to more advanced posters and presentations. These activities not only proved to be a good way to keep the students focused and active, but were also extremely useful for us as teachers to assess whether the concepts our class is built on are solid and stable in the students' minds, or whether they need repetition or consolidation. To understand whether the activities I planned were meaningful and well aligned with the learning objectives, I asked peers to visit my classes, sit in and provide me with feedback.
Over the last semester I started implementing a new activity at the end of each lecture. Expecting my students to build on the concepts seen in class and restructure them into a broader, more applied context, I proposed the following activity: divided into groups, the students had to come up with a shared idea, describe it schematically on a poster and then pitch it in front of the class. This would then foster a discussion between 'critical friends', who would openly challenge each group's idea with the goal of improving it and helping its realisation. The classroom visit proved to be crucial in this phase.
As a lecturer, during such vivid and intense scientific discussions, I have to occupy several roles at the same time. Not only do I have to moderate the discussion, I also have to evaluate how deep and relevant it is, while assessing whether the students have reached the milestones and the key learning objectives. For this reason, the presence of my colleague, who sits 'outside the discussion' and evaluates the classroom's response to the activity, is extremely important. While at the beginning he would only observe, at later stages we were able to switch roles and take turns evaluating the class activity. Being able not only to take part in the discussion but also to observe it from the outside and take notes gave me a more complete view of the activity.
At the end of each class, I would always have a debriefing with my colleague. Its goal was to sum up the positive and negative aspects of the lecture in a constructive and unbiased way. These debriefings helped to correct the weak parts of the lecture and expand on the positive ones. It was clear from the very first session that the students were responding to the activity we had planned with a positive attitude and enthusiasm. Furthermore, through the classroom visits, we realised that we could assess the classroom's knowledge more efficiently by using 'exam-like questions' during the discussion.
In our experience, the 'simple' classroom visits thus evolved into an open exchange of ideas, built on honest and constructive feedback, that helped me improve my teaching style, the structure of my classes and the design of more targeted, better-structured learning activities.
Among other things, ETH Zurich's EduApp allows instructors to pose clicker questions during lectures. Instructors can interrupt their lecture to pose questions to the students and to gather and give feedback on learning progress. Lecturers can also trigger phases of peer-instruction, where students discuss their initial answers to a question with one another and then re-answer the question – in effect, the students are teaching each other during those phases, hence "peer instruction". By asking students to answer a question twice, lecturers gather data on student understanding. But how meaningful is this feedback data, particularly when answering is voluntary and ungraded?
A group of mathematics instructors at ETH’s D-MATH worked with LET to analyze EduApp data using Item Response Theory (IRT), Classical Test Theory (CTT) and clustering methods. Over the course of the semester, 44 clicker problems were posed – 12 of them twice, as the instructor decided to insert a phase of peer-instruction. The following figure shows an example of the kind of problem being analyzed:
Fig. 1 Example of a clicker problem
The problem shown was used in conjunction with peer-instruction; the gray bars indicate the initial student responses, the black bars those after the discussion. A simple, unsurprising observation is that after peer-instruction, more students arrived at the correct answer. What can we learn from these responses? CTT and IRT can provide psychometrics that help understand this instructional scenario.
When it comes to being “meaningful,” the “discrimination” parameter of a problem is of particular interest: how well does correctly or incorrectly answering a problem distinguish (“discriminate”) between students who have or have not understood the underlying concepts?
CTT simply uses the total score as a measure of “ability”, but also has a measure of discrimination (“biserial coefficient”). IRT estimates the probability of a student arriving at the correct answer for a particular problem (“item”) based on a hidden (“latent”) trait of the student called “ability” – typically, higher-ability students would have a higher chance of getting a problem correct. How exactly this probability increases depends on problem characteristics (“item parameters”).
In IRT, the ability-trait is determined in a multistep, multidimensional optimization process, where the difficulty and discrimination parameters of particular problems (“items”) feed back on how much correctly answering that problem says about the “ability” of the student; “high-ability” students are likely to get correct answers even on high-difficulty, high-discrimination problems.
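To make the item parameters concrete: in the standard two-parameter logistic (2PL) model – the textbook formulation; the exact specification used in the study is documented in the full paper linked below – the probability that student i with ability θᵢ answers item j correctly is

```latex
P(X_{ij} = 1 \mid \theta_i) = \frac{1}{1 + e^{-a_j(\theta_i - b_j)}}
```

where b_j is the difficulty of item j and a_j its discrimination: the larger a_j, the more sharply a correct answer on that item separates lower-ability from higher-ability students.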
The results of their study were extremely encouraging: using both CTT and IRT, almost all 44 problems under investigation exhibited strong positive discrimination in the initial vote. This means that the better a student understood the underlying concepts, the more likely they were to give the right answers – and vice versa. A low discrimination, on the other hand, means a problem provides less meaningful feedback. For the handful of problems which had lower (yet still meaningful!) discrimination, this could be explained by other problem characteristics, for example, that at the time they were posed, they were still too hard or already too easy – but even that feedback is meaningful to the instructor for future semesters.
The truly surprising result of the study was that in all cases of peer-instruction, the problem had even stronger discrimination afterwards! Yes, unsurprisingly more students answer correctly after discussion with their neighbors (the problem becomes “easier”), but: peer-instruction does not simply allow weaker students to enter the correct answer, it apparently helps them to perform at their true potential.
For the purposes of the study, the clicker data had to be exported manually, but the next version of EduApp, slated to be released in December 2020, will allow export of data for learning analytics purposes directly from the interface – the following figure shows a sneak preview of that new functionality.
Fig. 2 The new “Learning Analytics” function in EduApp
The exported data format is compatible with input for the statistics software R, and a variety of guides are available on how to analyze this data (https://aapt.scitation.org/doi/abs/10.1119/1.5135788, accessible through the ETH Library, provides a "quick-and-dirty" guide).
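The guides referenced above work in R; purely as an illustration of what the CTT part of such an analysis looks like, here is a minimal sketch in Python instead, assuming a hypothetical export with one row per student and one 0/1 column per clicker question (the file name and layout are invented for this example):

```python
# Minimal sketch: classical (CTT) discrimination per clicker question.
# Assumes a hypothetical CSV with one row per student and one 0/1
# column per question; the real EduApp export format may differ.
import pandas as pd
from scipy.stats import pointbiserialr

responses = pd.read_csv("eduapp_export.csv")  # hypothetical file name
total = responses.sum(axis=1)                 # CTT "ability": total score

for item in responses.columns:
    # Point-biserial correlation between getting this item right
    # and the score on all remaining items ("rest score").
    rest = total - responses[item]
    r, _p = pointbiserialr(responses[item], rest)
    print(f"{item}: discrimination r = {r:.2f}")
```

A strongly positive r for an item corresponds to the "strong positive discrimination" reported in the study; estimating the IRT parameters would instead require fitting the latent-ability model described above.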
The full study, including results from Classical Test Theory and clustering methods, as well as an outlook on new EduApp functionality, is available open access in Issue 13 of e-learning and education (eleed) at https://eleed.campussource.de/archive/13/5122.
Poor teaching evaluations for innovative teaching?
"With great commitment, I redesigned my course according to the latest findings in didactics. Whether in a flipped classroom or through increased use of clicker questions, the students were actively challenged during class and were happy to participate. But with the course evaluation came the big disappointment. The students rated me and my teaching considerably worse than before. They even prefer traditional lectures, saying they learn better that way. Did I do something wrong? Should I go back to my tried-and-tested lecture format?"
What sounds like an isolated case here is in fact very common. Numerous studies document that teaching formats with learner-centred methods frequently lead to poor evaluation results (e.g. Seidel, 2013). In a recently published study, Louis Deslauriers and colleagues at Harvard University examined exactly this problem (Deslauriers, 2019). They investigated whether there is a discrepancy between the actual and the perceived learning gain of actively involved students. If students have the impression that they learned less than they would have in a lecture, this inevitably leads to poorer evaluation results. The experimental study confirmed the negative correlation between self-assessment and effective learning gain. The following three reasons are responsible for this:
Interactive teaching methods demand increased cognitive effort, which students do not necessarily associate with learning.
First-year students in particular do not yet have the ability to correctly assess their own knowledge in a new subject area.
The clear and eloquent presentation of a traditional lecture tempts students to considerably overestimate their own understanding.
Based on the results of a follow-up study, Deslauriers and colleagues propose some quite simple measures to prevent this mismatch between perceived and actual learning gain. The key is to take students' worries and fears seriously and to address them openly. A short explanation of the learning benefits of active instruction (e.g. Freeman, 2014) during the first lesson can already allay initial concerns. The learning progress achieved should also be pointed out repeatedly as the course continues. This gives students a better sense of their own learning gain. It is also helpful to point out the risk of an illusion of learning created by the lecturer's eloquence in traditional lectures (e.g. Toftness, 2018).
At ETH, too, we were able to confirm the influence of these measures. In a study at the Department of Physics we compared the learning gain of interactive teaching with that of a lecture. In the very first learning unit, we explained to the interactive group in detail the positive effects of interactive teaching. In addition, uncertainties about students' own learning gain were continuously addressed throughout the semester. In the subsequent course evaluation we found no mismatch between effective and estimated learning performance: students in the interactive format achieved a higher learning gain and reported significantly better ratings of their own learning than those in the lecture running in parallel (Schiltz, 2018).
Tip: What applies here specifically to interactive teaching formats can certainly be transferred to any other change of learning format. Especially when the new format is not yet familiar, you should take students' initial reservations seriously and clearly communicate what benefits they can expect from the change. It is also important to critically question the results of course evaluations (whether good or bad). A causal link between student satisfaction and actual learning success cannot always be assumed (e.g. Carpenter, 2020).
Does anything ever happen after those teaching evaluation surveys?
Maybe you know the problem. You want feedback on your teaching from your students. You want to know what they think went well, and what didn't. Maybe you need their evaluations for future job applications. In any case, a reasonably representative amount of feedback and number of ratings would come in handy. But your students are sick of evaluations! They wonder why they have to fill something out which will be of no use to them, and nothing ever comes of evaluations anyway…
So goes the muttering of students. However, something actually does happen with student evaluations – even if most students aren't aware of it. It is rare that someone attends a course twice, so there is little opportunity to find out whether lecturers have implemented their students' wishes. Therefore, to let students and others know what happens after a questionnaire is submitted and why evaluations are important to teaching quality, we have created a three-minute video with the help of Youknow (specialists in explainer videos). Please show this video to your students and motivate them to take part in the survey! This is especially useful if you have the opportunity to conduct the evaluation in class.[1]
The challenge for us was to explain the entire comprehensive, stringent evaluation process – from the survey, via publication of the findings, to the derivation of appropriate measures – briefly and appealingly. A bigger challenge was to be responsive to students and take their criticisms seriously, while also dignifying the engagement of most lecturers. Whether and how well we have achieved this in three minutes of moving images is for you to decide!
[1] By in-class evaluation we mean that you programme the time your evaluation survey will be sent out to students, and you ask them during your lecture to fill in the survey via their laptop or smartphone.
Case Study – Peer Review Mastering Digital Business Models
As part of a series of case studies, staff at LET sat down to have a conversation with Prof. Elgar Fleisch, Johannes Hübner and Dominik Bilgeri from the Department of Management, Technology, and Economics (D-MTEC) to discuss their Mastering Digital Business Model (MDBM) course.
What is the project about?
In this Mastering Digital Business Model (MDBM) course, Prof. Elgar Fleisch, Dominik Bilgeri, George Boateng and Johannes Huebner teach Master's-level students a theory- and practice-based understanding of how today's information technologies enable new digital business models and transform existing ones. The course features a novel examination mode: a video group project is introduced as a core element contributing to the overall course grade. In addition, students are asked to participate in a peer-to-peer review of the videos produced by other student groups, which is independent of the grading and is geared towards giving students insights into how other groups solved the challenge. The best-rated videos are then shared with the entire class at the end of the semester.
As part of this newly created examination element, course participants (in teams of two to three students) explain one of the major lecture topics (theoretical lenses) in the first half of their video. Then they apply the same lens by analysing a company, aiming to better understand its underlying business model. Companies are pre-selected and allocated to students for fairness reasons. Every year, we choose a pool of interesting companies in the context of digital transformation, the Internet of Things, Blockchain, e-health, etc.
What motivated you to initiate the project?
The core idea was to improve students' learning success by using an examination format that not only requires learners to reiterate theoretical content, but also to apply the theory in a practical context. The students have different backgrounds and do not necessarily have a strong business focus, which means that many of the concepts taught in class may be rather abstract. We used the video format and specific companies as case studies because we think this is a good way to trigger curiosity, show concrete examples of modern companies in a compact form, and, compared to other examination formats, force students to reflect more deeply on the theoretical frameworks.
How did you do it?
Aside from the weekly input lectures, we ask students to form groups at the beginning of the semester. We then provide a list of theoretical core topics from which each group can choose one. In addition, we randomly assign each group to a case company. The theoretical topic first needs to be explained in the first half of the video, and then applied to the case company in the second half. We thus used a prosumer approach, where students become part of the course because they create a small section of the content. The best videos are shared with the class and can be reused as additional learning material for future cohorts. This set-up generally resulted in high-quality videos, perhaps also because students knew their videos would be used again.
Students also had to review the video projects of five other groups. They had to clearly describe whether and how their peers used certain perspectives (called "lenses" in the course), both in the video and in their feedback. In this way they analysed once more how the newly learned concepts were visible in other companies – a positive side effect being that they also honed their reflection and feedback skills.
Did you have the support you needed for the project? Is there additional support you wish you had had to help you to achieve your goals?
We asked two students from previous cohorts to join us as tutors and support this year's groups, primarily with technical questions about video-making (e.g. tools, quality considerations, etc.). In addition, we designed one of the lecture slots as a coaching session during which we would further support student groups with their questions. In total, this approach allowed us to provide the students with high-quality supervision with reasonable effort.
Please describe some of the key outcomes of the project
To most students, the task of creating a video was new. We received feedback that while the initial effort of learning how to make a video was higher compared to other examination formats, it was also fun and very helpful for really understanding and applying the new concepts. Students said that they learned things more deeply and more sustainably than in the practical exercises they are familiar with from other courses, because they had to consider all details and aspects. By carefully phrasing their arguments when giving feedback on peer videos, students became more aware of their own thinking and argumentation.
We observed that the questions students asked once they started creating videos were different and went deeper: their reflections were based on many concrete examples of companies, and the concepts were put into perspective. The same sub-concepts have a different meaning in another context, and students now see the overarching principles better and can argue more precisely about theoretical aspects. Without these concrete examples, it would have been much harder to grasp the theoretical aspects.
How did the project impact learners or the way in which you teach?
We were surprised by the high quality of the best student videos. The teaching team is now really motivated to continue innovating on our approaches in other courses. We saw clearly that when students are very active we get better results, deeper learning and better reflection.
What lessons learned do you want to share with your colleagues?
It can really pay off to try things and to experiment. We think that nowadays the classic format of passive lectures and final exams may not always be the best choice. We believe the improved outcomes from students who were actively engaged by the video assignment justified the investment in developing new approaches and tools.
When considering videos as an examination format, you should define the entire course/project very clearly. When describing what production options students have for videos, you should be very precise. Offering too many options can be counterproductive. It is better to present 3-4 crystal-clear examples and stick to them.
Also, we would recommend managing students' expectations clearly at the beginning of the semester, and highlighting both the benefits and challenges of this examination format. Of course, this becomes easier after the first year, when you can draw on the experience of the first cohort and provide examples of prior videos to illustrate what is expected of the groups. Because the students are co-creators, you get new and relevant content which enriches the course and can serve to motivate both students and teachers.
What are the future plans for this work? How do you plan to sustain what you have created through the project?
We plan to optimize some details of this course and to move further in the direction of a flipped classroom, so that we can use this teaching approach in other courses. We will create a library of the student videos to provide them as additional learning material in future editions of the course.
Student feedback
By MDBM Student Cristina Mercandetti (mercandc@student.ethz.ch)
Your opinion about this course and the peer review & video production process – how has it influenced your learning process?
Cristina Mercandetti: I really enjoyed both the course and the video production process. I think they complemented each other very well and we were able to directly apply the theoretical knowledge learned in the course to work on our project. It helped me to think more critically about the course content, and really dive into some of the lenses and models presented. I don't think this would have been possible without the video production, so it definitely improved my learning process.
Do you think this approach could be used in other courses?
Cristina Mercandetti: Yes, I think this approach could easily be used in other classes. However, I think part of the fun in this class was that the video production was something very new and refreshing (a side effect was that I learned how to cut a short movie). I imagine that if several classes introduced this it would lose some of its novelty and could be stressful, as it took a lot of time.
Final remarks about the course
Cristina Mercandetti: I really enjoyed the whole class, and heard a lot of good things from other students too.
Case study – Peer Review Corporate Sustainability
As part of a series of case studies, staff at LET sat down to have a conversation with Prof. Volker Hoffmann (SusTec, the Group for Sustainability and Technology) and Erik Jentges (Educational Developer) from the Department of Management, Technology and Economics (D-MTEC) to discuss their corporate sustainability project.
What is the project about?
The course “Corporate Sustainability” aims to enable students to become advocates of sustainable business practices in their later careers. Each year it attracts 150-200 students with diverse disciplinary backgrounds and different educational levels (BSc, MSc, and MAS). We adapted the Six Sentence Argument (6SA) method for this course. The method focuses on enhancing critical thinking skills through structured writing and guided, double-blind peer-review.
What motivated you to initiate the project?
We wanted students to get a clearer picture of what sustainability really is. In the course, they develop not only a deeper understanding of corporate sustainability but also the skills to give and receive feedback.
How did you do it?
At the core are four topics that relate to the sustainability of corporations: assessment, strategy, technology, and finance. We developed digital learning modules (videos, some with interactive elements) that explain key concepts, to support the most relevant and difficult parts of the lecture. We also want to develop students' critical thinking skills. In e-modules, students learn to formulate short, concise arguments with the 6SA method. The core idea builds on the assumption that writing is thinking.
In the e-modules, students face a decision (a micro case based on the lecture content) and argue for their preferred course of action using a logical structure of exactly six sentences. Each sentence fulfils a specific function in the overall argument and has a 20-word limit. A clear grading rubric enables students to assess 6SAs in double-blind peer reviews. These have been continuously adapted and improved since 2015. The specialized online tool "peergrade" also helped us to conduct a smooth process – for both students and teachers.
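Purely as an illustration (this is not part of the actual course tooling or of peergrade), the two formal constraints just described – exactly six sentences, at most 20 words each – are simple enough to check mechanically; a naive sketch in Python:

```python
# Illustrative only: checks the two formal 6SA constraints (exactly six
# sentences, each at most 20 words). Sentence splitting is deliberately
# naive; judging the *function* of each sentence is what the grading
# rubric and the peer reviewers do, and no script can verify that.
import re

def check_6sa(text: str, max_words: int = 20) -> list[str]:
    sentences = [s for s in re.split(r"(?<=[.!?])\s+", text.strip()) if s]
    problems = []
    if len(sentences) != 6:
        problems.append(f"expected 6 sentences, found {len(sentences)}")
    for i, sentence in enumerate(sentences, start=1):
        n = len(sentence.split())
        if n > max_words:
            problems.append(f"sentence {i} has {n} words (limit {max_words})")
    return problems
```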
Through the peer assessment, students engage critically with their peers' arguments and receive constructive feedback on their own arguments. With the 6SA exercise, students learn to argue with clarity, and it helps them to reflect on the way they and others think.
During the second half of the semester, students work in diverse teams to prepare mock debates, consulting strategies, economic models and campaign videos. In this phase, they are coached by several postdoctoral and doctoral researchers from SusTec. The students then present their projects and display their skills in a group puzzle session, and are debriefed in the following final lecture session. Students receive grades for both individual and group performance and can earn a bonus on their exam grade by completing the critical thinking exercises.
Did you have the support you needed for the project? Is there additional support you wish you had had to help you to achieve your goals?
The project received funding from different sources. This helped us to hire academic staff to assist with the development of new teaching approaches and the production of high-quality videos. In addition, we received specialist guidance on the instructional design and production of videos.
Please describe some of the key outcomes of the project
With regard to our feedback modules, we think that the quality of the argumentation and peer reviews has increased over the years. For example, we learned that effectively designing such peer assessment exercises requires training students in how to give constructive feedback, and that it should involve several feedback loops to support the development and refinement of critical thinking skills. Overall, the course now integrates many innovative teaching elements and was a finalist for the 2018 ETH KITE Award.
How did the project impact learners or the way in which you teach?
When students are able to write better, more concise arguments that convince critical readers, and when they can give constructive feedback on arguments made to justify strategic decisions, then they are able to actively shape good decisions in a company setting – they can be change-makers for corporate sustainability. The students were motivated by the new teaching approaches, such as the supporting videos, interactive questions inside the videos, and the critical thinking exercises. Peer assessment is "homework" for the students, but they know that they can earn a bonus on their exam grade – and they are already rehearsing for some parts of the final exam.
With regard to students' learning, the peer review process itself is convincing. What is unique to our teaching situation is the incredible diversity in the classroom. A 19-year-old Swiss environmental science student may be sitting next to a 25-year-old Chinese student pursuing a master's degree in management, who in turn sits next to a 35-year-old American part-time student with a PhD in chemistry and a management position with responsibility for 20 employees in a multinational company. Peer feedback is a powerful way to bridge these gaps in experience and cultural background. It allows younger students to write a creative and brilliant argument without being intimidated by more senior students. It allows a shy and quiet student to gain confidence by formulating a convincing argument whose strengths are recognized in their peers' feedback. It creates a space for older students to learn how to coach younger classmates with constructive feedback to improve their reasoning.
That is why at D-MTEC, we use peer feedback in other courses as well. Students learn more when actually giving feedback compared to when only submitting an assignment.
What lessons learned do you want to share with your colleagues?
At the beginning, it was a lot of work and many people were involved, but it was worth it. Today, with regard to the critical thinking exercises, we have continuously refined our processes. Every student writes three reviews, thereby ensuring that everyone receives much more feedback than a single lecturer could provide. The main work for lecturers is providing an overview of the themes in the arguments and summarizing the activity for all students. This lets them know that their individual contribution becomes part of a collective intelligence. There are always truly smart and innovative solutions that need to be shared with the whole class. Also, there is little effort involved in re-grading or moderating student questions about feedback, because we train students to write helpful and considerate feedback and make them aware that they also have to learn how to receive feedback, especially feedback that they don't want to hear but need to.
For the production of videos, we recommend planning enough time and engaging with video experts and instructional designers early on. Especially writing a concise script for a short video requires a surprising amount of time until it effectively conveys your key points.
If you are interested in applying these concepts in your own courses, please contact LET.
Note: The project received funding from different sources (Innovedum, Emil Halter Foundation, ETH Critical Thinking Initiative).
Additional resources and comments
Article: Kölbel, J., & Jentges, E. (2018). The Six-Sentence Argument: Training Critical Thinking Skills Using Peer Review. Management Teaching Review, 3(2), 118–128. https://doi.org/10.1177/2379298117739856
Case study – Peer Review Food Chemistry Laboratory – Writing reports
As part of a series of case studies, staff at LET sat down to have a conversation with Prof. Laura Nyström and Dr. Melanie Erzinger from the Department of Health Sciences and Technology to discuss their food chemistry laboratory project.
What is the project about?
We introduced a new way to write lab reports, combined with a peer review method to foster collaboration and critical thinking skills among students. In the past, students did not have clear criteria as to what makes a good report. Assistants also needed too much time to read the reports and give repeated feedback. Thus we looked for a way to help assistants spend less time on the review process.
We transformed the format of our Food Chemistry Laboratory Course (Food Science, BSc level, 4th semester) from a classical lecture format with lab exercises to a blended learning format. With new videos, we can achieve better coverage of basic knowledge (e.g. safety, handling of equipment).
What motivated you to initiate the project?
Student numbers have increased over the past 10 years and we have been losing too much time in covering basic knowledge repeatedly. Using concept videos, students will be able to review key topics on their own. Overall, we also wanted to make the entire course more attractive. A key intention was to develop student skills in report writing and improve report quality.
How did you do it?
We defined additional, clear quality criteria for a good report. During a first round, students give each other feedback, so that the final review by teaching assistants and lecturer approval involve less effort. For each experiment, every student has to review another student's report. In total, each student reviews four reports over the semester.
Students don’t get a grade for the peer review (semester performance in the lab course is also ungraded). They have one week for each of the four peer reviews, and must complete each by the respective set deadline. They answer various questions related to the quality of the respective report (these involve five aspects plus overall feedback; see the annex at the end of this case study). Students do not “grade” the reports, but give feedback in their own words.
Assistants are aware of what is asked in the reports and are therefore able to provide targeted and helpful feedback in the lab which addresses the quality criteria for reports.
We provide the students with online material on how to write reports (short videos, documents etc.). Previously we had a short lecture with examples. Until now, however, we did not train them in conducting proper peer reviews. We have now realised that we need to do this (especially for Bachelor’s degree students), and will include peer review training with the short lecture next year.
Did you have the support you needed for the project? Is there additional support you wish you had had to help you to achieve your goals?
We learned about a module inside our LMS for administering the peer-review process (the "workshop module" in Moodle). It would have been helpful to have had practical tips from others, but apparently not many lecturers have used this tool. Although the general instructions were useful, it took quite some time to learn all the aspects of the tool.
Please describe some of the key outcomes of the project.
Various things changed for the better. Students learned a lot by reading and reviewing the reports of their peers. They gained important input for their own reports. For many it was the first time they had had to give feedback in such a structured way. They also had to find a way to critique something in a good, constructive manner. Overall, students were introduced to a new way of critical thinking and took important first steps in this skill, which is important for their later careers.
We can say clearly that through the new review method we were able to improve the quality of reports and reduce the time needed by lecturers to grade them.
How did the project impact learners or the way in which you teach?
In general the peer review method was well received in the BSc course, and we used the same approach in an MSc-level course. In comparison, we realised that Bachelor's degree students need more help and training in peer review than Master's students.
Overall we saw that the blended learning approach and the peer review methods work to improve our courses, addressing the above-mentioned challenges of lack of student preparation and the need to constantly repeat basic knowledge. Students themselves clearly realised the value and potential of better collaboration, peer feedback and critical thinking skills.
What lessons learned do you want to share with your colleagues?
Not every cohort is the same. While things worked quite well in 2017, in 2018 fewer students adhered to the schedule and deadlines – even though everything was communicated and documented in the same way as in the previous year.
What are your future plans for this work? How do you plan to sustain what you have created through the project?
More and more assistants will become competent in providing full reviews of the already peer-reviewed reports. Currently lecturers still have to do this. Lecturers will thus gain more time to be present in the labs and to give 1:1 feedback to students in the lab and online.
We will definitely create some training material for assistants for this purpose, but it is not available yet. We also want to create a short, ready-to-use document about giving feedback in our specific context: what is constructive feedback, and what are the do's and don'ts? Students, assistants and fellow instructors can use it.
We are interested in learning whether other lab courses at ETH do something similar, and how. We also need to improve the support situation with the Moodle review tool "workshop module". We will continue to work with it, but it is a bit tricky sometimes.
One additional idea is to make the videos interactive. Students will see in-built questions in the videos which they have to answer right away.
Additional notes regarding resources and tools used
In some cases we used a programme called Labster to create virtual labs, extending the experience to experiments which were not feasible in our real labs.
We learned from other courses and departments regarding effective feedback (ETH “Foundations of Teaching and Learning” course).
To conduct the peer review we work with the “workshop module” in Moodle.
To make the videos interactive we will work with the new Moodle “interactive video suite” plugin.
Student voices:
What is your opinion about this course and the peer review process (lab reports)? How has it influenced your learning process?
Robert Spiess: I think peer-reviewing was a great way to see other students’ work. It gave me the opportunity to experience and compare different ways of writing. I could always detect things that I wanted to include in my reports. At the same time, I could see in which points my reports were better, where my advantages were.
I think this procedure is particularly useful when writing. But the reports should not be too long, because otherwise students have to spend too much time on their own report and neglect, as a consequence, the peer-reviewing of someone else's report. Other courses usually required longer reports. But if the reports were shortened, the method could also be applied in other laboratory courses (such as the food processing or the biotechnology lab course).
Aline Candrian: I'm glad I did the course; I think it gives students a first impression of laboratory work. The lab report writing is an essential part of the course for understanding the experiment and the obtained data. The peer review approach was fine, even though nobody was eager to do the reviews. VERY little time was invested in peer review by most groups, as far as I'm concerned. Of course, sometimes you could benefit from your peer's feedback, but most of the time we didn't act on it. That's probably because it was our first time writing (semi-)real reports. We didn't really know what we were doing, and you mostly think you know better than others, especially if your report was reviewed by someone you consider less familiar with chemistry.
At the same time you're well aware that you know nothing about report writing, so how can you evaluate someone else's work?
Additionally, motivation was minimal since you were just glad to be done with writing your report. Having to assess another report and then correct your own report again was just another 'burden'. So, altogether, I would say report-writing was a crucial part of the course but peer review not so much, since we had no experience at all. I think peer review makes more sense in the courses in our last semester.
Making the students do a peer review on just the last report might work. They'll see how it works, they'll have written a few reports (and got more familiar with it) and might be more confident in providing feedback. But I'm not an expert, it may not work the way I envision it, what do I know 🙂
Impartial group assessment. Using peer review and economic theory to grade groups fairly.
In a clear case of practicing what he preaches, Dr. Heinrich Nax has applied game theory to his own teaching practice. After lecturing on game theory for several years, he realised that his methods for teaching, and more specifically for assessment, did not follow the very theories he was espousing, so he set out to correct this incongruence.
In his course «Controversies in Game Theory», students work in groups and are assessed on a group project. Social tensions can develop between individual and collective interests in group interactions. One such tension, free-riding, when one person rides the coat-tails of other hard-working group members, is well known. There are, however, additional potential problems when assessing group work, such as collusion on grades in cases of peer review. To eliminate these tensions, Dr. Nax decided to apply a mechanism from economic theory to his assessments.
What triggered this approach?
Previously, Dr. Nax gave the same final mark to everybody in a particular group regardless of their individual efforts, as these could not reliably be assessed. From a game theory perspective this constituted a big temptation for free-riding, and Dr. Nax decided to devise something that would incentivize individual effort without giving up the benefits of group work altogether.
What exactly did he do?
Influenced by the article "Impartial division of a dollar" by de Clippel, G., Moulin, H., and Tideman, N. (2008), Dr. Nax and his colleague Sven Seuken had implemented the article's mechanism in a blockchain start-up company. The mechanism enables a group to split its financial earnings among the group members through peer review: group members decide internally what a fair allocation of earnings should be. Dr. Nax therefore decided to try the mechanism in an educational setting, where the "earnings" become the finite number of points the group works towards, the total of which is determined by the grade he allocates to the group's overall project.
The key idea of the mechanism is that individual group members don't evaluate their own performance, and therefore don't decide how many points they themselves have "earned". Instead, they report the relative contributions of the other group members. So in a group of three, if student A thinks group member B did twice as much work as fellow group member C, then B should receive twice as many points as C. Using a specific formula (described in the paper), all three group members' reports are then aggregated anonymously to make sure the resulting grades cannot be manipulated. In other words, student A receives only the (aggregated) number of points that their colleagues think student A deserves.
Courtesy of Spliddit
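To illustrate the general idea, here is a deliberately simplified sketch in Python – not the exact aggregation formula from de Clippel, Moulin and Tideman (2008); see the paper for the real mechanism and its impartiality guarantees. Each member reports shares only for the other members, the reports about each member are averaged, and the result is rescaled to the group's total points:

```python
# Simplified sketch of impartial point-splitting: nobody reports on
# themselves, reports about each member are averaged, and shares are
# rescaled so the group's total points are fully distributed.
def split_points(reports: dict[str, dict[str, float]], total_points: float) -> dict[str, float]:
    members = list(reports)
    raw = {}
    for m in members:
        # Average the shares that the other members assigned to m.
        opinions = [reports[j][m] for j in members if j != m]
        raw[m] = sum(opinions) / len(opinions)
    scale = total_points / sum(raw.values())
    return {m: share * scale for m, share in raw.items()}

# Example from the text: A thinks B did twice as much work as C.
reports = {
    "A": {"B": 2 / 3, "C": 1 / 3},  # A splits 100% between B and C
    "B": {"A": 1 / 2, "C": 1 / 2},
    "C": {"A": 1 / 2, "B": 1 / 2},
}
print(split_points(reports, total_points=30))
# -> A gets 10.0, B about 11.7, C about 8.3 of the 30 points
```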
What were the results (for student learning)?
Not only was Dr. Nax convinced that the quality of the group projects improved, but the students were happier as well. They believed that the marking was much fairer. It is unclear whether this grading method decreased free-riding; however, students felt that free-riders did receive lower grades, thus increasing student satisfaction in comparison to grading methods where all members of the group receive the same grade regardless of effort or contribution.
To see this mechanism in action, visit the Spliddit website which features a demonstration tool. Those interested in learning more should read the original paper or this second (less math-based) follow-up paper or contact Dr. Nax for further information. Dr. Nax is working on a tool to make his grading plan available to other professors.
Case study: Molegram Explorer – A mixed-reality framework for teaching drug design
As part of a series of case studies, staff at LET sat down to have a conversation with Prof. Gisbert Schneider and Dr. Jan Hiss from the Institute of Pharmaceutical Sciences in the Department of Chemistry and Applied Biosciences to discuss their mixed-reality project.
What is the project about?
The Molegram Explorer project provides a mixed-reality framework to facilitate and broaden students’ understanding of molecular structure. It is part of the Computer-Assisted Drug Design course run by Gisbert Schneider, professor in the field of the same name.
At the core of the project is the innovative hardware device “HoloLens” (comprising special glasses with 3D projection, motion sensor and environment scanning, produced by Microsoft). Users of the HoloLens see not only (real) furniture and people present in the room, but also a hologram. In our case, the virtual object is a protein that the students can explore, investigate and even walk through. They literally immerse themselves in the world of molecules.
This innovative concept presents a new way of perceiving molecular structure and facilitates new approaches to chemical structure analysis and design via human-machine interaction.
What motivated you to initiate the project?
This innovative project originated in answer to a call for projects on the use of the HoloLens in ETH teaching. Because we faced a teaching challenge for which 3D representation offered a very good use case, it was a perfect opportunity to apply for a pilot project using HoloLenses.
The idea underpinning our work is that this new technology can help our students to understand certain important principles of molecular structure which traditional teaching methods and media struggle to clarify. To identify or design a suitable drug molecule (the ligand) students must understand the protein’s surface, and particularly the cavities suitable for accommodating the ligand. The HoloLens device helps them visualise the regions of a protein that are accessible to the ligand.
How did you do it?
Three groups collaborated on this project. First, Gisbert Schneider and Jan Hiss delivered the scientific content. Guided by specific learning objectives, ETH Zurich's Educational Development and Technology (LET) unit helped organise and facilitate the development process and provided the required hardware. Finally, a specialised software company (afca) implemented the learning app software.
The HoloLenses are used in a two-week practical course in which students experience a condensed version of early-stage drug discovery. They learn how to computationally screen a catalogue of millions of molecules to identify those that might favourably interact with a particular protein. The students perform a computational analysis and select one or two molecules from the top-ranking candidates. Then they synthesise these compounds and test their activity in the laboratory.
An important basic aspect of protein-ligand interaction is the “solvent-accessible surface”. For beginners, this molecular representation often remains an abstract concept without suitable visualization. By using the HoloLens students can now create surface representations of a protein, interact with the holographic model, and simultaneously discuss it with peers and instructors.
Did you have the support you needed for the project? Is there any additional help you wish you had had?
We had excellent help from the company afca who designed a user-friendly app with an elegant interface. LET helped us with the legal aspects and provided the necessary contacts. The 12 ETH HoloLenses are stored at LET. Although we understand that HoloLenses are not easily available due to their comparably high acquisition cost, it would have been helpful to have faster and easier access to this hardware, especially when we needed to try out and check something quickly.
Please describe some of the key outcomes of the project.
The new tool proved to be a valuable addition to our course. It certainly does not replace traditional teaching and discussion, but it is an example of how technology can enhance the understanding of abstract scientific concepts which are otherwise hard to teach. Because students can virtually navigate the molecular hologram they gain a better understanding of the concept of protein structures and surfaces. In the learning sciences this effect is described by the principle of “embodied cognition”. We were also able to increase the attractiveness of our subject matter with this concrete visual experience. It was a kind of scientific marketing. We received several suggestions for additional projects in the context of other practical exercises. The positive feedback and the success of the pilot has driven us to expand the project with enhanced content and to reach out to other disciplines.
How did the project affect learners or the way in which you teach?
We observed that students became more curious, not only about the specific topic of the learning app but in general about many questions related to protein-structure-based drug design. Students certainly appreciate the new tool. The value of technology-enhanced learning apps for teaching of specific aspects in our field is obvious, and we intend to stay on this route.
What lessons learned would you share with your colleagues?
It is not always realistic or meaningful for scientists and teachers to address app programming and didactic concepts. Therefore, it is important to have experts from complementary fields working hand in hand. Experts on the subject matter can contribute the scientific content, and software developers can create user-friendly and visually appealing interfaces and functions. Learning professionals can then connect content with technological functions. They can also advise on how to transpose learning objectives into an appropriate and technology-enhanced learning process.
Overall, we encourage teachers to try out new methods in teaching, and there is much potential for combining proven learning approaches with new technology. In particular, teachers and students should not fear experiments that do not produce immediate success. “Productive failure” should be regarded as a natural part of the development process; it is a great way to learn.
What are your future plans for this work? How do you plan to sustain what you created through the project?
Based on the many positive outcomes, we plan to develop further apps. The ultimate goal is to adapt the work to a professional context by adding scientific content from our subject matter, together with advanced analysis tools. It would also make sense to develop HoloLens learning apps for selected (teaching) topics in medicine, chemistry and biology. HoloLenses are increasingly employed not only by the entertainment industry, but in business and education generally. We would welcome new, broader applications of this technology at ETH – but always with a critical double-check as to whether it actually provides added value for students compared to conventional teaching methods. In the case of Molegram Explorer, we are extremely satisfied with the learning success achieved.
Feedback from PhD students (Tutors)
What is your opinion about this course and the HoloLens process? How has it influenced your learning process?
Cyrill Brunner: Though it was already clear to me, since I had done research in that area before, the visualization of proteins with the HoloLens helped me get a better feeling for the 3D structure of a protein. My personal gain was clearly not as great as for the course participants, but that has nothing to do with the process itself; rather, it is due to the chosen protein (carbonic anhydrase II), which I had done research on before. I'm positive that applying the HoloLens process to a novel protein would clearly have helped me to get a better understanding of the 3D conformation.
Dominique Bruns: The HoloLens is a useful next step in the visualisation of molecules. This hands-on experience allows the understanding of molecular properties, their definition and dependencies. In this regard, the application of the surface area visualisation and determination is an ideal showcase.
Do you think this approach could also be used in other courses?
Cyrill Brunner: Yes, indeed. The HoloLens should clearly be put into use in the lab course of medicinal chemistry as students there are already working with a 2D visualization program. A 3D full insight into what they have been studying beforehand would strengthen their understanding.
Dominique Bruns: I appreciate the 3D perspective and interactivity enabled by the HoloLens, a characteristic that might be especially useful for people who find it difficult to think in three dimensions.
In my opinion, many further examples could be established and used for didactic purposes, e.g. chemical reaction mechanisms or biological folding events of proteins.
To find out more about how ETH teaching staff can start their own HoloLens project, visit the ETH website.
The EduApp is one of ETH's most important teaching applications. On the one hand, the EduApp aims to improve the interaction between students and lecturers in the lecture hall. On the other hand, this teaching app is designed to offer students at ETH Zurich added value in their everyday studies.
Last spring semester, 100 lecturers used clicker questions in their teaching, reaching 8,694 students. From the lecturers' perspective, too, the EduApp is a valuable addition.
Dr. Ghislain Fourny (D-INFK): «I have been using the EduApp in all my lectures since 2016 and am very enthusiastic about it. It enables rich interaction with the students and gives me constant feedback.»
Prof. Dr. Christoph Heinrich (D-ERDW): «In HS2017 I used clicker questions regularly for the first time in my large geology lecture for first-semester students at D-BAUG. It was a great success, not least because it loosened up the lecture, and I spontaneously received a lot of positive feedback.»
Dr. Markus Kalisch (D-MATH): «With the EduApp I get immediate feedback from the students, even when the lecture has several hundred participants.»
Dr. Meike Akveld (D-MATH): «The EduApp gives me direct feedback on whether what I taught has been understood. I always ask one of the students to explain the correct answer, which is often helpful. It is also a welcome change of pace for them.»
New clicker functions
Right on time for the current semester, new clicker functions were added to the EduApp. With the «Clicker» function, lecturers can pose questions via the EduApp, which are usually answered immediately in class.
1. Intermediate results: Lecturers can now run clicker questions in two voting rounds and display the intermediate results.
2. Extended LaTeX editor: The LaTeX editor for displaying mathematical formulas in clicker questions has been extended. Not only can lecturers now embed formulas and equations inline in the text; there are also more text formatting options.
3. Flashcards: With the new «Flashcards» function, students can work through existing clicker questions (e.g. for exam preparation). The new EduApp function "Flashcards" was made possible by the Rector's Impulse Fund.