
Did you know there is a JupyterHub at ETH?

In the framework of the computational competencies initiative at ETH, a JupyterHub has been established at LET. This brand-new JupyterHub serves Jupyter notebooks to everyone involved in teaching and learning at ETH.

Jupyter notebooks are interactive documents that combine code, text and animations. Several programming languages are supported, including Python, R, Julia, Octave and OpenModelica. Sign in through a plug-in from your course in Moodle and enjoy using Jupyter notebooks without additional authentication or the need to install anything on your computer. This applies to everyone involved in a course: whether your role is student or teacher, one click takes you to your personal JupyterLab environment on the JupyterHub, and it runs in your browser.

This is what the plug-in in Moodle looks like; it takes you straight to your JupyterHub hosted by LET.

Example of a simple Jupyter notebook in Python on the LET JupyterHub

In your course, you can use Jupyter notebooks as

  • interactive textbooks that support lectures or exercises
  • assignment sheets on which students answer questions and write code in a pre-defined (coding) environment
  • an environment for combining text, code and visualizations, either for students to work on assignments or for teachers for demo purposes
  • a learning journal for documenting learning progress
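To give a flavour of what such a notebook might contain, the snippet below is a hypothetical Python code cell (not taken from the LET Hub) that pairs a small computation with a printed summary; in a real notebook it would sit next to explanatory text and, typically, a plot.

```python
import math

# Sample a damped oscillation -- the kind of small computation a
# notebook code cell might pair with explanatory text and a plot.
t = [i * 10.0 / 199 for i in range(200)]                 # 200 time points, 0..10
y = [math.exp(-0.3 * x) * math.cos(2.0 * x) for x in t]  # damped cosine

# In a notebook, a following cell would typically plot t against y;
# here we just print a short summary of the computed values.
print(f"{len(y)} samples, first value {y[0]:.2f}, peak {max(y):.2f}")
```

In the JupyterLab environment on the Hub, students can run and modify cells like this one directly in the browser, with no local installation.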

Choose Jupyter notebook as the assignment type in Moodle

First, start your JupyterHub through the plug-in in Moodle. Either create a new Jupyter notebook right in your JupyterHub or upload your work. Place any additional files you want to distribute with your notebook, such as data files, in the same folder on your JupyterHub. Once your assignment, in the form of a Jupyter notebook and optional accompanying files, is ready in your space on the Hub, you can include it directly in the assignment activity in Moodle: when you choose Jupyter notebooks as the submission type, Moodle shows you the folder tree of your Jupyter workspace on the Hub. Select a folder to distribute all the files inside it to your students as an assignment.

When students download the assignment, they select a folder in their Jupyter workspace to receive it. And when the assignment is finished and ready to be submitted, they again select a folder from their JupyterHub workspace to submit, which may of course contain more files than just the Jupyter notebook itself.

Students can not only use the JupyterHub through distributed assignments; they also get the same plug-in in Moodle to reach their space on the JupyterHub for their own coursework.

First users

During the fall term of 2021, the first pilot users used the JupyterHub for in-class exercises, for documenting and evaluating lab experiments, for entire homework assignments, and as a tool to complete part of an assignment. There are of course many more use cases, and we can even offer the use of Jupyter notebooks on the Hub in your exams.

Interested? Just contact us at jupyterhub@let.ethz.ch for more information and to activate the JupyterHub for your course in Moodle. For now, the Hub is not available by default for your course.

We look forward to welcoming new users across all departments!


Your mission: Build an intelligent boat

In a project-based course, students learned to apply materials knowledge and skills to the construction of a boat that could navigate an unknown terrain using artificial intelligence. We talked with the two lecturers Rafael Libanori and Henning Galinski, and the department’s Educational Developer Lorenzo De Pietro, to find out more about this innovative course in the Department of Materials (D-MATL).

Lukas Wooley and Sebastian Gassenmeier get their boat ready.

What triggered this experiment? 

Originally, we were inspired by AP50, a project and team-based introductory physics course taught at Harvard. We wanted to do more with problem-based learning at ETH Zurich and achieve a different kind of learning environment. Students tend to expect that lectures just “give the knowledge”, but there is so much more to teaching. We realised it’s important to teach students how to learn more efficiently and take more responsibility for their own learning. In this course, we give students scientific questions to answer themselves. We wanted them to start taking risks and to have the freedom to fail, which is what science is all about. It’s not just theoretical input. Interpersonal and technical skills are just as important as academic skills.

A hand is shown holding a small home-made boat. There are lots of electronic elements visible.

What exactly did you do? We applied for Innovedum funding, and when we were successful recipients we created a course that gives the students a project connected to materials science as well as other areas such as control and artificial intelligence. We received support from Antonio Loquercio in the control and computer-vision part; he is currently a postdoc at the University of California, Berkeley. Without him, it would have been very difficult to achieve the computational goals of the project. Students attended 4 weeks of theoretical classes and then started working in teams. The goal was to construct a model boat that can intelligently navigate a course, using the Materials Design Lab at D-MATL. We also employed PhD students as coaches to support the students.

What were the results? We had 16 students who completed the course in the spring of 2021. The challenges were big, so we were thrilled by the final outcomes. The students took it seriously, and at the end of the course there were four final boats. The students displayed great creativity, such as building small experimental set-ups along the way. They were able to solve problems on their own and in groups, and to learn from each other.

What is the student perspective? Students were frustrated initially because we took a passive approach to communicating knowledge, but by the end they saw the benefit of this approach. We believe that learning should strain their abilities and that it is iterative. It was wonderful that the course ended on a celebratory note, with functional boats that successfully navigated the terrain.

What lessons did you learn? We realised that in the future we need to spend more time explaining our approach to teaching and clarifying expectations right from the start. We also plan to pay close attention to the gender balance among our students, as we want to maintain a good mix as the course grows.

What are your plans for the future regarding this project? Due to the current curriculum revision projects in our department, there will likely be an increase in hands-on courses like this one. So, this course represents a new way of teaching, like a prototype for the new curriculum. The results will be looked at closely and are quite important for future decision-making in the department. Teaching this way is also a development opportunity for the lecturers. 

What first steps do you advise for others who are interested in doing the same? We think it is important that teaching is viewed as a design science, in other words that it benefits from careful planning and time. We recommend visiting other courses that already use this kind of approach and speaking with the course leaders to gain inspiration and practical ideas for implementing project and problem-based learning in your own course. We would be happy to share our experiences with others.


Pharmaceutical Case Studies: The Power of Moodle Quizzes

By Dominik Stämpfli, D-CHAB

The lecture series «Pharmazeutische Fallbeispiele» [Pharmaceutical Case Studies] is a compilation of seven 2-hour sessions for around 75 students of the BSc Pharmaceutical Sciences. Surrounded by lectures and lab work on basic science and pharmacotherapy in the third year of studies, our autumn series aims to showcase the complexity of and our fascination for later pharmacy practice issues, giving the students a new perspective on all the other courses’ material as well.

One of our primary learning objectives states that students should be able to analyse simple case studies from pharmacy practice and present, explain, and discuss them in plenary, based on their current pharmaceutical knowledge. To let students achieve this objective, we had already included group work and presentations, where they discuss their thoughts on a given case study with our and their own literature resources (e.g., which drug class is most appropriate for which kind of nausea).

In 2019, we realized that student participation dropped towards the end of the semester when the big exams of the other courses approached. The sessions were mainly visited by the presenting student groups, whilst their peers focused on learning for ECTS.

2020 challenged us to go digital. This simultaneously provided the opportunity to have Moodle support us in our different teaching elements. The Moodle group selection allowed our students to choose their own peers. Folders, surrounded by explanatory text, helped us embed the asynchronous learning material (i.e., preparatory reading).

Most importantly, however, we created a simple quiz with four questions for each of the seven 2-hour sessions, focusing on the day’s learning objectives. We specifically aimed to include questions on the preparatory reading, our frontal inputs, the presentations by their peers, and one additional pharmacy practice issue. Moodle badges allowed for a simple gamification of the quiz. We shortened our lessons and offered the time to complete the quizzes during the two hours to not increase the overall student workload.

Student feedback was overwhelmingly positive: They appreciated our efforts concerning the Moodle course, liked the variation with peer presentations, stated having fun completing our quizzes, and were happy about the interactive segments. There were still fewer students present live towards the end of the semester, but the completed quizzes suggest a shift towards asynchronous learning by watching the recordings when taking a break from learning for ECTS.

We will most certainly keep our Moodle course even when going back to physically present teaching. The students seemed to be engaged in the course material, asking us interested follow-up questions concerning the preparatory reading and even our quizzes. The administrative work for setting up the course was hefty, but well worth it. One piece of advice to my former self: cramped shoulders won’t help you troubleshoot issues in the Moodle group selector any faster.


The benefits of classroom visits (and how to make them effective)

By Tommaso Magrini, PhD student, Department of Materials

Visualisation of classroom visits in an online teaching scenario (Image: Tommaso Magrini)

It requires effort, time and theoretical preparation to be truly able to deliver a good lecture, to properly plan a class, or to assess the performance of your students. As I am writing this document, I am still in this process of learning. Nevertheless, the more I am involved in teaching, the more I can experiment with new techniques or approaches, keeping the structures that worked, adjusting those that didn’t. One technique I plan to keep is classroom visits with peers.

Over the course of my doctoral studies I enrolled in the program “Learning to Teach”. One of the most interesting things I have learnt is the concept of “learning by doing”. Not only is “learning by doing” more fun and engaging for the students, it has also proven to be the lecturer’s best ally in reaching the learning objectives. Step by step, I introduced different activities into my classes, ranging from simple short discussions to more advanced posters and presentations. These activities not only proved to be a good way to keep the students focused and active, but were also extremely useful for us as teachers to assess whether the concepts our class is built on are solid and stable in the students’ minds, or whether they need repetition or consolidation. To understand whether the activities I planned are meaningful and well aligned with the learning objectives, I asked peers to visit my classes, sit in, and provide me with feedback.

Over the last semester I started implementing a new activity at the end of each lecture. Expecting my students to build on the concepts seen in my class and restructure them into a broader and more applied context, I proposed the following activity: divided into groups, the students would come up with a shared idea, describe it schematically on a poster, and then pitch it in front of the class. This would then foster a discussion between ‘critical friends’, who would openly challenge each group’s idea with the goal of improving it and helping its realization. The classroom visit proved crucial in this phase.

As a lecturer, during such vivid and intense scientific discussions, I have to occupy several roles at the same time. Not only do I have to moderate the discussion, I also have to evaluate how deep and relevant it is, while assessing whether or not the students have reached the milestones and the key learning objectives. For this reason, the presence of my colleague, who sits ‘outside the discussion’ and evaluates the classroom’s response to the activity, is extremely important. While at the beginning he would only observe, at later stages we were also able to switch roles and evaluate the class activity in turns. Being able not only to take part in the discussion but also to observe it from outside and take notes gave me a more complete picture of the activity.

At the end of each class, I would always have a debriefing with my colleague. Its goal was to sum up the positive and negative aspects of the lecture in a constructive and unbiased way. These debriefings helped to correct the weak parts of the lecture and to expand on the positive ones. It was clear from the first session that the students were responding to the activity we had planned with a positive attitude and enthusiasm. Furthermore, through the classroom visits we realized that we could assess the classroom’s knowledge more efficiently by using ‘exam-like questions’ during the discussion.

In our experience, the ‘simple’ classroom visits have thus evolved into an open exchange of ideas, built on honest and constructive feedback, that has helped me improve my teaching style, the structure of my classes, and the design of more targeted and better structured learning activities.


How meaningful are clicker data?

Contributors: Meike Akveld (D-MATH), Menny Aka (D-MATH), Alexander Caspar (D-MATH), Marinka Valkering-Sijsling (LET), Gerd Kortemeyer (LET)

Among other things, ETH Zurich’s EduApp allows instructors to pose clicker questions during lectures. Instructors can interrupt lectures to ask the students questions and both get and give feedback on learning progress. Lecturers can also trigger phases of peer instruction, where students discuss their initial answers to a question with one another and then re-answer it – in effect, the students teach each other during those phases, hence “peer instruction”. By asking students to answer a question twice, lecturers gather data on student understanding. But how meaningful are these feedback data, in particular when answering is voluntary and ungraded?

A group of mathematics instructors at ETH’s D-MATH worked with LET to analyze EduApp data using Item Response Theory (IRT), Classical Test Theory (CTT) and clustering methods. Over the course of the semester, 44 clicker problems were posed – 12 of them twice, as the instructor decided to insert a phase of peer-instruction. The following figure shows an example of the kind of problem being analyzed:

Fig.1 Example of a clicker problem

The problem shown was used in conjunction with peer-instruction; the gray bars indicate the initial student responses, the black bars those after the discussion. A simple, unsurprising observation is that after peer-instruction, more students arrived at the correct answer. What can we learn from these responses? CTT and IRT can provide psychometrics that help understand this instructional scenario.

When it comes to being “meaningful,” the “discrimination” parameter of a problem is of particular interest: how well does correctly or incorrectly answering a problem distinguish (“discriminate”) between students who have or have not understood the underlying concepts?

CTT simply uses the total score as a measure of “ability”, but also has a measure of discrimination (“biserial coefficient”). IRT estimates the probability of a student arriving at the correct answer for a particular problem (“item”) based on a hidden (“latent”) trait of the student called “ability” – typically, higher-ability students would have a higher chance of getting a problem correct. How exactly this probability increases depends on problem characteristics (“item parameters”).
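As a rough illustration of the CTT discrimination measure just described, the sketch below computes the point-biserial coefficient (the sample correlation between a 0/1 item score and the total score, closely related to the biserial index mentioned above) on a toy response matrix. This is an illustrative sketch, not the study’s actual analysis code.

```python
from statistics import mean, pstdev

def point_biserial(item, totals):
    """Pearson correlation between a 0/1 item score and the total score:
    the point-biserial coefficient, a common CTT discrimination index."""
    mi, mt = mean(item), mean(totals)
    cov = mean((a - mi) * (b - mt) for a, b in zip(item, totals))
    return cov / (pstdev(item) * pstdev(totals))

# Toy response matrix: rows = students, columns = clicker items (1 = correct).
responses = [
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 0, 0],
    [0, 0, 0, 0],
    [1, 1, 1, 1],
]
totals = [sum(row) for row in responses]

# A positively discriminating item is answered correctly mainly by
# students who also score well overall.
for j in range(len(responses[0])):
    item = [row[j] for row in responses]
    print(f"item {j}: discrimination = {point_biserial(item, totals):+.2f}")
```

In this toy data, every item has positive discrimination: correct answers line up with high total scores, which is what “meaningful” feedback looks like in CTT terms.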

In IRT, the ability-trait is determined in a multistep, multidimensional optimization process, where the difficulty and discrimination parameters of particular problems (“items”) feed back on how much correctly answering that problem says about the “ability” of the student; “high-ability” students are likely to get correct answers even on high-difficulty, high-discrimination problems.
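The interplay of ability, difficulty, and discrimination described above can be sketched with the two-parameter logistic (2PL) model commonly used in IRT. The snippet below is a minimal illustration with hypothetical item parameters, not the study’s fitting code (which a dedicated IRT package would handle):

```python
import math

def p_correct(theta, a, b):
    """2PL item response function: probability that a student with latent
    ability `theta` answers an item of discrimination `a` and
    difficulty `b` correctly."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# Two hypothetical items of equal difficulty but different discrimination.
weak   = dict(a=0.5, b=0.0)   # flat curve: the answer says little about ability
strong = dict(a=2.0, b=0.0)   # steep curve: the answer separates students well

for theta in (-2.0, 0.0, 2.0):
    print(f"theta={theta:+.0f}:  weak P={p_correct(theta, **weak):.2f}  "
          f"strong P={p_correct(theta, **strong):.2f}")
```

The steeper the curve (larger `a`), the more sharply a correct answer separates high-ability from low-ability students; that steepness is exactly the “discrimination” the study examines.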

The results of the study were extremely encouraging: using both CTT and IRT, almost all 44 problems under investigation exhibited strong positive discrimination in the initial vote. This means that the better a student understood the underlying concepts, the more likely they were to give the right answer – and vice versa. A low discrimination, on the other hand, means a problem provides less meaningful feedback. For the handful of problems with lower (yet still meaningful!) discrimination, this could be explained by other problem characteristics, for example that at the time they were posed they were still too hard or already too easy – but even that feedback is meaningful to the instructor for future semesters.

The truly surprising result of the study was that in every case of peer instruction, the problem had even stronger discrimination afterwards! Yes, unsurprisingly more students answer correctly after discussing with their neighbors (the problem becomes “easier”), but peer instruction does not simply let weaker students enter the correct answer; it apparently helps them perform at their true potential.

For the purposes of the study, the clicker data had to be exported manually, but the next version of EduApp, slated to be released in December 2020, will allow export of data for learning analytics purposes directly from the interface – the following figure shows a sneak preview of that new functionality.

Fig. 2 The new “Learning Analytics” function in EduApp

The exported data format is compatible with input for the statistics software R, and a variety of guides is available for analyzing such data (https://aapt.scitation.org/doi/abs/10.1119/1.5135788, accessible through the ETH Library, provides a “quick-and-dirty” guide).
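The exact EduApp export schema is not shown here, but analyses like the above typically start by pivoting a long answer log into a per-student response matrix. The sketch below assumes hypothetical column names (`student`, `item`, `correct`); they are illustrative only, not the EduApp format:

```python
import csv, io

# Hypothetical long-format export: one row per submitted clicker answer.
export = io.StringIO("""student,item,correct
s1,q1,1
s1,q2,0
s2,q1,1
s2,q2,1
""")

# Pivot the answer log into a {student: {item: 0/1}} response matrix --
# the wide shape that CTT/IRT routines expect as input.
matrix = {}
for row in csv.DictReader(export):
    matrix.setdefault(row["student"], {})[row["item"]] = int(row["correct"])

print(matrix)
```

The same pivot is a one-liner in R (e.g. with `reshape` or a data-frame cast) once the exported file is loaded.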

The full study, including results from Classical Test Theory and clustering methods, as well as an outlook on new EduApp functionality, is available open access in Issue 13 of e-learning and education (eleed) at https://eleed.campussource.de/archive/13/5122.


Poor teaching evaluations for innovative teaching?


“With great commitment, I redesigned my course according to the latest findings in didactics. Whether in a flipped classroom or with increased use of clicker questions, the students were actively challenged during class and were happy to participate. But with the course evaluation came the great disappointment. The students rated me and my teaching considerably worse than before. They even prefer traditional lectures, claiming they learn better that way. Did I do something wrong? Should I go back to my tried-and-tested lecture?”

What sounds like an isolated case here is in fact very common. Numerous studies document that learner-centred teaching formats frequently lead to poor evaluation results (e.g. Seidel, 2013). In a recently published study, Louis Deslauriers and colleagues at Harvard University examined exactly this problem (Deslauriers, 2019). They investigated whether, for actively involved students, there is a discrepancy between actual and perceived learning gain. If students have the impression of having learned less than in a lecture, this inevitably leads to poorer evaluation results. The experimental study confirmed the negative correlation between self-assessment and effective learning gain. Three reasons are responsible for this:

  • Interactive teaching methods demand increased cognitive effort, which students do not necessarily associate with learning.
  • First-year students in particular do not yet have the ability to correctly assess their own knowledge in a new subject area.
  • The clear and eloquent presentation of traditional lectures tempts students to significantly overestimate their own understanding.

Based on the results of a follow-up study, Deslauriers and colleagues propose some quite simple measures to prevent this mismatch between perceived and actual learning gain. Central to this is taking students’ worries and fears seriously and addressing them openly. A brief presentation of the learning benefits of active teaching (e.g. Freeman, 2014) in the first lesson can already absorb initial concerns. The learning progress achieved should also be pointed out repeatedly throughout the course, giving students a better assessment of their own learning gain. It also helps to point out the danger of an illusion of learning created by an eloquent lecturer (e.g. Toftness, 2018).

At ETH, too, we were able to confirm the influence of these measures. In a study at the Department of Physics, we compared the learning gain between interactive teaching and a lecture. In the very first unit, we explained to the interactive group in detail the positive effects of interactive teaching. In addition, uncertainties about students’ own learning gain were addressed continuously throughout the semester. In the course evaluation, we then found no mismatch between effective and estimated learning performance. Students in the interactive format achieved a higher learning gain and reported significantly better ratings of their own learning than those in the parallel lecture (Schiltz, 2018).

Tip:
What applies here specifically to interactive teaching formats can certainly be transferred to any other change of learning format. Especially when the new format is not yet familiar, students’ initial concerns should be taken seriously, and the benefits they can expect from the change should be communicated clearly. It is also important to critically question the results of course evaluations (whether good or bad). A causal link between student satisfaction and actual learning success is not always given (e.g. Carpenter, 2020).


Does anything ever happen after those teaching evaluation surveys?

Maybe you know the problem. You want feedback on your teaching from your students. You want to know what they think went well, and what didn’t. Maybe you need their evaluations for future job applications. In any case, a reasonably representative amount of feedback and number of ratings would come in handy. But your students are sick of evaluations! They wonder why they have to fill out something that will be of no use to them – and nothing ever comes of evaluations anyway…

So goes the muttering of students. However, something actually does happen with student evaluations – even if most students aren’t aware of it. It is rare for someone to attend a course twice, so there is little opportunity to find out whether lecturers have implemented their students’ wishes. Therefore, to let students and others know what happens after a questionnaire is submitted and why evaluations are important to teaching quality, we have created a 3-minute video with the help of Youknow (specialists in explainer videos). Please show this video to your students and motivate them to take part in the survey! This is especially useful if you have the opportunity to conduct the evaluation in class.[1]

The challenge for us was to explain the entire comprehensive, rigorous evaluation process – from the survey, via the publication of findings, to the derivation of appropriate measures – briefly and appealingly. A bigger challenge was to be responsive to students and take their criticisms seriously, while also honoring the engagement of most lecturers. Whether and how well we have achieved this in three minutes of moving images is for you to decide!


[1] By in-class evaluation we mean that you schedule the time your evaluation survey is sent out to students, and you ask them during your lecture to fill in the survey via their laptop or smartphone.


Case Study – Peer Review Mastering Digital Business Models

As part of a series of case studies, staff at LET sat down for a conversation with Prof. Elgar Fleisch, Johannes Hübner and Dominik Bilgeri from the Department of Management, Technology, and Economics (D-MTEC) to discuss their Mastering Digital Business Models (MDBM) course.

What is the project about?

In the Mastering Digital Business Models (MDBM) course, Prof. Elgar Fleisch, Dominik Bilgeri, George Boateng and Johannes Huebner teach Master’s-level students a theory- and practice-based understanding of how today’s information technologies enable new digital business models and transform existing ones. The course features a novel examination mode: a video group project is introduced as a core element contributing to the overall course grade. In addition, students are asked to participate in a peer-to-peer review of the videos produced by other student groups, which is independent of the grading and is geared towards giving students insights into how other groups solved the challenge. The best-rated videos are then shared with the entire class at the end of the semester.

As part of this newly created examination element, course participants (in teams of two to three students) explain one of the major lecture topics (theoretical lenses) in the first half of their video. Then they apply the same lens by analysing a company, aiming to better understand its underlying business model. For fairness, companies are pre-selected and allocated to students. Every year, we choose a pool of interesting companies in the context of digital transformation, the Internet of Things, blockchain, e-health, etc.

What motivated you to initiate the project?

The core idea was to improve students’ learning success by using an examination format that not only requires learners to reiterate theoretical contents, but also apply the theory in a practical context. The students have different backgrounds, and do not necessarily have a strong business focus, which means that many of the concepts taught in class may be rather abstract. We used the video format and specific companies as case studies, because we think this is a good way to trigger curiosity, show concrete examples of modern companies in a compact form, and force students to reflect deeply upon theoretical frameworks compared to other examination formats.

How did you do it?

Aside from the weekly input lectures, we ask students to form groups at the beginning of the semester. We then provide a list of theoretical core topics from which each group can choose one. In addition, we randomly assign each group a case company. The theoretical topic first needs to be explained in the first half of the video and then applied to the case company in the second half. We thus used a prosumer approach, where students become part of the course because they create a small section of its content. The best videos are shared with the class and can be reused as additional learning materials for future cohorts. This set-up generally resulted in high-quality videos, perhaps also because students knew their videos would be used again.

Students also had to review the video projects of five other groups. They had to describe clearly, in their feedback, whether and how their peers used certain perspectives (called “lenses” in the course) in the video. In this way, they analysed once more how the newly learned concepts were visible in other companies – a positive side effect being that they also honed their reflection and feedback skills.

Did you have the support you needed for the project? Is there additional support you wish you had had to help you to achieve your goals?

We asked two students from previous cohorts to join us as tutors, and support this year’s groups primarily with technical questions about video-making (e.g. tools, quality considerations etc.). In addition, we designed one of the lecture slots as a coaching session during which we would further support student groups with their questions. In total, this approach allowed us to provide the students with high-quality supervision with reasonable effort.

Please describe some of the key outcomes of the project

To most students, the task of creating a video was new. We received feedback that while the initial effort for learning how to make a video was higher compared to other examination formats, it was also fun and very helpful to really understand and apply the new concepts. They said that they learned things more deeply and more sustainably because they had to consider all details and aspects – compared to the practical exercises they are familiar with in other courses. By carefully phrasing their arguments in giving feedback on peer videos, students became more aware of their own thinking and argumentation.

We observed that the questions students asked once they started creating videos were different and went deeper, i.e. their reflections were based on many concrete examples of companies, and the concepts were put into perspective. The same sub-concepts have a different meaning in another context, and students now see the overarching principles better and can argue more precisely about theoretical aspects. Without these concrete examples, it would have been harder to grasp the theoretical aspects concretely.

How did the project impact learners or the way in which you teach?

We were surprised by the high quality of the best student videos. The teaching team is now really motivated to continue innovating on our approaches in other courses. We saw clearly that when students are very active we get better results, deeper learning and better reflection.

What lessons learned do you want to share with your colleagues?

It can really pay off to try things and to experiment. We think that nowadays the classic format of passive lectures and final exams may not always be the best choice. We believe the improved outcomes through students who were actively engaged by the video assignment justified the investment in developing new approaches and tools.

When considering videos as an examination format, you should define the entire course/project very clearly. When describing what production options students have for videos, you should be very precise. Offering too many options can be counterproductive. It is better to present 3-4 crystal-clear examples and stick to them.

Also, we would recommend managing students’ expectations clearly at the beginning of the semester, and highlighting both the benefits and challenges of this examination format. Of course, this becomes easier after the first year, when you can draw from the experience of the first cohort, and also provide examples of prior videos to illustrate what is expected of the groups. Because the students are co-creators, you get new and relevant content which enriches the course and can serve to motivate both students and teachers.

What are the future plans for this work? How do you plan to sustain what you have created through the project?

We plan to optimize some details of this course, and to go even more in the direction of a flipped classroom in order to use this teaching approach in other courses. We will create a library of the student videos to provide them as additional learning material in future editions of the course.

Student feedback

By MDBM Student Cristina Mercandetti (mercandc@student.ethz.ch)

  1. Your opinion about this course and the peer review & video production process – how has it influenced your learning process?
    Cristina Mercandetti: I really enjoyed both the course and the video production process. I think they complemented each other very well and we were able to directly apply the theoretical knowledge learned in the course to work on our project. It helped me to think more critically about the course content, and really dive into some of the lenses and models presented. I don’t think this would have been possible without the video production, so it definitely improved my learning process.
  2. Do you think this approach could be used in other courses?
    Cristina Mercandetti: Yes, I think this approach could easily be used in other classes. However, I think part of the fun in this class was that the video production was something very new and refreshing (a side effect was that I learned how to cut a short movie). I imagine that if several classes introduced this it would lose some of its novelty and could be stressful, as it took a lot of time.
  3. Final remarks about the course
    Cristina Mercandetti: I really enjoyed the whole class, and heard a lot of good things from other students too.


Case Study – Peer Review Corporate Sustainability

As part of a series of case studies, staff at LET sat down to have a conversation with Prof. Volker Hoffmann (SusTec, the Group for Sustainability and Technology) and Erik Jentges (Educational Developer) from the Department of Management, Technology and Economics (D-MTEC) to discuss their corporate sustainability project.

What is the project about?

The course “Corporate Sustainability” aims to enable students to become advocates of sustainable business practices in their later careers. Each year it attracts 150-200 students with diverse disciplinary backgrounds and different educational levels (BSc, MSc, and MAS). We adapted the Six Sentence Argument (6SA) method for this course. The method focuses on enhancing critical thinking skills through structured writing and guided, double-blind peer-review.

What motivated you to initiate the project?

We wanted students to get a clearer picture of what sustainability really is. In the course, they develop not only a deeper understanding of corporate sustainability but also the skills to give and receive feedback.

How did you do it?

At the core are four topics that relate to the sustainability of corporations. These are assessment, strategy, technology, and finance. We developed digital learning modules (videos, some with interactive elements) that explain key concepts to support the most relevant and difficult parts of the lecture. Also, we want to develop students’ critical thinking skills. In e-modules, students learn to formulate concise and short arguments with the 6SA method. The core idea builds on the assumption that writing is thinking.

In the e-modules, students face a decision (a micro case based on the lecture content) and argue for their preferred course of action using a logical structure of exactly six sentences. Each sentence fulfils a specific function in the overall argument and has a 20-word limit. A clear grading rubric enables students to assess 6SAs in double-blind peer reviews. These have been continuously adapted and improved since 2015. The specialized online tool “peergrade” also helped us to conduct a smooth process – for both students and teachers.

Through the peer assessment, students engage critically with their peers’ arguments and receive constructive feedback on their own arguments. With the 6SA exercise, students learn to argue with clarity, and it helps them to reflect on the way they and others think.

During the second half of the semester, students work in diverse teams to prepare mock debates, consulting strategies, economic models and campaign videos. In this phase, they are coached by several postdoctoral and doctoral researchers from SusTec, the Group for Sustainability and Technology. The students then present their projects and display their skills in a group puzzle session and are debriefed in the following final lecture session. Students receive grades for both individual and group performance and can earn a bonus on their exam grade when completing the critical thinking exercises.

Did you have the support you needed for the project? Is there additional support you wish you had had to help you to achieve your goals?

The project received funding from different sources. This helped us to hire academic staff to assist the development of new teaching approaches and the production of high-quality videos. In addition, we received specialist guidance in the instructional design and production of videos.

Please describe some of the key outcomes of the project

With regard to our feedback modules, we think that the quality of the argumentation and peer reviews has increased over the years. For example, we learned that the effective design of such peer assessment exercises for students requires training on how to give constructive feedback and that it should involve several feedback loops to support the development and refinement of critical thinking skills. Overall, the course now integrates many innovative teaching elements and was a finalist in the 2018 ETH KITE award.

How did the project impact learners or the way in which you teach?

When students are able to write better, more concise arguments that convince critical readers, and if they can give constructive feedback on arguments that are made to justify strategic decisions, then they are able to actively shape good decisions in a company setting – they can be change-makers for corporate sustainability. The students were motivated by the new teaching approaches such as the supporting videos, interactive questions inside the videos, and the critical thinking exercises. Peer assessment is “homework” for the students, but they know that they can earn a bonus on their exam grade – and they are already rehearsing for some parts of the final exam.

With regard to students’ learning, the peer review process itself is convincing. What is unique to our teaching situation is the incredible diversity in the classroom. A 19-year-old Swiss environmental science student may be sitting next to a 25-year-old Chinese student who is pursuing a master’s degree in management, who in turn sits next to a 35-year-old American part-time student with a PhD in chemistry and a management position with responsibilities for 20 employees in a multinational company. Peer feedback is a powerful solution to bridge these gaps of different levels of experience and cultural backgrounds. It allows younger students to write a creative and brilliant argument without being intimidated by more senior students. It allows a shy and quiet student to gain confidence by formulating a convincing argument whose strengths are recognized in their peers’ feedback. It creates a space for older students to learn how to coach younger classmates with constructive feedback to improve their reasoning.

That is why at D-MTEC, we use peer feedback in other courses as well. Students learn more when actually giving feedback compared to when only submitting an assignment.

What lessons learned do you want to share with your colleagues?

At the beginning, it was a lot of work and many people were involved, but it was worth it. Today, with regard to the critical thinking exercises, we have continuously refined our processes. Every student writes three reviews, thereby ensuring that everyone also receives much more feedback than a single lecturer could provide. The main work for lecturers is providing an overview of the themes in the arguments and summarizing the activity for all students. This lets them know that their individual contribution becomes part of a collective intelligence. There are always truly smart and innovative solutions that need to be shared with the whole class. Also, there is little effort involved in re-grading/moderating student questions about feedback, because we train students to write helpful and considerate feedback and make them aware that they also have to learn how to receive feedback, especially feedback that they don’t want to hear but need to hear.

For the production of videos, we recommend planning enough time and engaging with video experts and instructional designers early on. Especially writing a concise script for a short video requires a surprising amount of time until it effectively conveys your key points.

If you are interested in applying these concepts in your own courses please contact LET.

Note: The project received funding from different sources (Innovedum, Emil Halter Foundation, ETH Critical Thinking Initiative).

Additional resources and comments


Case study – Peer Review Food Chemistry Laboratory – Writing reports


As part of a series of case studies, staff at LET sat down to have a conversation with Prof. Laura Nyström and Dr. Melanie Erzinger from the Department of Health Sciences and Technology to discuss their food chemistry laboratory project.


What is the project about?

We introduced a new way to write lab reports, combined with a peer review method to foster collaboration and critical thinking skills among students. In the past students did not have clear criteria as to what makes a good report. Assistants also needed too much time to read the reports and give repeated feedback. Thus we looked for a way to help assistants spend less time on the review process.

We transformed the format of our Food Chemistry Laboratory Course (Food Science, BSc level, 4th semester) from a classical lecture format with lab exercises to a blended learning format. With new videos, we can achieve better coverage of basic knowledge (e.g. lab safety, handling of equipment).

What motivated you to initiate the project?

Student numbers have increased over the past 10 years and we have been losing too much time in covering basic knowledge repeatedly. With concept videos, students can review key topics on their own. Overall, we also wanted to make the entire course more attractive. A key intention was to develop student skills in report writing and improve report quality.

How did you do it?

We defined additional, clear quality criteria for a good report. During a first round students give each other feedback, such that final review by teaching assistants and lecturer approval involve less effort. For each experiment, every student has to review another student’s report. In total, each reviews four reports over the semester.

Students don’t get a grade for the peer review (semester performance in the lab course is also ungraded). They have one week for each of the four peer reviews, and must complete each by the respective set deadline. They answer various questions related to the quality of the respective report (these involve five aspects plus overall feedback; see the annex at the end of this case study). Students do not “grade” the reports, but give feedback in their own words.

Assistants are aware of what is asked in the reports and are therefore able to provide targeted and helpful feedback in the lab which addresses the quality criteria for reports.

We provide the students with online material on how to write reports (short videos, documents etc.). Previously we had a short lecture with examples. Until now, however, we did not train them in conducting proper peer reviews. We have now realised that we need to do this (especially for Bachelor’s degree students), and will include peer review training with the short lecture next year.

Did you have the support you needed for the project? Is there additional support you wish you had had to help you to achieve your goals?

We learned about a module inside our LMS for administering the peer-review process (“workshop module” in Moodle). It would have been helpful to have had practical tips from others, but apparently not many lecturers have used this tool. Although the general instructions were useful, it took quite some time to learn all the aspects of the tool.

Please describe some of the key outcomes of the project.

Various things changed for the better. Students learned a lot by reading and reviewing the reports of their peers. They gained important input for their own reports. For many it was the first time they had had to give feedback in such a structured way. They also had to find a way to critique something in a good, constructive manner. Overall, students were introduced to a new way of critical thinking and took important first steps in this skill, which is important for their later careers.

We can say clearly that through the new review method we were able to improve the quality of reports and reduce the time needed by lecturers to grade them.

How did the project impact learners or the way in which you teach?

In general the peer review method was well received in the BSc course, and we have since used the same approach in an MSc-level course. Comparing the two, we realised that Bachelor’s degree students need more help and training in peer review than Master’s students.

Overall we saw that the blended learning approach and the peer review methods work to improve our courses, addressing the above-mentioned challenges of lack of student preparation and the need to constantly repeat basic knowledge. Students themselves clearly realised the value and potential of better collaboration, peer feedback and critical thinking skills.

What lessons learned do you want to share with your colleagues?

Not every cohort is the same. While things worked quite well in 2017, in 2018 fewer students adhered to the schedule and deadlines – even though everything was communicated and documented in the same way as in the previous year.

What are your future plans for this work? How do you plan to sustain what you have created through the project?

More and more assistants will become competent in providing full reviews of the already peer-reviewed reports. Currently lecturers still have to do this. Lecturers will thus gain more time to be present in the labs and to give 1:1 feedback to students in the lab and online.

We will definitely create some training material for assistants for this purpose, but it is not available yet. We also want to create a short, ready-to-use document about giving feedback in our specific context: what is constructive feedback, what are the do’s and don’ts? Students, assistants and fellow instructors can use it.

We are interested in learning whether other lab courses at ETH do something similar, and how. We also need to improve the support situation with the Moodle review tool “workshop module”. We will continue to work with it, but it is a bit tricky sometimes.

One additional idea is to make the videos interactive. Students will see in-built questions in the videos which they have to answer right away.

Additional notes regarding resources and tools used

  • We used a programme called Labster to create virtual labs in some cases, extending the experience to experiments which were not feasible in our physical labs.
  • We learned from other courses and departments regarding effective feedback (ETH “Foundations of Teaching and Learning” course).
  • To conduct the peer review we work with the “workshop module” in Moodle.
  • To make the videos interactive we will work with the new Moodle “interactive video suite” plugin.

Student voices:

What is your opinion about this course and the peer review process (lab reports). How has it influenced your learning process?

Robert Spiess: I think peer-reviewing was a great way to see other students’ work. It gave me the opportunity to experience and compare different ways of writing. I could always detect things that I wanted to include in my reports. At the same time, I could see in which points my reports were better, where my advantages were.

I think this procedure is particularly useful when writing. But the reports should not be too long, because, otherwise, students have to spend too much time on their own report and neglect, as a consequence, the peer-reviewing of someone else’s report. Other courses usually required longer reports. But if the reports were shortened, the method could also be applied in other laboratory courses (such as in the food processing or in the biotechnology lab course).

Aline Candrian: I’m glad I did the course, I think it gives students a first impression of laboratory work. The lab report writing is an essential part of the course to understand the experiment and the obtained data. The peer review approach was fine, even though nobody was eager to do the reviews. VERY little time was invested in peer review by most groups, as far as I’m concerned. Of course, sometimes you could benefit from your peer’s feedback, but most of the time we didn’t act on it. That’s probably because it was our first time writing (semi-) real reports. We didn’t really know what we were doing, and you mostly think you know better than others, especially if your report was reviewed by someone whom you consider less familiar with chemistry.

At the same time you’re well aware that you know nothing about report writing, so how can you evaluate someone else’s work?

Additionally, motivation was minimal since you were just glad to be done with writing your report. Having to assess another report and then correct your own report again was just another ‘burden’. So, altogether, I would say report-writing was a crucial part of the course but peer review not so much since we had no experience at all. I think peer review makes more sense in the courses in our last semester.

Making the students do a peer review only on the last report might work. They’ll see how it works, they’ll have written a few reports (and become more familiar with it) and might be more confident in providing feedback. But I’m not an expert, it may not work the way I envision it, what do I know 🙂
