Simon Bates and Ross Galloway tell us how students can use PeerWise to design high-quality and effective learning material

In a recent Endpoint1 piece, we briefly described our experiences with students in our classes creating their own assessment content, in the form of multiple choice questions (MCQs), using PeerWise,2,3 a freely available online tool. This article describes our experiences further, in the context of a JISC-funded Assessment and Feedback project grant, addresses issues of question quality and presents the case - and resources - for developing an online community of practice around the PeerWise system.


Benefits of student-generated content 


Fig 1: Bloom's (revised) taxonomy5

PeerWise was developed in the department of computer science at the University of Auckland. The system is an online framework in which students on a particular course can author rich MCQs related to their topic of study, and answer, comment on and rate those authored by their peers. It builds in much of the social functionality present in the recent wave of web 2.0 sites, such as the ability to rate and follow authors. It has been successfully deployed in a wide variety of educational contexts - from secondary schools to undergraduate courses - and across a wide range of disciplines.

There are evident educational benefits for students. Writing questions, especially those that demand more than mere recall of factual knowledge, is a real challenge for students, and probably not one they are used to facing. No less challenging is coming up with plausible incorrect answers and providing an explanation of the correct one. These activities - creation, analysis and synthesis - sit at the higher levels of the cognitive domain in models of learning such as Bloom's taxonomy4 (fig 1). Add to this the requirement to evaluate, judge and discuss the quality of their peers' work, and there is the potential to develop a broad range of advanced skills beyond simple content mastery of the discipline.

Key ingredients   

We have embedded the use of PeerWise into a variety of undergraduate courses at the University of Edinburgh. In practical terms, we have identified a number of key ingredients that, in our view, are highly likely to enhance engagement and outcomes when deploying this kind of activity with students. The first is to be clear with students about your rationale and purpose for including this kind of activity in the course. The second is to embed the activity in the continuous assessment for the course, typically weighted at a few percent of the total course credit. This helps to ensure good take-up among students, and signals that you, as the instructor, regard this as valuable intellectual activity worthy of assessment credit. Finally, and probably most importantly, provide students with appropriate support and resources to help them write good questions.

We believe these 'scaffolding' resources (see Examples of scaffolding materials and resources) are crucially important, both in terms of the help and support provided to students and in setting a marker for the quality of the submissions. In our course, we used a variety of in-class activities to introduce the PeerWise assignment, but we did not focus on how to use the system (students will find it intuitive and straightforward, so this would be a waste of class time). Instead, we spent up to 90 minutes of class time addressing what makes an (in)effective MCQ and how to identify troublesome concepts and topics from the course of study (see Examples of (in)effective MCQs). We also designed a pro-forma (see Pro-forma for question development) to support the development of questions, answer choices and explanations. All our resources are available online.6 Students working in groups studied a worked example and then went on to collaboratively author a 'practice' question. This seeded the question repository ahead of the individual assessment task: to author at least one question, answer at least five questions, and comment on and rate at least a further three questions.
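For instructors who want a quick check of whether each student has met these minimum requirements, the sketch below is one possible approach. It is illustrative only and assumes a hypothetical CSV export of per-student activity counts; the column names 'student_id', 'authored', 'answered' and 'commented' are our own labels, not part of PeerWise itself.

```python
# Minimal sketch (not PeerWise's own tooling): check whether each student met
# the assessment thresholds described above, assuming a hypothetical CSV export
# with one row per student and columns student_id, authored, answered, commented
# (where 'commented' counts questions commented on and rated).
import csv

REQUIREMENTS = {"authored": 1, "answered": 5, "commented": 3}

def check_completion(path):
    """Return a dict mapping student_id -> True/False for meeting all minima."""
    results = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            met = all(int(row[col]) >= minimum
                      for col, minimum in REQUIREMENTS.items())
            results[row["student_id"]] = met
    return results

if __name__ == "__main__":
    for student, met in check_completion("peerwise_activity.csv").items():
        print(student, "met requirements" if met else "incomplete")
```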

Examples of scaffolding materials and resources

We developed four scaffolding activities for students prior to setting an assignment using PeerWise. These activities were undertaken as part of a weekly workshop, where students worked on each task in groups, spending about 90 minutes in total. All of these materials are available for download from our website.


  1. A nonsense quiz that taught the language of MCQs (stem, options, key and distractors) and demonstrated how poorly written questions sometimes test nothing but language skills. (This test - colloquially known as 'The Grunge Prowkers Quiz' - was authored by Phil Race and is available online.)
  2. A pop quiz that helped students to explore their beliefs about thinking and guided them toward learning orientation and away from performance orientation. 
  3. A question template that introduced students to something we called the 'blue zone', probably better known as Vygotsky's 'Zone of Proximal Development.' This simplified constructivist model, along with information about common misconceptions and errors, encouraged the students to author questions of high cognitive value. 
  4. An example question based on the template to set the bar for creativity and complexity at a very high level. 

Examples of (in)effective MCQs

Much has been written about how to write effective MCQs, and how to avoid common pitfalls when creating them. Such advice is, in general, as applicable to students as question authors as it is to academic staff. While far from an exhaustive list, here are a few suggested ingredients for effective MCQs:

  • Between three and five possible answer choices is usually most appropriate.
  • Make sure there is only one unambiguously correct response, and be wary of overusing negatives in either the question or the responses. There is some evidence to suggest that non-native English speakers find cumbersome constructs such as double negatives particularly difficult.
  • Use 'none of the above' sparingly as an answer choice. Likewise avoid convoluted answer choices such as '(A) and (B) but not (C)'.
  • Attempt to make all possible answer choices approximately the same length. Answer choices that are markedly shorter or longer than all the others tend to stand out.
  • Require a level of thinking beyond simply memorising facts or 'knowing things'.
  • Finally, all answer choices should be plausible (ie not nonsensical).


Pro-forma for question development 

The pro-forma was developed by Karon McBride, a learning designer in the school of physics and astronomy at the University of Edinburgh. It encourages students to reflect on their own difficulties with the course material and, in doing so, to operate just above their current knowledge level, in their so-called zone of proximal development.

In our scaffolding activities, students worked through a worked example in groups and were then set the task of collaboratively creating a question using a blank version of the pro-forma. These initial group-authored questions were then used to 'seed' the question database.

Findings from a multi-discipline, multi-institution deployment

Following our initial pilot study using PeerWise in our courses in 2010-11, we obtained grant funding from JISC under their Assessment and Feedback strand7 to investigate different implementation models for PeerWise in different subject areas and at different levels. Through informal collaborations, we were able to incorporate a number of other institutions as well. We were not looking for a magic recipe, but rather examining a range of implementations to better understand the transferability of the benefits of student-generated content using PeerWise.

Our initial pilot study found a high level of engagement and uptake with the PeerWise assessment task. We found this experience replicated in different courses, at different educational levels and with varying assessment requirements in terms of the precise number of questions authored and answered. Students generally reported very positive engagement with, and enthusiasm for, the system, with only occasional resistance to the scoring system built into PeerWise (which formed the basis for apportioning summative credit to each student). There were rare instances of inappropriate questions or attempts to 'game the system' (eg question 'cartels' among groups of students).

For students on the introductory physics course at Edinburgh, statistical analysis showed a correlation between PeerWise activity (using a combined measure incorporating elements for question authoring, answering, rating and days of use) and end-of-course achievement in the final exam. Students who were more active on PeerWise tended to achieve higher examination scores than those who made less use of the system, and this was not confined to the 'better' students but was evident across the entire ability range.8 A similar finding has recently been replicated for second year students at the University of Glasgow.
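As an illustration only - the article does not publish the exact weighting used in the Edinburgh analysis - the sketch below standardises each activity component, combines them with equal weight, and correlates the resulting composite with final exam marks. All variable names and the example numbers are invented for demonstration.

```python
# Illustrative sketch of a combined PeerWise activity measure: z-score each
# component (questions authored, answered, rated, days of use), sum with equal
# weight, then compute the Pearson correlation with exam marks.
import numpy as np

def composite_activity(authored, answered, rated, days_active):
    """Equal-weight composite of z-scored PeerWise activity components."""
    components = [np.asarray(c, dtype=float)
                  for c in (authored, answered, rated, days_active)]
    return sum((c - c.mean()) / c.std() for c in components)

def activity_exam_correlation(activity, exam_marks):
    """Pearson correlation between the composite score and exam marks."""
    return np.corrcoef(activity, np.asarray(exam_marks, dtype=float))[0, 1]

# Made-up numbers for five students, purely to show the calculation
authored = [1, 2, 1, 4, 3]
answered = [5, 12, 6, 30, 20]
rated = [3, 5, 3, 10, 8]
days = [2, 4, 3, 9, 6]
exam = [52, 61, 55, 78, 70]

score = composite_activity(authored, answered, rated, days)
print(f"r = {activity_exam_correlation(score, exam):.2f}")
```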

Question quality 

When presenting and discussing these findings, we are often asked about the quality of the questions that students write. From an early stage of our pilot study, we had the general impression that the average quality was very good indeed, and that the highest quality questions were remarkably sophisticated. Detailed and extended discussions were found in the comments sections of many questions. We observed the student community within the system correcting each other's mistakes, and refining explanations and understanding. But was this just a distorted impression, based only on the particularly good examples we found when sampling the repository? Earlier this year, with the help of two final year undergraduate project students, we set about systematically cataloguing various dimensions of 'quality' for several hundred of the student-authored questions.


Fig 2: Taxonomic categorisation of student-generated questions in first year physics courses: Newtonian mechanics (light bars) and waves

Two principal dimensions of quality that we investigated were the cognitive level of the question and the quality of the associated explanation of the answer. To categorise the former (fig 2), we used the six cognitive levels of Bloom's taxonomy, referred to earlier. For the latter (fig 3), we devised our own five-point scale (0 - missing; 1 - inadequate/wrong; 2 - minimal; 3 - good; 4 - excellent). Following standard checks for inter-rater reliability, a total of 602 questions were coded. The sample results shown in figures 2 and 3 are from the 2010-11 and 2011-12 physics classes.
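The article does not name the particular reliability statistic used in those checks; as one common example, the sketch below computes Cohen's kappa for two raters' codes (for instance, the 0-4 explanation quality scale). The data are invented for illustration.

```python
# Illustrative sketch of one standard inter-rater reliability check:
# Cohen's kappa for two raters assigning categorical codes to the same items.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two equal-length lists of categorical codes."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    categories = set(counts_a) | set(counts_b)
    expected = sum((counts_a[c] / n) * (counts_b[c] / n) for c in categories)
    return (observed - expected) / (1 - expected)

# Example: two raters coding ten questions on the 0-4 explanation quality scale
rater_1 = [3, 4, 2, 3, 1, 4, 3, 2, 3, 4]
rater_2 = [3, 4, 2, 2, 1, 4, 3, 3, 3, 4]
print(f"kappa = {cohens_kappa(rater_1, rater_2):.2f}")
```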


Fig 3: Categorisation of student question explanations in first year physics courses

These results show that first year undergraduate students are capable of producing questions and explanations of a very high average quality - higher than those previously reported for other disciplines also using the PeerWise system.9 We suspect this has a strong link to the scaffolding resources that both supported and guided students in writing high quality questions and explanations. It is all the more impressive in the light of a recent examination of nearly 10 000 university-level MCQ assessment items used in biology at a variety of US institutions, which classified over 90% of them within the lowest two levels of Bloom's taxonomy.10 This work on classifying and understanding what conditions support high quality student question authoring will be continued and extended following the award of an HEA Doctoral Student grant to one of our colleagues in Edinburgh.11

Connecting the dots: an online community of practice 

We have presented our work on deploying PeerWise in our courses extensively, and there has usually been at least one person (often many more) in the audience who has enthusiastically approached us, wanting to try this in their own course with their students. People considering such an intervention often have the same, or very similar, questions: what is a reasonable number of questions to ask students to author for a 1-week assessment? How do you use the scoring system in PeerWise to derive an assessment mark? Should I be worried about students plagiarising textbook end-of-chapter questions?

What is needed to best support these and other teachers engaged in using PeerWise is a community of practice, which Wenger defines as 'groups of people who share a concern or a passion for something they do and learn how to do it better as they interact regularly'.12 To support this we have developed, in collaboration with Paul Denny, the original architect of PeerWise at the University of Auckland, an online community of practice: www.peerwise-community.org. The community is open to all, and our aim is for it to serve as a space where instructors can connect, share advice and ask questions about using PeerWise in their teaching. In addition, it will function as a single repository for PeerWise-related outputs in the form of publications, conference proceedings, posters and other documentation.

Quotes from students using PeerWise

Student feedback from using the PeerWise system has generally been very positive, with students well able to recognise the benefit to their learning.


However, it is important to recognise that this is a new and taxing activity for many students. Simply asking them to create questions without appropriate scaffolding and support is unlikely to lead to high quality engagement or questions. Even after providing such assistance, student comments still reflected this challenge.

Uptake of PeerWise

Since its initial release in 2007, the uptake and use of PeerWise has grown rapidly. By 2010, the system was already being used at 45 institutions, with 21 000 students contributing 57 000 questions. By October 2012, these figures had increased dramatically to 308 active institutions and nearly 100 000 unique student registrations, contributing over 3.5 million questions. In total, nearly 10 million answers have been logged by the system to date!

Simon Bates is the academic director of the centre for teaching, learning & technology at the University of British Columbia, Canada. Ross Galloway is a teaching development officer at the University of Edinburgh.