The inspiration for Crowdmark was the logistical nightmare of grading the 2011 Canadian Open Mathematics Challenge (COMC). Imagine banker's box after banker's box, each filled with FedEx envelopes holding dozens of exams, each exam comprising 14 pages of handwritten answers to mathematics contest problems. We relied on skilled human volunteers to assess 70,000 pages of handwritten math papers. Some of the volunteers could only mark pages 1, 2 and 3. Others would only grade pages 10, 11, 12, and so on.
We worked to match human skills to pages, with different teams grading on a page-by-page basis. Because the exam booklets were stapled together, we ran into serial processing constraints: in some cases, marking teams had to wait until another team finished grading its set of problems before the booklets could be handed over. There was a lot of paper shuffling. Thanks to superglue execution by Pamela Brittain, we got through it, but the inefficiencies were visceral. I felt that the assessment firepower of our volunteers could be put to better use.
Crowdmark launched through the UTEST accelerator, funded by MaRS Innovation and the University of Toronto, in May 2012 with two founders: Martin Muñoz and me. Using leveraged funds (Mitacs, VentureStart, OCE) and in consultation with Unspace Interactive, we built a software prototype for marking the 2012 Sun Life Canadian Open Mathematics Challenge. Working with the Canadian Mathematical Society, a partnership of eight universities used Crowdmark to grade the 2012 COMC in half the time compared to 2011.
Here are some aggregate statistics for the pilot deployment:
- 147 Markers
- 30 Facilitators
- 3,316 Exams
- 46,424 Problems Crowdmarked
Crowdmark achieved proof-of-concept as a scalable solution to the human assessment bottleneck plaguing education systems.
2012 COMC Marker Survey Results
The Toronto grading team was surveyed about its experience. Markers liked Crowdmark:
“I would like to use Crowdmark to mark another exam.”
- Strongly Disagree: 0%
- Disagree Somewhat: 0%
- Agree with Hesitation: 18%
- Strongly Agree: 82%
2012 COMC Marker and Facilitator Anecdotes
Sophie Chrysostomou, Senior Lecturer for Computer Science and Mathematics, University of Toronto Scarborough:
Amazing! We need to talk!
…crowdmark can be used to mark the term tests. Advantages: No flipping of pages, no adding up of marks at the end, no entering marks to a database, marking can be done from home, the papers are always locked up and safe (no fear of losing any, spilling coffee on them, etc).
Crowdmark worked very well today considering it was the first time it is used.
Robert Craigen, Professor of Mathematics, University of Manitoba:
I’d mark this as a success. Some room to improve, but a clear success. Each of my markers expressed their general satisfaction with it.
The system is quite nice.
From what I see this Crowdmark is a lovely and elegant app. I hope it runs as smoothly as it looks – it will make things very easy if it works as planned, or it could be a big pain if not. We’ll see. I look forward to the trials. Whoever wrote the thing is obviously better at anticipating user problems than many “university utility programs”, which can be many times more burdensome than their paper counterparts.
Pamela Brittain, Outreach and Special Projects Coordinator, Department of Mathematics, University of Toronto:
I found the Crowdmark system made huge improvements in the amount of data entry and cross-referencing required. It was also much easier to organize and assign markers to various questions, making the whole marking process much more streamlined, efficient and fast.
Anonymous Survey Respondents:
It is truly a great system that was a joy to use… Exceptional program!
It’s very useful when you need to mark huge stacks of exams. Also I prefer to work late at night, so it was always a pain for me to come to the early morning marking. Web-based marking is a great thing!
James Colliander is a Professor of Mathematics at the University of Toronto and Founder/CEO of Crowdmark.