Crowdmark Community Conference 2025

The Crowdmark Community Conference is an annual gathering of Crowdmark users interested in discussing innovative approaches to teaching and learning with a focus on assessment and grading for improved student performance. Learn from your peers and the wider Crowdmark community in this interactive event of user-presented sessions.

The 2024 Crowdmark Community Conference included virtual presentations by Crowdmark users highlighting innovative practices in teaching and learning. The focus of this conference was to connect, collaborate, and learn from peers and the wider Crowdmark community. 

Speakers included: 

  • Adam Finkelstein, Associate Director, Learning Environments, Teaching and Learning Services, McGill University
  • Dr. Pierre Sullivan, Professor, Mechanical Engineering, University of Toronto 
  • Cecile-Anne Sison, Instructional Technology Lead, Northwestern University
  • Michelle Caers, CEO, Crowdmark
  • Dr. Lindsey Daniels, Assistant Professor of Teaching, University of British Columbia
  • Dr. Corey DeGagne, Instructor, Computer Science, Dalhousie University
  • Dr. Sarah Chisholm, Senior Instructor, Mathematics and Statistics, Learning Centre Coordinator, Dalhousie University 
  • Dr. Erin McGuire, Associate Teaching Professor, Anthropology, University of Victoria
WHAT CAN I EXPECT TO LEARN?

The conference sessions will be based on the themes below:

Learn which evaluation methods using innovative AI technologies Crowdmark users are exploring in their classrooms.

Gain insight into how fellow instructors are using Crowdmark to elevate their grading and assessment processes in their classes. 

Learn about the principles, theories, and methodologies underlying the process of assessment in education. 

2024 Presentations

Streamlining Assessment in Large Courses with Crowdmark
Dr. Pierre Sullivan, Professor, Mechanical Engineering, University of Toronto. The presentation details integrating Crowdmark into Dynamics, a large first-year engineering course, focusing on the challenges encountered in supporting first-year students and the solutions implemented. Crowdmark has proven helpful for exam preparation, grading, and tracking student performance over time, and it significantly improved grading accuracy and consistency, making it possible to quantify results and the impact of different teaching approaches on student outcomes. A key lesson is that by understanding how Crowdmark works, instructors can significantly streamline the processes required before and after an exam is administered, maximizing both efficiency and effectiveness in the delivery of large-enrollment courses.
Enhancing Equity: Blind Team Grading of Staff Awards in Crowdmark
Cecile-Anne Sison, Instructional Technology Lead, Northwestern University. Seeking a more equitable way of assessing nominations for our annual staff awards, the Weinberg Staff Advisory Board (Northwestern University) implemented grading rubrics as a comment library and anonymized board grading of submissions in Crowdmark. This case study will explore the journey from grading wonderful words to rewarding the achievements of our colleagues.
Mentoring Early Career Academics With Crowdmark
Dr. Corey DeGagne, Instructor, Computer Science, Dalhousie University and Dr. Sarah Chisholm, Senior Instructor, Mathematics and Statistics, Learning Centre Coordinator, Dalhousie University. We will showcase a few ways that Crowdmark can be utilized to mentor early career academics with their marking practice. The focus will be on enriching the feedback for students, minding tone conveyed via text, and being aware of academic integrity.
Leveraging Text Analytics and Data Visualizations to Inform Teaching in Large Enrollment Courses
Dr. Lindsey Daniels, Assistant Professor of Teaching, University of British Columbia. Diagnostic assessments are often used to measure mastery of prerequisite skills and preparedness for a given university course. In first year undergraduate math courses, where enrollment can be in the hundreds or even thousands at some institutions, students enter with a wide variety of backgrounds, skill sets, motivations, and preparedness. Due to the size and heterogeneity of each cohort, it can be difficult to provide individualized feedback and action-oriented insights for students or instructors. The "Calculus Baseline Assessment" provides a scalable methodology to analyze student preparedness and evidence of activated mathematical skills through a multiple choice diagnostic assessment coupled with student text responses. In this talk, we will propose a framework for visualizing and interpreting results of this analysis and solicit your feedback. How might you, as an instructor, use these insights to better understand your students and inform your teaching?
Flipping The Classroom With Crowdmark
Dr. Erin McGuire, Associate Teaching Professor, Anthropology, University of Victoria. Student engagement in large classes is a challenge, exacerbated by the ongoing impacts of the COVID-19 pandemic. Motivating students to prepare and participate in classes, especially when recordings are provided for those not in attendance, is more difficult than ever. The University of Victoria’s Introduction to Anthropology (ANTH 100) is a large survey course that covers topics ranging from human evolution and archaeology, through to issues of gender inequality and racism. With two sections of up to 200 students each semester and an instructional team of one faculty member and four teaching assistants, experiential learning opportunities are difficult to sustain. I use a series of flipped classroom exercises to actively involve students in consolidating sets of concepts. The exercises are collaborative and inquiry-driven, making innovative use of Crowdmark and Prezi to engage students.
Leveraging Crowdmark to Improve Assessment Practices From Individual Courses to the Entire University
Adam Finkelstein, Associate Director, Learning Environments, Teaching and Learning Services, and Jasmine Parent, Learning Technology Consultant, McGill University. Crowdmark can enable new opportunities for improving assessment practices, both at the individual course level and across the entire university. This session will explore how instructors have successfully implemented Crowdmark to improve learning across several types of assessments in courses of different sizes and disciplines. In addition, we will explore how Crowdmark support at our university has transitioned to a central support model, and how this has both improved management and increased adoption.
What's on the Horizon
Michelle Caers, CEO of Crowdmark. Gain insight into, and contribute to, Crowdmark's product roadmap.

ABOUT THE EVENT

Who is the Crowdmark Community Conference for?

This online conference is for users to discuss the benefits of Crowdmark and the ways in which Crowdmark supports innovative pedagogical approaches. Sessions are user-created and user-presented. Learn with your colleagues about real-world innovations using Crowdmark.

2024 Schedule

    • 10:00 am to 10:05 am – Opening Remarks
    • 10:05 am to 11:00 am – Leveraging Crowdmark to Improve Assessment Practices From Individual Courses to the Entire University – Adam Finkelstein
    • 11:00 am to 11:30 am – Streamlining Assessment in Large Courses with Crowdmark – Dr. Pierre Sullivan
    • 11:30 am to 12:00 pm – Enhancing Equity: Blind Team Grading of Staff Awards in Crowdmark – Cecile-Anne Sison
    • 12:00 pm to 1:00 pm – What’s on the Horizon – Michelle Caers, CEO, Crowdmark
    • 1:00 pm to 1:30 pm – Break
    • 1:30 pm to 2:30 pm – Flipping The Classroom With Crowdmark – Dr. Erin McGuire
    • 2:30 pm to 3:30 pm – Mentoring Early Career Academics With Crowdmark – Dr. Corey DeGagne and Dr. Sarah Chisholm
    • 3:30 pm to 4:30 pm – Leveraging Text Analytics and Data Visualizations to Inform Teaching in Large Enrollment Courses – Dr. Lindsey Daniels
    • 4:30 pm to 5:00 pm – Closing Remarks