Crowdmark
Community Conference

2026: Coming soon

The Crowdmark Community Conference is an annual event bringing together educators to explore cutting-edge strategies in teaching, learning, and assessment. With a focus on improving student outcomes through innovative grading practices, this interactive conference features user-led sessions designed to inspire and connect members of the Crowdmark community.

Call for proposals 

The Crowdmark Community Conference is dedicated to furthering the conversation on authentic assessment, effective teaching, and successful learning. This one-day user conference will spotlight the most innovative and relevant practices in teaching and learning today.

For our 5th annual user conference, we invite you to submit a proposal for a presentation, panel discussion, or workshop.

How do you use Crowdmark to transform teaching and learning experiences in your in-person, online, or hybrid courses, and what impact has it had on your assessment and feedback processes?

WHAT CAN I EXPECT TO LEARN?

The conference sessions will be based on the themes below:

    • Addressing the impact of AI on grading, feedback, and the overall teaching process.
    • Gaining insight into how fellow instructors are using Crowdmark to elevate their grading and assessment processes in their classes.
    • Exploring the role of analytics and data-driven decision-making in shaping pedagogy and outcomes.
    • Got a bold, unique idea that doesn’t fit the mold? This is your chance to surprise us with something unexpected and impactful!

2025 Presentations

Beyond Student Feedback: Leveraging Crowdmark for Metacognitive Assignments, Question Design, and Educational Research
Phillip C. Delekta is an Assistant Professor in the Department of Microbiology, Genetics, and Immunology (MGI) at Michigan State University (MSU). This session explores how combining Crowdmark’s feedback features with metacognitive assignments can create structured opportunities for students to reflect on their essay writing and learning processes. We will also examine how Crowdmark can support instructors in addressing common teaching challenges, ranging from using its analytical data for quality control and improvement of exam questions to employing Crowdmark as an administrative tool for educational research.
Using Crowdmark to Support Standards-Based Grading for Better Transparency and Communication
Andrew Miller is a Professor of Mathematics at Belmont University in Nashville, TN. For decades, I used a traditional approach to grading my students’ mathematical work: see a mistake, take points off. Encouraged by many years’ experience reading AP Calculus exams and working with their scoring guidelines, and by Crowdmark’s implicit default of “additive grading,” I have recently begun experimenting with standards-based additive grading. In this approach, student work earns points as the response demonstrates milestone steps in the solution process, rather than losing points for each mistake. I have found that this approach helps me think through my expectations of student work more carefully and be more transparent with my students about grading. It also helps with grading some of those especially tricky student responses that barely seem to get off the ground. In this talk, I will give examples of standards-based scoring guidelines from several disciplines and from my own classes. I will also demonstrate how Crowdmark can support grading according to such standards and communicating these standards to students.
“What were they thinking?” Lessons learned from conducting pedagogical research with the Calculus Baseline Assessment
Caroline Junkins is an Assistant Professor, Teaching Stream, in the Department of Mathematics and Statistics at McMaster University. Lindsey is an Assistant Professor of Teaching in the Department of Mathematics at the University of British Columbia. In 2022, our team began developing a readiness assessment for students entering first-year calculus, with a design that interweaves traditional multiple-choice questions with “Explain your reasoning” text boxes. We deployed this assessment on the Crowdmark platform to over 1,700 students at McMaster University, then analyzed the results using methods from thematic analysis, statistics, natural language processing, and machine learning. We found that the assessment can provide instructors with a detailed snapshot of their incoming student cohort and help identify students who may benefit from additional support. In this talk, we will outline key lessons we learned as both researchers and teachers, and propose a model for implementing the “Baseline Assessment” framework in any large-enrollment STEM course.
Using Crowdmark for Collaborative In-Class Activities at Scale
Diana Skrzydlo is an Associate Professor, Teaching Stream, in the Department of Statistics and Actuarial Science at the University of Waterloo and the current Math Faculty Teaching Fellow. When I teach small classes, I often use collaborative in-class assignments that are both an assessment and a learning activity in one: students can use their notes, consult each other, and get clarification from the instructor and TAs, who walk around and interact with each group. The Crowdmark platform enables me to use these activities in larger classes and achieve the same benefits: more frequent review of material by students, increased engagement, a community of learners, and more interaction between students and course staff. In this session I'll talk about how I designed and administered the assessments with Crowdmark, the benefits for student learning, and best practices if you’d like to try them in a course of any size.
The Thought Lab: Using Crowdmark to Foster Low-Stakes, Active Learning in Large Biology Classes
Krystal Nunes is an Assistant Professor in the Department of Chemistry and Biology at Toronto Metropolitan University, where she works as a discipline-based education researcher. In large undergraduate biology courses, fostering meaningful peer interaction and active learning can be challenging. To address this, I have integrated Crowdmark into my teaching to facilitate a recurring in-class activity called "The Thought Lab." In each session, students engage with a problem or scenario designed to spark critical thinking and scientific reasoning. Working in self-selected small groups, students submit a collective response via Crowdmark. Responses are assessed for completion and effort rather than accuracy, contributing to a participation grade. This approach reduces performance anxiety, encourages exploration and idea-sharing, and supports a low-stakes environment where students can engage deeply without fear of grade penalties. Crowdmark’s compatibility with our institutional learning platform and its ease of use for both students and instructors have made it a valuable tool for scaling active learning in large classroom settings. This presentation will share implementation strategies, examples of Thought Lab prompts, and student feedback on the experience.
What's on the Horizon
Michelle Caers, CEO of Crowdmark. Gain insight into and contribute to Crowdmark's product roadmap.

ABOUT THE EVENT

Who is the Crowdmark Community Conference for?

This online conference is for users to discuss the benefits of Crowdmark and the ways in which Crowdmark supports innovative pedagogical approaches. Sessions are user-created and user-presented. Learn with your colleagues about real-world innovations using Crowdmark.

Schedule

    • Coming soon!