Welcome back! The Crowdmark team is excited to collaborate with educators to help students learn. This post provides an update on where Crowdmark is now and where the platform is headed over the upcoming academic year.
Crowdmark started as a productivity tool for paper-based exams and has since grown into a universal assessment platform. Crowdmark is in use today to collect, evaluate, and return student work in the humanities, social and physical sciences, engineering, and in my discipline of mathematics. Inspired by feedback from the instructors, teaching assistants, students, and administrators we serve, the Crowdmark team has designed and developed elegant improvements that help universities advance their teaching and learning mission.
Robust integrations with learning management systems streamline the transfer of student rosters, grading teams, and scores. Access to Crowdmark is now achieved through secure single sign-on methods controlled by the universities we serve.
Crowdmark recently launched enriched support for assignments. Instructors can author assignments using their own tools or inside Crowdmark and easily distribute them to students. Students can submit digital or paper-based responses of varying page length through an intuitive interface. Grading teams mark the submissions and leave rich text comments, and the evaluated work can be returned to students digitally without spending any class time passing back assignments. Support for exams with variable page length and improved distribution and collection services for distance education scenarios are coming soon.
Crowdmark delivers new insights into student learning, grading team performance, platform usage across a university, and the integrity of assessments. Descriptive statistics and visualizations of scores are available for each question within each assessment.
Authorized instructors can now access the collection of all scores data (together with student information from the LMS or uploaded metadata CSV files) through a secure REST API. Correlations among scores on various questions, and performance comparisons between students from different lecture sections or with different backgrounds, can be easily explored through the scores API. With their data available in an easily accessible format, universities and instructors can now build their own early warning systems and tools for measuring the effectiveness of assessment items.
Sample documentation for the Crowdmark API
Want to experiment with the API? Send an email to firstname.lastname@example.org and we’ll get you started.
The Manager view is a new interface used by administrators to get insights into Crowdmark usage patterns. Visualizations show the growth in numbers of assessments, students, pages uploaded, and grading team members over time. Grading activity timing data within each assessment are coalesced into university-wide reports on total hours spent grading and the time before an evaluated assessment is returned to students.
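To illustrate the kind of aggregation behind such a report, here is a small sketch. The event records, timestamps, and field layout are invented for illustration; the actual Manager view derives these figures from Crowdmark's internal grading activity logs.

```python
# Hypothetical sketch of coalescing grading-activity timing data into
# summary figures: total hours spent grading, and the average time
# before an evaluated assessment is returned to students.
# All records below are invented example data.
from datetime import datetime, timedelta

# (assessment, grading started, grading finished, returned to students)
events = [
    ("Midterm 1", "2016-02-01 09:00", "2016-02-02 17:00", "2016-02-03 12:00"),
    ("Midterm 2", "2016-03-10 10:00", "2016-03-11 15:00", "2016-03-12 09:00"),
]

fmt = "%Y-%m-%d %H:%M"
total_grading = timedelta()
turnarounds = []
for name, start, done, returned in events:
    start_t, done_t, ret_t = (datetime.strptime(s, fmt) for s in (start, done, returned))
    total_grading += done_t - start_t          # hours the team spent grading
    turnarounds.append(ret_t - start_t)        # grading start to return

print("Total hours grading:", total_grading.total_seconds() / 3600)
avg_turnaround = sum(turnarounds, timedelta()) / len(turnarounds)
print("Average turnaround:", avg_turnaround)
```

A university-wide report would simply run the same aggregation across every assessment on the platform.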
Example usage data from the Manager view
Exam Matcher app
A university’s reputation is built upon the integrity of its assessments. Crowdmark’s auditable log of events related to submissions improves transparency for digital and paper-based homework. Crowdmark’s mobile app, Exam Matcher, can be integrated with ID card databases, allowing universities to build new systems that validate student identity and improve the integrity of examinations.
The mission of Crowdmark is to transform traditional assessment into an enriched dialogue between students and instructional teams. New methods to extract data from pages, analyze the content of text comments, and measure the reliability of scoring are under development for 2016–2017. At Crowdmark, we build resources to help teachers to teach and students to learn. I thank the instructors, teaching assistants, students, and administrators we serve for their feedback and inspiration.