Tech roundup: Streamline test grading and returning

SFU has entered into a trial run of CrowdMark, which looks pretty awesome.  And is Canadian!!  I want to try it out.

(Post updated Oct. 2016 after using it for my midterm with 460 students.)

TL;DR: A platform for streamlining test grading, mark inputting, and returning.  This software is awesome.

TL;DR #2: If you’re a CrowdMark company person, I’ve got a few feature requests below, because your platform is awesome, and I can also see further potential!

My plan:
  • Scan midterms and upload them to the CrowdMark server.  Then I can divide out the questions between TAs, they can mark electronically, and we can hand the midterm papers back on Canvas.
    • Modified plan for this dry run (see below for why): Grade on paper, use CrowdMark for inputting grades and returning papers to students.
  • Test run context: Use this in my small course (160 students, 4 TAs), to try it out for when it will be more helpful in the fall (400+ students, 8-ish TAs)
Why I’m intrigued:
  • Marking:
    • No trading of papers or waiting for other TAs to be done with the stack of papers
    • No sorting of test pages to divide up among TAs.
    • No worries about regrade requests by students who change their answers
    • You can mark wherever you want (tablet, laptop – maybe even phone but I’m not sure about this)
    • Test papers won’t get lost or nacho-stained by graders who mark while eating
  • Inputting grades:
    • Entering grades is way more efficient.  No hunt-and-peck in Excel.   Automatically exports grades to Canvas.
    • Don’t have to sort the test pages alphabetically! At all! Ever!
  • Returning papers:  (optional)
    • Giving tests back is done by email
    • Everyone gets their test back!
    • Nobody has to hunt through a stack of (other students’) papers to find their own — good for privacy concerns and for convenience.
    • If you don’t want to give tests back, that’s still fine – it’s not required.  At least you won’t have to keep boxes full of paper sitting around in your office for years, and you saved all that sorting/inputting time.
  • Other uses?
    • In-class worksheets (that I print out): Input participation grades more quickly, return them to students without keeping stacks of paper
    • Assignments (that they print out): grade for marks or participation.
Possible disadvantages:
  1. TAs seem to prefer grading on paper over grading on laptops (varies by TA)
  2. Investing time to learn a new software setup
  3. The platform is not cheap.
  4. Data is stored in the United States, so for Canadians, this requires FIPPA compliance, which is just a small additional hassle.
  5. If everyone gets their test back, we’ll really have to make sure that our grading key is good, and that I’ve done a good job training my TAs to grade well.  This will also open up more feedback discussions with more students, rather than just the ones who would have picked up their test.  …Wait a minute, these two are actually advantages in disguise: accountability and universal feedback?  awesome!
  6. … Not really anything else.  I’ve got a few feature requests/ideas, because this is already awesome and also has great potential.

Here’s what my workflow has looked like (extremely wordy, though this is actually quick to do.)

Word on the street – how I heard about it

(only listed because I’m interested in how pedagogy/tech ideas get spread through academia)

  1. Faculty meeting – it was brought up by my awesome Biology Department Chair.  Apparently we’ve got a trial version in the hopes that SFU buys in.  It’s already integrated with Canvas (our LMS).
  2. Went and talked with Brenda Davison, who’s been using this in Math for their midterms/finals.  Thanks for the run-through!
Test prep
  1. Set the midterm like usual – some short answer, some multiple choice.
    1. CrowdMark doesn’t have OMR* (bubble/scantron) technology yet.
    2. In the meantime, for multiple choice I’ll use FormScanner, which is free and which I’ve been moderately pleased with before.
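
Scoring that multiple-choice output is easy to script once you have a CSV of responses. A minimal sketch, assuming a FormScanner-style CSV export with one row per sheet; the column names (`student_id`, `Q1`, …) and the answer key are made up for illustration, not FormScanner’s actual format:

```python
import csv
import io

# Hypothetical answer key: question column -> correct option
KEY = {"Q1": "A", "Q2": "C", "Q3": "B"}

def score_rows(rows, key=KEY):
    """Return {student_id: number of correct answers} for response dicts."""
    scores = {}
    for row in rows:
        sid = row["student_id"]
        scores[sid] = sum(1 for q, ans in key.items() if row.get(q) == ans)
    return scores

# Example with a made-up two-student export
sample = io.StringIO(
    "student_id,Q1,Q2,Q3\n"
    "301001,A,C,B\n"
    "301002,A,B,B\n"
)
print(score_rows(list(csv.DictReader(sample))))  # {'301001': 3, '301002': 2}
```
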
  2. Upload test to CrowdMark to get barcoded.
    1. I found this to be super fast, clean interface, very intuitive for me.
      • Realize that the barcode overlaps with my test questions; have to delete assessment and do over a few times.  (I could just use their template, but I’m trying to keep the # pages low so the page real estate is very valuable and the template has a huge header)
        • Feature request**: Let me do a practice run uploading, so that I can see what the barcoded test will look like without having to delete the whole assessment to try again.
        • Feature request: Let me print out a number of documents that is smaller than the number of students (e.g. for group tests where there are 4x fewer pages needed than students)
      • After a couple re-formats to make this work, midterm is finally uploaded; file is barcoded.  This is really quick! It took CrowdMark like 20 seconds for a 2-page midterm.  This is awesome.
  3. Download the barcoded file and print it out.
    • I’m lucky that this is a single double-sided sheet, because I don’t know how to tell the photocopier to take a 338 page document and staple every few pages together.
    • Feature request: Give me the option to download a zip file of assessments as individual pdfs.  That way, I can tell my photocopier to staple them.  On a mac, I can probably do this myself in Automator or Acrobat though.  Or I’ll pay the $3 for PDF Toolset – the trial version works great.
      • This is still a big hassle on my photocopier, but that’s not Crowdmark’s fault; it’s due to a crummy printer driver for Minoltas and Macs.  Just something to be aware of – make sure you can figure this out or you’ll spend way too much time copying, if you’re like me and your test is never ready early enough for sending off to print.
    • Note to self for students who write at the centre for disabilities/accessibility: Need to give them individual tests (with individual barcodes) rather than one test that they can copy a few times. Make sure I don’t use the same barcode twice (e.g. for CSD student and for a student writing with the rest of the class).
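
In the meantime, the zip-of-individual-PDFs idea can be approximated with a short script: split the one big barcoded file into per-booklet PDFs that the photocopier can staple. A minimal sketch in Python; it assumes the third-party pypdf package, the file names are hypothetical, and `split_pdf` is only defined here, not run:

```python
def chunk_ranges(n_pages, pages_per_booklet):
    """Split page indices 0..n_pages-1 into per-booklet (start, end) ranges."""
    return [(i, min(i + pages_per_booklet, n_pages))
            for i in range(0, n_pages, pages_per_booklet)]

def split_pdf(path, pages_per_booklet=2):
    """Write one PDF per booklet next to the input file (not called here)."""
    from pypdf import PdfReader, PdfWriter  # pip install pypdf
    reader = PdfReader(path)
    for n, (start, end) in enumerate(
            chunk_ranges(len(reader.pages), pages_per_booklet), 1):
        writer = PdfWriter()
        for p in range(start, end):
            writer.add_page(reader.pages[p])
        with open(f"booklet_{n:03d}.pdf", "wb") as out:
            writer.write(out)

# e.g. a 338-page file of 2-page midterms splits into 169 booklets:
print(len(chunk_ranges(338, 2)))  # 169
```
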
  4. For Canadians: Set up a student survey to ask for consent to store data on American servers (for FIPPA compliance).
    1. Can run the survey for bonus marks, or (in Canvas) can make survey completion (irrespective of yes/no consent) a prerequisite for opening up the rest of the course content.  I chose the latter option, and it worked fine.
    2. For the few students who don’t consent, this is not a big hassle for grading — just pull their tests from the paper stack, and grade them on paper.  Summer: Only a couple students out of 160 chose this option.  Fall 2016: Zero students out of 460 chose this option.
Run the test
  1. Just like usual.  Only one question typo identified!  Hurrah!
    • Make a note to students not to draw on the barcode, if they want their marks.  On my original test file, I put a box around the barcode indicating this.  I believe CrowdMark has templates available as well.
    • Reminder to self next time: assemble my test documents into packages to make setup way faster.  Package for me includes: CrowdMark-barcoded short answer question page, multiple choice question booklet, bubble sheet.
    • Feature request: OMR bubble technology within so that I only have to have one document for handing out and grading.
  2. Get the grading key together, share/discuss with TA.  
    1. TA grades a subset, then we’ll discuss together, then he’ll grade the rest.  If we were doing the grading online, this ‘pre-grading’ step could be done in the platform.
Details from learning the platform, as a first try

TL;DR: the learning curve is pretty small.  Much of this I won’t have to do next time, but some feature requests emerged from figuring it out.  Skip to below if you’re not interested in this.

  1. Take one of the blank midterms and fill it out as a test run.
  2. Scan the test run
    1. Regular settings on photocopier – I think it was 200dpi, auto colour.
  3. Upload the test run to CrowdMark.  This worked really quickly and was quite intuitive for me!  Awesome.
    1. Feature request: include a demo student automatically in all assessments, so that I can test this out; I don’t want to have to add one with a fake email address.  Bonus: this would also allow me to set up the grading splitting prior to assessments being handed in.  (maybe I can already do this; it was hard to know here because I’ve inadvertently only got one question page.)
    2. Sub-feature request: I want to remove these test pages from CrowdMark after I’ve made sure everything works, so that they’re not included in the ‘complete assessments’ list.
  4. Realize that I have messed up by putting assessment questions on the cover sheet.  Oops!  CrowdMark can’t handle this, and in retrospect the documentation made that really clear.
    1. Feature request to self: Future Megan, please remember to read the documentation before diving into new things!  Sincerely, Past Megan.
    2. Feature request to CrowdMark: Please let me put assessment questions on the cover sheet.  I would like to keep the # of pages small (for trees, and for avoiding the annoyance of stapling then cutting off staples.)  I’m not too worried about TAs seeing the student name.  I’d also like to be able to use this for inputting participation grades on run-of-the-mill worksheets, which are often single-page, front and back.
    3. Sub-feature request to CrowdMark: If I were worried about TAs seeing the student name, couldn’t you just set it up for me to set a ‘blackout’ area on the cover page that the graders can’t see?
  5. Talk with TAs about marking on laptops versus on paper.  Some want to mark on paper.  This works out well because we have to mark the first page manually anyways this time (see #4 above).
    1. Feature request: I would like the option to grade manually and then scan for grade inputting and distribution (without having to type in grades in CrowdMark).  OMR bubbles would be awesome for this – when grading manually, fill in a bubble for the question score, and then I would just have to teach CrowdMark to automatically record the bubbled score, and show me any bubble-errors to manually check.  I acknowledge that this would miss out on CrowdMark’s awesome setup for dividing /assigning the grading of different questions.
  6. Because of this situation, my TA will mark manually, then we will scan and upload.  We’ll use CrowdMark to input the grades and return the graded tests to students.  This is still awesome! And an improvement over typing into Excel and handing back paper copies.
    1. Feature request: Let me have multiple questions (or at least multiple subtotals)  on the same page.  Then it’d be easier for me to distribute this out to multiple graders, and I’d also get per-question data rather than per-page data.
Marking:
  1. Add the grading TA to CrowdMark (via an email invitation).
    1. Feature request: let me assign a TA as both an uploader and a grader (but not instructor/facilitator), rather than just one role.
  2. Tell CrowdMark the question values on each page.
    1. I only had one TA so I’m not sure how to divide/assign out questions in Crowdmark.  Will try that next time when I’ve got multiple TAs marking.  Given the rest of their interface, I expect this is going to be pretty intuitive.
  3. Scan the graded tests, and upload them.
    • I scanned these using my department photocopier, scanned onto a stick.  Normal scan settings (200dpi, auto colour); 312 pages is a 144MB file.  Took about 20 minutes of scanning once I figured out the right settings for saving the file from the copier.
    • If you have a stapled test booklet, you need to cut off the staples — there is a barcode on each page, so students don’t need to write their name 500 times on the test.  This is actually easier to do with scissors than with a paper cutter.
    • CrowdMark figures out page orientation and order, so you don’t need to sort at all.
    • Upload took 30 seconds on a good internet connection;  CrowdMark took 5 minutes to process the files.
      • Also, how good is it at recognizing its barcode?  What’s the interface for when it has trouble (i.e. when students draw on the barcode)?  None of my students wrote on the barcode, and I had no problems with barcode recognition; I didn’t specifically test this issue out, so I don’t know.
      • Occasionally students do draw on the barcodes despite being asked not to; CrowdMark has quite a nice system for fixing this and assigning the paper to the student, easy peasy.
  4. Connect the barcodes with the student information in CrowdMark
    1. This was quite intuitive; you begin typing a student name and it finds the student in the class list.  They claim that a 1000-student class can be matched in 1 hour.  I can say that a 160-student class took 28 minutes (including getting up to speed on my first time doing it, and making the odd note here and there).
    2. Things I noticed:
      • Your keyboard shortcuts are great.  No mouse = very fast!
        • Except that it’s annoying to delete comments.
      • For me, matching by name (then double-checking some digits of student ID) was fastest.
      • I really wouldn’t want to do this without a number-pad keyboard.
      • The software is smart! After a student has been matched, they no longer show up in the top hits.
      • Also smart for double-checking: once the search is narrowed down to one student, it shows that student’s full info.
      • Some nitpicky feature requests to help me get faster:
        • When the black box pops up for me to confirm that this is the correct student, the last name often gets cut off.  It would be nice not to have this happen, since that’s usually my double-check term.
        • If I type the last name then the first name, it doesn’t find the student.  (I think it’s treating my whole search as a single string, rather than as two separate terms?)  Please fix this, so that I don’t have to clear the search box when the last name isn’t enough to narrow it down.  (It’s faster for me to keep typing than to sift through the list.)
        • The primary info I have about the student is not their email address.  Can the main search window instead show name and student ID?
        • Many students have both an English first name and a native-language first name (often Asian names, but also nicknames, preferred middle names, etc.).  The students’ preferred name doesn’t seem to be in the column that is pulled from Canvas.  Can CrowdMark also (automatically) pull the ‘preferred name’ column from Canvas?  (I see that I can add this myself using metadata, which is a really nice feature – definitely will do when I’m running my bigger class in the fall.  Could also use this for group names, etc.)
      • Main feature request here: OMR bubble so that I don’t have to do this matching step at all.  Like so: They bubble their student ID, I teach CrowdMark where the student ID field is, CrowdMark checks the IDs against the class list, and flags any mismatches/missing/errors.  Ta da! Then I’d be manually checking just a subset rather than a whole class.  From what I understand, you’re already working on this. 🙂
      • Feature I noticed, that is awesome: Can put multiple people on the same test!  Group exams, group worksheets, that’d be great with this!!
        1. Side note: I’m interested in trying out CrowdMark’s “Exam Matcher” app for matching during the test using a phone, when I’m checking their ID cards; I hope to try this for the midterm!
        2. Question: Can multiple TAs use the Exam Matcher app at the same time, so that I can divide up the task of checking IDs?  Apparently it works like this (from their website) – it’d be good to see how it goes in reality.
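
The flagging logic behind that bubbled-ID feature request can be sketched in a few lines: check each decoded student ID against the class list and surface anything a human should look at. Everything below (function names, data) is made up for illustration; I have no idea how CrowdMark would actually implement it:

```python
def match_bubbled_ids(bubbled, class_list):
    """Match decoded student IDs to a roster; flag problems for manual review.

    bubbled: {paper_barcode: decoded_student_id or None if unreadable}
    class_list: {student_id: student_name}
    Returns (matched, flagged) dicts.
    """
    matched, flagged = {}, {}
    seen = set()
    for barcode, sid in bubbled.items():
        if sid is None:
            flagged[barcode] = "unreadable bubbles"
        elif sid not in class_list:
            flagged[barcode] = f"ID {sid} not on class list"
        elif sid in seen:
            flagged[barcode] = f"ID {sid} matched twice"
        else:
            matched[barcode] = sid
            seen.add(sid)
    return matched, flagged

# Made-up roster and papers: one clean match, one unknown ID, one unreadable
roster = {"301001": "A. Student", "301002": "B. Student"}
papers = {"BC-1": "301001", "BC-2": "999999", "BC-3": None}
print(match_bubbled_ids(papers, roster))
```

Only the flagged papers would need a human pass, which is exactly the "check a subset rather than a whole class" win.
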
    3. Use CrowdMark to input the grades.
      1. Notes on the grading interface:
        • TAs can see the barcode, but not the student information here.
        • It’s nice to see that the TAs can flag/tag individual papers, if there are questions about the grading.
        • It is really smart – you can mark some on paper, and some online and it’s no problem.  Electronic annotations don’t get lost if you upload the files a 2nd time.
      2. For grading this time, we just used this system to input marks that we’d written on the page.  (I got the TA to write them in the top right corner, for easiest mouse-free inputting.)  I probably wouldn’t have thought of this if I hadn’t done a trial run with a test page, so it made me glad that I had done so.
        • TA who was inputting grades was pleased with the system (he said it is “really slick”); it took him 17 minutes to input 160 marks.
        • Mouse-free!  Type number, hit enter to save it; hit enter to go to the next unscored paper.  Doing this with a number-pad keyboard is best.
        • Update from later semesters: The TAs liked the system.
          • No paper sorting!  No losing papers!  Can work anywhere!
          • We did our calibration marking (to get used to the question, and to make sure the key is appropriate) on a subset of papers, then the actual marking in CrowdMark.  This worked well – it served as a checkpoint for our TA-instructor conversation.
          • Somewhat annoying on a laptop — they ended up commenting (which is a little clunky) rather than drawing.  Works great on a tablet.   Idea: Can we get some departmental or media-services loaner tablets to lend to the TAs for their grading?
          • Idea for CrowdMark: can we get threaded comments in the tagging/flagging of papers?  Similar to track changes in Word, so that a paper can be marked ‘resolved’ without deleting the tags.
          • As an instructor, can I sort the grading grid by tagged/untagged, rather than by number?  This would float to the top the papers I need to look at.
Returning the papers:
  1. Amazing.  They get their tests back by email.  This is awesome.
    • I used it for final exams as well, even though I don’t give their tests back.  Still worth it for the many other reasons.
  2. I don’t directly upload grades to Canvas, since I like to double-check my grading in excel.  But this feature is nice for those who do.
  3. Hurrah!  All done!

 

*Replace OMR with OCR, throughout, if you want to get fancy; but I’m not picky!

** Some of these features may already be available; I haven’t looked for all of them yet.  This is a bit of a note to myself to try checking these things out.
