Case Study: Carnegie Mellon University
Autograding Excel Spreadsheet-Based Assignments on Vocareum
An interview with Carnegie Mellon’s Teaching Professor, Dr. Hakan Erdogmus
Hakan Erdogmus
Teaching Professor of Software Engineering at Carnegie Mellon University’s Silicon Valley Campus
Hakan Erdogmus is a Teaching Professor at Carnegie Mellon University’s Department of Electrical and Computer Engineering. He is a co-founder and the current lead of the department’s professional graduate program in Software Engineering, offered at Carnegie Mellon’s Silicon Valley campus. Dr. Erdogmus is a co-editor of two volumes, Advances in Software Engineering: Comprehension, Evaluation and Evolution and Value-Based Software Engineering. He is a past recipient of the Eugene L. Grant Award from the American Society for Engineering Education. In 2016, he received the Dean’s Early Career Fellowship and the Philip L. Dowd Education Fellowship Award from Carnegie Mellon. His current research focuses on empirical software engineering, software engineering education, and software security. He co-chaired the software engineering education and training track at the 2021 International Conference on Software Engineering.
In this case study
In this case study, Dr. Erdogmus discusses how transitioning to Vocareum’s autograding helped him save time and reallocate his resources to providing more meaningful feedback to his students. It also allowed him to scale his courses up to a larger number of students.
Please tell us a little bit about yourself.
I have been teaching an interdisciplinary course titled Decision Analysis and Engineering Economics for Software Engineers. The course combines and applies various techniques from finance, statistics, data analysis, economics, and decision theory to software engineering decisions across the whole spectrum, from technical decisions at the architecture and design levels to product, tool, and process decisions at the managerial level. It is a unique, applied course that builds on structured case studies and has a quantitative orientation.
How were you introduced to Vocareum?
I was introduced to Vocareum Notebook by a colleague from the School of Computer Science. I was preparing a new course on data analytics and needed a platform that allowed students to work with Jupyter notebooks effectively.
What were you looking for in a platform for autograding spreadsheet assignments?
Many of my labs and assignments are conducted using spreadsheets. Students are given relevant data and a solution template in a spreadsheet. They complete the assessment using the given spreadsheet and submit it. These assessments are sometimes fairly complex and include raw data, tables with several computed columns, decision trees, statistical and financial analyses, and sensitivity analyses, often accompanied by visualizations in the form of charts and plots. They are error-prone both to complete and to grade manually because there may be hundreds of cells and computations to perform and check. To complicate matters further, the computations and analyses build on previous ones, and thus can propagate mistakes. My TAs used to spend countless hours on grading. And because the grading process was so labor-intensive, my students used to get only one chance to submit their solution, with no opportunity to receive feedback and correct their mistakes.
I had previously used Vocareum in another course to successfully automate software testing and verification assessments. My idea was to adapt the auto-grading strategies I had used in that course for coding-based assignments to grade spreadsheet assignments in this new type of course. Vocareum provides the necessary flexibility: you can develop a home-made grading strategy and integrate it with Vocareum, letting the platform take care of assignment creation and setup, submission management, publishing grades, and pushing the grades to our LMS, Canvas.
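To give a rough sense of what checking hundreds of computed cells by script involves, here is a minimal sketch of a cell-by-cell comparison between a submitted workbook and a key workbook. It is not the grader described in this interview: it assumes the openpyxl library and .xlsx files, compares only cached cell values, and leaves out the rubric logic and the hand-off to Vocareum and Canvas.

```python
# Minimal sketch only, not the grader described in this interview:
# walk every non-empty cell of a key workbook and count how many
# cached values the submission reproduces.
from openpyxl import load_workbook

def quick_value_check(submission_path, key_path):
    # data_only=True loads the values Excel last computed, not the formulas
    sub_wb = load_workbook(submission_path, data_only=True)
    key_wb = load_workbook(key_path, data_only=True)
    matched, total = 0, 0
    for sheet in key_wb.sheetnames:
        if sheet not in sub_wb.sheetnames:
            continue                      # student removed or renamed a sheet
        key_ws, sub_ws = key_wb[sheet], sub_wb[sheet]
        for row in key_ws.iter_rows():
            for key_cell in row:
                if key_cell.value is None:
                    continue              # nothing expected in this cell
                total += 1
                if sub_ws[key_cell.coordinate].value == key_cell.value:
                    matched += 1
    return matched, total

matched, total = quick_value_check("submission.xlsx", "key.xlsx")
print(f"{matched} of {total} checked cells match the reference solution")
```

A real grader would at least need a numeric tolerance for floating-point values and some credit model for cells that inherit earlier mistakes, which is part of what the rubric-driven approach described next addresses.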
Can you describe your solution?
The grading technique is cell-based. Students submit their solution as an Excel file. The submission is processed using a key file that contains a reference solution to which the student’s solution is compared. A spreadsheet cell that is supposed to contain a computed or student-specified value is graded using a rubric embedded directly in the corresponding key cell, in the form of an Excel note attached to that cell. The rubrics are specified using a tiny rubric language. For example, we can instruct the spreadsheet grader to grade a cell by simply comparing the submission cell’s value to the corresponding key cell’s value. Or the cell’s grading can be based on comparing the two formulas as arithmetic expressions. The grader is smart enough to recognize whether the submission cell’s formula is mathematically reducible to the key cell’s formula, in which case the student’s solution is considered correct. This is important because a correct solution can be expressed as a formula in infinitely many ways, both syntactically and mathematically. Plain value checks and formula checks are the two basic rubric types, but the grader has several other rubric types for more flexible, hybrid, and complex grading strategies. You can find more information at https://se-edu.org/auto-grading-spreadsheet-assignments/.
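For concreteness, a toy version of that rubric dispatch might look like the sketch below. The details are assumptions, not the interview’s: rubrics are read with openpyxl from the note attached to each key cell, only the two basic rubric tags ("value" and "formula") are handled, and formula equivalence is checked with sympy for plain arithmetic over cell references; Excel functions such as SUM or NPV are out of scope here.

```python
# Toy rubric dispatcher, not the actual spreadsheet grader. Assumptions:
# the rubric is a plain tag ("value" or "formula") stored in the Excel note
# on the key cell, and formulas are plain arithmetic over cell references.
from sympy import simplify, sympify

def read_rubric(key_cell):
    # openpyxl exposes an Excel note via .comment; no note means "not graded"
    return key_cell.comment.text.strip().lower() if key_cell.comment else None

def formulas_equivalent(sub_formula, key_formula):
    # "=2*(B2+B3)" and "=2*B2+2*B3" should both count as correct
    try:
        diff = sympify(sub_formula.lstrip("=")) - sympify(key_formula.lstrip("="))
        return simplify(diff) == 0
    except Exception:
        # Excel functions are beyond this sketch; fall back to a
        # whitespace-insensitive literal comparison
        return sub_formula.replace(" ", "") == key_formula.replace(" ", "")

def grade_cell(coord, rubric, sub_values, sub_formulas, key_values, key_formulas):
    # sub_values/key_values: worksheets loaded with data_only=True (cached
    # results); sub_formulas/key_formulas: the same sheets with formulas kept
    if rubric == "value":
        return sub_values[coord].value == key_values[coord].value
    if rubric == "formula":
        return formulas_equivalent(str(sub_formulas[coord].value),
                                   str(key_formulas[coord].value))
    return False  # the real grader supports further rubric types
```

Value checks need the workbook loaded with cached results (openpyxl’s data_only=True), while formula checks need the default load that preserves formula strings; the hybrid rubric types mentioned above would layer on top of this dispatch.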
When did you start collaborating with Vocareum?
I started collaborating with Vocareum in 2015, during Vocareum’s early days. I suppose I was a beta user, or lead client. I started by experimenting with unit testing assessments for my software testing and verification course. I had been using another autograding tool specifically developed for Java-based coding assignments, but needed more flexibility so that I could create more complex assessments that don’t fit the same cookie-cutter format. My teaching assistants and I first ported my old assessments to Vocareum with relative ease. In later years, I devised grading strategies based on similar ideas for my model checking assessments in the same course, which led to a new auto-grader specifically for model checking. Vocareum was very supportive throughout these initiatives, meeting all my back-office and platform needs, including the installation of one-of-a-kind software and components required by my custom auto-graders.
Can you please describe the impact Vocareum has made on your class?
Vocareum, and transitioning to auto-grading, made a huge impact. My courses have peculiar features: ready-made solutions don’t tend to satisfy my unique needs. Vocareum allows me and my teaching assistants to use our own home-grown autograders by easily integrating them with the platform. That kind of flexibility is super important for me: we need to be able to patch, extend, and improve our custom auto-graders over time in alignment with our evolving needs. In the end, by automating the mechanical bits of the assessments in multiple courses, the teaching team ended up saving a lot of time, time that we could allocate to providing deeper and more meaningful feedback to students. Auto-grading also allowed me to scale my courses up to a greater number of students. Students benefit in particular because they get early and precise feedback in multiple rounds and can gradually fix their mistakes.