Semi-automated assessment of SQL schemas via database unit testing
A key skill for students learning relational database concepts is the ability to design and implement a database schema in SQL. This skill is often tested in an assignment where students derive a schema from a natural language specification. Grading such assignments can be complex and time-consuming, and novice database students often lack the skills to evaluate whether their implementation accurately reflects the specified requirements. In this paper we describe a novel semi-automated system for grading student-created SQL schemas, based on a unit testing model. The system verifies whether a schema conforms to a machine-readable specification and runs in two modes: a staff mode for grading, and a reduced-functionality student mode that enables students to check that their schema meets the specified minimum requirements. Analysis of student performance over the period the system was in use shows evidence of improved grades as a result of students using the system.
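The abstract does not detail the authors' implementation, but the core idea it describes, checking a student schema against a machine-readable specification in the style of a unit test, can be sketched as follows. This is a minimal illustration only, assuming a hypothetical dictionary-based specification format (`spec`) and using SQLite's `PRAGMA table_info` to introspect the schema; the paper's actual system, specification language, and database platform may differ.

```python
import sqlite3

def check_schema(conn, spec):
    """Unit-test-style check: does each required table exist with the
    required columns? Returns a list of human-readable failures
    (empty list means the schema meets the minimum requirements)."""
    failures = []
    for table, required_cols in spec.items():
        # PRAGMA table_info returns one row per column; empty means no such table.
        cols = {row[1] for row in conn.execute(f"PRAGMA table_info({table})")}
        if not cols:
            failures.append(f"missing table: {table}")
            continue
        for col in required_cols:
            if col not in cols:
                failures.append(f"{table}: missing column {col}")
    return failures

# A student schema that is partially correct (illustrative only).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE student (id INTEGER PRIMARY KEY, name TEXT)")

# A hypothetical machine-readable specification of minimum requirements.
spec = {"student": ["id", "name", "email"], "course": ["code"]}

print(check_schema(conn, spec))
# → ['student: missing column email', 'missing table: course']
```

In this sketch the same check function could serve both modes described in the abstract: the student mode would report only the pass/fail list above, while a staff grading mode could additionally map each failure to a mark deduction.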
Editor: Yang, Jie Chi; Chang, Maiga; Wong, Lung-Hsiang; Rodrigo, Ma. Mercedes T.
Publisher: Asia-Pacific Society for Computers in Education (APSCE)
Conference: International Conference on Computers in Education (ICCE), Manila, Philippines
Rights Statement: Copyright 2018 Asia-Pacific Society for Computers in Education. All rights reserved. No part of this book may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, without the prior permission of the Asia-Pacific Society for Computers in Education. Individual papers may be uploaded to institutional repositories or other academic sites for self-archival purposes.
Keywords: automated assessment; SQL; database schema; data definition language (DDL); student performance; unit testing
Research Type: Conference or Workshop Item (Paper published in proceedings)