automatically grading small programming assignments

Hello,

I have a couple of classes where I teach introductory programming using Python. What I would love to have is for the students to go through a lot of very small programs, to learn the basic programming structures: things like returning the maximum in a list, making lists with certain patterns, very simple string parsing, etc. Unfortunately, it takes a lot of time to grade such things by hand, so I would like to automate it as much as possible.

I envision a number of possible solutions. In one solution, I provide a function template with a docstring, and they have to fill it in to pass a doctest. Is there a good (and safe) way to do that online? Something like having a student post code, and the doctest results are returned. I'd love to allow them to submit until they get it, logging each attempt.

Or perhaps there is a better way to do this sort of thing. How do others who teach Python handle this?

thanks,

Brian Blais
-- -----------------
bblais@bryant.edu http://web.bryant.edu/~bblais
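The template-plus-doctest workflow Brian describes can be sketched in a few lines; the `maximum` assignment and its examples below are illustrative, not from the thread. The instructor ships the docstring, the student fills in the body, and grading is just running that one function's doctests:

```python
import doctest

# Hypothetical assignment: the instructor writes the docstring with its
# doctest examples; the student fills in the body until every example passes.
def maximum(items):
    """Return the largest element of a non-empty list.

    >>> maximum([3, 1, 4, 1, 5])
    5
    >>> maximum([-2, -7])
    -2
    """
    best = items[0]          # a reference solution, shown for illustration
    for x in items[1:]:
        if x > best:
            best = x
    return best

# Grade by running only this function's doctests (rather than the whole
# module's), so unrelated docstrings in the file don't affect the score.
runner = doctest.DocTestRunner()
for test in doctest.DocTestFinder().find(maximum):
    runner.run(test)
print("failed", runner.failures, "of", runner.tries)
```

Using `DocTestFinder`/`DocTestRunner` instead of `doctest.testmod()` keeps the check scoped to the submitted function.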

Hello Brian, I do not teach (much to my regret) but I have been thinking about what you describe. See below.

On 12/14/06, Brian Blais <bblais@bryant.edu> wrote:
Hello,
I have a couple of classes where I teach introductory programming using Python. What I would love to have is for the students to go through a lot of very small programs, to learn the basic programming structure. Things like, return the maximum in a list, making lists with certain patterns, very simple string parsing, etc. Unfortunately, it takes a lot of time to grade such things by hand, so I would like to automate it as much as possible.
I envision a number of possible solutions. In one solution, I provide a function template with a docstring, and they have to fill it in to pass a doctest. Is there a good (and safe) way to do that online? Something like having a student post code, and the doctest results are returned. I'd love to allow them to submit until they get it, logging each attempt.
I may have a partial solution. I (co-)wrote a program called Crunchy (crunchy.sf.net) which, among other features, allows automated correction of code that has to satisfy a given docstring. As it stands, it only allows self-evaluation, i.e. there's no login required, nor is the solution forwarded to someone else. (This has been requested by others and may, eventually, be incorporated into Crunchy.) So, as a tool for learning, it's working; the grading component is simply not there. However, the code is open source and you could adapt it to your needs ;-)

André

Or perhaps there is a better way to do this sort of thing. How do others who teach Python handle this?
thanks,
Brian Blais
-- -----------------
bblais@bryant.edu http://web.bryant.edu/~bblais
_______________________________________________
Edu-sig mailing list
Edu-sig@python.org
http://mail.python.org/mailman/listinfo/edu-sig
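The submit-until-pass loop with per-attempt logging that Brian asks for could look roughly like this. All names here (`grade_attempt`, the sample `double` assignment) are hypothetical, and note that `exec()` on untrusted student code is NOT safe without sandboxing, which is exactly the safety concern raised in this thread:

```python
import doctest
from datetime import datetime

def grade_attempt(source, func_name, tests, log):
    """Run instructor-supplied doctest examples against one student submission.
    Appends a (timestamp, outcome) record to `log`; returns True on a full pass.
    Sketch only: exec() of untrusted code needs sandboxing in real use."""
    namespace = {}
    try:
        exec(source, namespace)
        func = namespace[func_name]
    except Exception as exc:
        log.append((datetime.now(), "error: " + repr(exc)))
        return False
    # Build a DocTest from the instructor's example text, bound to the
    # student's function, and run it.
    test = doctest.DocTestParser().get_doctest(
        tests, {func_name: func}, func_name, None, 0)
    runner = doctest.DocTestRunner()
    runner.run(test)
    passed = runner.failures == 0
    log.append((datetime.now(),
                f"{runner.tries - runner.failures}/{runner.tries} passed"))
    return passed

attempts = []
ok = grade_attempt("def double(x): return x + x", "double",
                   ">>> double(3)\n6\n", attempts)
print(ok, attempts[-1][1])  # → True 1/1 passed
```

Because every call appends to `log`, the full attempt history per student falls out for free.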

Brian Blais wrote:
I envision a number of possible solutions. In one solution, I provide a function template with a docstring, and they have to fill it in to pass a doctest. Is there a good (and safe) way to do that online? Something like having a student post code, and the doctest results are returned. I'd love to allow them to submit until they get it, logging each attempt.
Crunchy Frog (now called Crunchy, as I understand). I just researched and presented on it this past weekend and was impressed with its abilities, including that of the instructor providing a doctest and the student working to write code that lets it pass. Very cool.

There is not, however, currently any logging of progress, counts of attempts to get it right, etc. But I would imagine that would not be hard to add to the Crunchy backend. It is just an HTTP proxy with some template expansion to get a Python interpreter inside the browser window.

Safety -- ehh. Each Python interpreter is running inside that HTTP proxy with full access to the underlying system, as whatever user it is running as. The design is to have each student run it locally, so they can only trash their own system. However, I could imagine you could set it up to run on a private classroom server, where the attempt records would be kept, and still be safe.

http://crunchy.sourceforge.net/index.html

For another solution, I wonder whether you could make use of the new Abstract Syntax Tree (AST) support in Python 2.5, where you convert the source of an attempt into an abstract data structure, anonymize the method/variable/class names, and compare the tree against a correct solution. It would let you quickly handle those students who solved it in a conformist way, and then you'd need to manually review the rest for creatively solving it another way. ;-)

But I think Crunchy is the most classroom-friendly way to quickly solve this. A weekend's work at most.

-Jeff
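Jeff's AST-comparison idea can be sketched with the standard `ast` module (the modern API; the Python 2.5 interface he had in mind differed). The `Anonymize` transformer and the two sample solutions below are illustrative: names are replaced in order of first appearance, so two solutions that differ only in identifier choice produce identical dumps.

```python
import ast

class Anonymize(ast.NodeTransformer):
    """Rename every identifier to v0, v1, ... in order of first appearance,
    so structurally identical solutions compare equal regardless of names."""
    def __init__(self):
        self.names = {}
    def _canon(self, name):
        return self.names.setdefault(name, f"v{len(self.names)}")
    def visit_Name(self, node):
        node.id = self._canon(node.id)
        return node
    def visit_FunctionDef(self, node):
        node.name = self._canon(node.name)
        self.generic_visit(node)
        return node
    def visit_arg(self, node):
        node.arg = self._canon(node.arg)
        return node

def canonical(source):
    """Parse, anonymize, and dump a source string for comparison."""
    return ast.dump(Anonymize().visit(ast.parse(source)))

# Two "conformist" solutions differing only in naming:
a = "def f(xs):\n    m = xs[0]\n    for x in xs:\n        if x > m: m = x\n    return m\n"
b = "def biggest(nums):\n    top = nums[0]\n    for n in nums:\n        if n > top: top = n\n    return top\n"
print(canonical(a) == canonical(b))  # → True (same structure, different names)
```

A creative solution such as `def g(xs): return max(xs)` would produce a different tree and fall through to manual review, exactly as Jeff anticipates.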
participants (3)
- Andre Roberge
- Brian Blais
- Jeff Rush