[Python-cuba] [RFC] Community-driven platform for running workshops to teach Python programming

Olemis Lang olemis at gmail.com
Mon Jul 11 15:51:33 EDT 2016


On 5/23/16, kirby urner <kirby.urner at gmail.com> wrote:
> On Sun, May 22, 2016 at 8:02 PM, Olemis Lang <olemis at gmail.com> wrote:
>
[...]
>> a decentralized open source and collaborative platform to run
>> workshops to teach Python programming . Do you know of anything
>> similar ?
>>
[...]
>
> I've been polling edu-sig (@python.org) with similar questions, having had
> three+ years with an in-house asynchronous so-called Nano, which allowed
> Python students to submit programs and quiz answers for evaluation to
> human evaluators (called mentors).
>

In this case a notable difference is that evaluation is automated.
I'm not sure whether that matters when it comes to your thought
below.

> When every quiz submission is also an opportunity to ask specific questions
> requiring customized sometimes researched answers, you're at one end of
> the spectrum.  Everything robo-graded where the machine evaluates in
> mere milliseconds, serving millions, is at the other end of said spectrum.
>

So this is what I was talking about. What I'm looking for is closer
to the fully automated grading scenario, but with the possibility of
being complemented by in-person support/mentoring.

> Providing one-on-one coaching through mentor services is not practical in a
> real time tutorial or workshop setting.  Even with a one-to-one ratio (not
> common), a difficult bug will not yield to scrutiny immediately, and some
> concepts (such as 'yield' in Python) take time to digest.

+1

[...]
>
> The robo-grading parlor
> is useful to students who also have access to live humans, but
> over-reliance
> on robo-grading is not going to benefit students beyond a certain level.
>

... but I guess it may be deployed to the masses as a first line of
support.
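To make the robo-grading idea concrete, here is a minimal sketch of
what fully automated evaluation of a submission could look like. All
names here (`grade`, the test-case format) are hypothetical and not
taken from any existing platform:

```python
# Minimal sketch of automated ("robo") grading: execute a student's
# submitted source, then check a set of (expression, expected) pairs
# against the names it defines. Purely illustrative.

def grade(submission_source, test_cases):
    """Run submission_source and evaluate each test case against it.

    Returns a (passed, total) tuple.
    """
    namespace = {}
    exec(submission_source, namespace)   # student code defines names here
    passed = 0
    for expr, expected in test_cases:
        try:
            if eval(expr, namespace) == expected:
                passed += 1
        except Exception:
            pass                          # a crashing case counts as failed
    return passed, len(test_cases)

# Example: a submission implementing factorial
submission = """
def factorial(n):
    return 1 if n <= 1 else n * factorial(n - 1)
"""
cases = [("factorial(0)", 1), ("factorial(5)", 120)]
print(grade(submission, cases))   # -> (2, 2)
```

A real deployment would of course sandbox the student code instead of
calling `exec` directly, but the feedback loop is the same: submit,
evaluate in milliseconds, report.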

> What Nano did for us is organize student submissions in queues, time
> stamped, but returnable in any order.  Our rule was three business days
> max for any turnaround, with the average quite a bit less, sometimes
> just hours or even minutes.  Sorting by project meant evaluating many
> solutions to the same problem and that could be a more productive order
> for a mentor, ditto quizzes.  Students could ask for clarification and more
> background, and expect considered answers -- just not in real time.
>
> I've also done real time workshops over the web, and that's a different
> animal.

Yeah, I'd consider that a bit off-topic here, indeed.

[...]
>
> A potential pitfall is either/or thinking and imagining some contest
> between
> all these modes where in fact such a contest does not exist. They're all
> with us to stay.  That being said, finding the right mix or sweet spot is
> in
> itself a skill and I think we're still in the initial experimental phase of
> finding
> out how client-server technology is going to change our learning
> strategies.
>

I might or might not agree, but I'm definitely looking for one such
(imperfect) instance resembling NodeSchool workshoppers.

[...]
> I'd be interested to find out about
> more mentor-centric solutions especially (the robo-grader systems already
> seem to be doing pretty well).
>

I know of robo-graders, but I have found nothing similar to
NodeSchool workshoppers [1]_ implemented in Python.

.. [1] http://nodeschool.io/building-workshops.html
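For readers unfamiliar with the NodeSchool model: a workshopper is a
command-line program that presents a menu of exercises and verifies
the learner's solution automatically. A rough sketch of that verify
step in Python might look like the following (the exercise format and
all names are my own assumptions, not part of any actual workshopper):

```python
# Hedged sketch of the core of a NodeSchool-style "workshopper":
# each exercise carries a problem statement plus an expected output,
# and verification runs the learner's script and compares its stdout.
import subprocess
import sys

EXERCISES = [
    {
        "name": "HELLO WORLD",
        "statement": "Write a program that prints 'hello world'.",
        "expected_output": "hello world\n",
    },
]

def verify(exercise, solution_path):
    """Run the learner's script and compare stdout with the expected output."""
    result = subprocess.run(
        [sys.executable, solution_path],
        capture_output=True, text=True, timeout=10,
    )
    return result.stdout == exercise["expected_output"]
```

A full workshopper would wrap this in a menu loop (`list`, `select`,
`verify` subcommands) and track which exercises have been completed,
which is essentially what the NodeSchool tooling does for JavaScript.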

-- 
Regards,

Olemis - @olemislc

Apache™ Bloodhound contributor
http://issues.apache.org/bloodhound
http://blood-hound.net

Brython committer
http://brython.info
http://github.com/brython-dev/brython

SciPy Latin America - Cuban Ambassador
Chairman of SciPy LA 2017

Blog ES: http://simelo-es.blogspot.com/
Blog EN: http://simelo-en.blogspot.com/

Featured article:

