[Python-ideas] Does jargon make learning more difficult?

Chris Angelico rosuav at gmail.com
Tue Aug 21 16:42:04 EDT 2018


On Wed, Aug 22, 2018 at 6:33 AM, Abe Dillon <abedillon at gmail.com> wrote:
> [Chris Angelico]
>>
>> I use decks of cards primarily for non-game
>> usage (for instance, teaching data structures and algorithms - cards
>> laid out on a table can represent a tree, heap, array, etc)
>
>
> I do too. They're a great tool for visualizing and physically trying out
> different techniques.
>
> [Chris Angelico]
>>
>> A deck containing four suits of thirteen
>> cards plus one joker would have 53 cards, which is a prime number;
>> printing 54 cards lets you lay them out as 9 by 6 on a sheet, so it's
>> easy to add a second joker. Some decks (I have an Alice in Wonderland
>> themed deck) have *four* jokers.
>> As such, the most logical way to do this would be as an attribute of
>> the card.
>
>
> In most cases where I've seen wild cards used, it's a declaration about a
> certain card (e.g. "Eights are wild." or "The 2 of clubs is wild.").
> I've found that when you're trying to model a
> game like poker or Monopoly, it's tempting to add complexity to simple
> objects, but it can lead to problems later on. A card doesn't know if it's
> wild. That's a function of the game being played. An ace may be high or low.

Well, okay. Then you should check if the card is a joker, rather than
checking if it's wild. That means your sort function is "by suit, and
within that by rank, but jokers are at the end" (ignoring wildness).
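That sort function might look like this (a sketch only -- the Card model and
the suit/rank orderings are my own assumptions, not anything specified in the
thread):

```python
from collections import namedtuple

Card = namedtuple("Card", ["rank", "suit"])  # suit is None for jokers

SUITS = ["clubs", "diamonds", "hearts", "spades"]
RANKS = ["A", "2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K"]

def sort_key(card):
    # Tuples compare left to right, and False < True, so jokers
    # (first element True) always sort after every ordinary card.
    if card.rank == "Joker":
        return (True, 0, 0)
    return (False, SUITS.index(card.suit), RANKS.index(card.rank))

hand = [Card("Joker", None), Card("K", "clubs"), Card("A", "spades")]
print(sorted(hand, key=sort_key))
# Jokers land last; everything else sorts by suit, then rank.
```

Wildness never enters into it: the key only asks whether the card is a joker.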

That said, though... I've seen and used a Canasta deck whose cards have
game-specific information printed on them. Physical cards. So there's
precedent for that!

ChrisA
