{{indexmenu_n>1}}
[<6>]
====== Card sorting machine ======
===== Android: Netrunner =====
My eldest son and I occasionally play a game called [[https://en.wikipedia.org/wiki/Netrunner|Android Netrunner]] together. It's a 1v1 competitive card game where each player assembles a deck of about 50 cards from a pool of a few hundred. It went out of print a few years ago but then a community group took over designing and making new cards. It's similar to [[https://en.wikipedia.org/wiki/Magic:_The_Gathering|Magic: The Gathering]] in some ways, but without the scummy expensive blind card pack collectathon. You buy a Netrunner set and you get all the cards.
The game has two main factions, Runner and Corporate. When building a deck you choose cards from your faction. There are rules that define what cards a valid deck can contain, and different deck compositions have different strengths and weaknesses. My eldest plays the Runner faction, and I play Corporate.
A problem arises when you always play against the same person: each player is trapped in a constant cycle of adapting to their opponent's deck, exploiting its weaknesses and mitigating their own. This tends towards a see-saw pattern of wins and losses, and it involves a lot of deckbuilding, which neither of us particularly enjoys.
===== Solution (?) =====
I think it would be fun to build a robot that can do deckbuilding for you. The entire card pool is well-documented with online tools like [[https://www.jinteki.net/|Jinteki.net]], and the rules for deck validity are easy to codify. I think it might be fun to have a computer formulate a random-but-valid deck, and to build a robot to physically pluck those cards out of the pool so you can be ready to play.
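To give a taste of how codifiable the rules are, here's a minimal sketch of a random-but-valid deck generator. The numbers (45-card minimum, 3 copies per card) and the omission of influence and faction restrictions are simplifications for illustration, not the full rules:

```python
import random

# Illustrative constraints -- simplified, not the complete rules.
MIN_DECK_SIZE = 45   # minimum cards in a deck
MAX_COPIES = 3       # maximum copies of any one card

def random_valid_deck(card_pool, size=MIN_DECK_SIZE):
    """Draw a random deck from card_pool, respecting the copy limit."""
    # Expand the pool so each card appears at most MAX_COPIES times,
    # then sample without replacement from the expanded pool.
    expanded = [name for name in card_pool for _ in range(MAX_COPIES)]
    if size < MIN_DECK_SIZE or size > len(expanded):
        raise ValueError("requested size can't form a valid deck")
    return random.sample(expanded, size)
```

A real version would also filter the pool by faction and track influence spent, but those are just more lookups against the card database.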
===== Motive force =====
I've used stepper motors in a few projects, most recently my [[projects:sand_drawing:overview|sand drawing robot]]. A stepper motor can turn its shaft continuously like a regular motor, but can also move it in tightly-controlled "steps". 200 steps per revolution (1.8° per step) is typical, and "microstepping" can divide each step into sub-steps. 16 or 32 microsteps-per-step are common, giving angular resolution of roughly 0.11° or 0.06°. You can make a normal motor turn just by applying the appropriate voltage; a stepper motor instead requires carefully sequenced pulses.
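As a quick sanity check of the resolution arithmetic, assuming the typical 200 full steps per revolution:

```python
# Degrees of shaft rotation per (micro)step at a few common settings.
FULL_STEPS_PER_REV = 200  # i.e. 1.8 degrees per full step

for microsteps in (1, 16, 32):
    deg = 360 / (FULL_STEPS_PER_REV * microsteps)
    print(f"{microsteps:>2}x microstepping: {deg:.4f} deg/step")
```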
I've been looking for an excuse to play with closed-loop stepper motors. These have the stepper control circuitry built-in, and a sensor to measure the angle of the shaft. You provide the motor a DC voltage and step/direction signals to make them turn. They also have a serial UART interface, which means you can send them commands from a microcontroller rather than counting out hundreds of step pulses.
Matthias Wandel has an excellent video on them:
{{ youtube>OPgbm81q8Uk?large }}
{{:projects:card_sorting_machine:pasted:20250521-004502.png|A photo of a stepper motor with various colourful wires attached. A small OLED screen on the back of the motor reads "-34.8k° 0.04err <-99Kclk". Some trees are visible through a window in the background.}}
I settled on a [[https://github.com/makerbase-mks/MKS-SERVO42C|MKS Servo42C]] because they are cheap, readily-available and have a UART interface.
==== Diversion into interfacing ====
I was shocked to learn there was no batteries-included Python library for controlling these steppers, so I [[https://pypi.org/project/pyservo42c|wrote one]] and published it on PyPI. Now you can control your stepper with code like this:
<code python>
from time import sleep
from servo42cUart import Servo42CUart

s = Servo42CUart("/dev/ttyUSB0", 9600)

# Start turning clockwise at full speed
s.set_constant_speed(Servo42CUart.Direction.CLOCKWISE, 127)
# Wait a second
sleep(1)
# Stop turning
s.stop()
# Turn 360 degrees, assuming 16x microstepping, as fast as possible
s.set_angle(Servo42CUart.Direction.CLOCKWISE, 127, 200 * 16)
</code>
===== Eyes =====
The robot will need to identify a card before it can be sorted. I plan to do this with a webcam and [[https://github.com/tesseract-ocr/tesseract|tesseract-ocr]], a very powerful open-source text recognition engine published by Google. Some early testing has yielded promising results. I set up an HD webcam and a goose-neck worklight against a white background. The webcam is connected to an old Raspberry Pi Model 3 running Debian 10.
Other [[https://www.youtube.com/watch?v=giR1BRpc2Z0|similar projects]] have used more complex machine learning smarts to identify the whole card, graphics and all, rather than just looking at the text. I'm going to see how far the text-only approach takes me, so I can keep all the brains on a single old RPI.
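The capture-and-read step might look something like the sketch below. It assumes the third-party pytesseract and Pillow packages are installed and that a frame has already been saved to disk; the filename is just a placeholder. The `clean_text()` helper is pure stdlib and only tidies tesseract's output:

```python
def clean_text(raw):
    """Drop empty lines and stray whitespace from OCR output."""
    return [line.strip() for line in raw.splitlines() if line.strip()]

def read_card(image_path):
    """OCR one webcam frame and return its text lines."""
    # Third-party imports, assumed installed: pytesseract wraps the
    # tesseract binary, Pillow loads the image.
    import pytesseract
    from PIL import Image

    frame = Image.open(image_path).convert("L")  # grayscale helps OCR
    return clean_text(pytesseract.image_to_string(frame))
```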
{{:projects:card_sorting_machine:pasted:20250521-005743.jpeg|A photo of a netrunner card standing upright against a white background with carpet beneath it. A light is shining at the card from an oblique angle, and a webcam is sitting on a piece of black plastic pointing at the card. The webcam is about 150mm from the card. Other junk is visible in the background.}}
A cobbled-together test rig.
{{:projects:card_sorting_machine:pasted:20250521-005950.jpeg|A frame from the webcam showing the NOISE netrunner identity card. }}
The view through the webcam.
tesseract-ocr read:
<code>
NOISE
IDENTITY: G-mod
Whenever you install a virus program,
the Corp trashes the top card of
R&D.
“Watch this. It'll be funny,”
</code>
Which is pretty impressive for such a bodge-fest.
{{:projects:card_sorting_machine:pasted:20250521-010210.jpeg|A frame from the webcam showing the INFILTRATION netrunner event card}}
The view through the webcam.
Result:
<code>
INFILTRATION
Gain 29 or expose | card
Bring back any memories, Monica?’
John “Animal” McEvoy
0 2012 oats le Coast LLG, @2012FFG
</code>
===== Next steps =====
Next up I have some mechanical design work to do. The machine must:
  * Support a stack of cards,
  * Provide outlets through which cards can be fed,
  * Transmit the motor's torque into card-sliding action,
  * Support the lighting,
  * Support the camera.
Should be fun! I'm anticipating some challenges:
* Handling sleeved and unsleeved cards,
* Glare from the lighting,
* Camera focus/exposure issues,
* Ensuring single-card feeding at different stack heights,
* Keeping the camera view unobstructed by mechanical gubbins,
* Fuzzy-matching card text against the card database.
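For that last challenge, the stdlib may already be enough. A minimal sketch of fuzzy-matching an OCR'd title line against known card names using `difflib` (the card names and cutoff here are just illustrative):

```python
import difflib

def best_match(ocr_line, card_names, cutoff=0.6):
    """Return (name, score) for the card name closest to an OCR'd title,
    or (None, score) if nothing clears the cutoff."""
    name, score = max(
        ((n, difflib.SequenceMatcher(None, ocr_line.lower(), n.lower()).ratio())
         for n in card_names),
        key=lambda pair: pair[1],
    )
    return (name, score) if score >= cutoff else (None, score)
```

For example, `best_match("INF1LTRATION", ["Infiltration", "Noise"])` should recover "Infiltration" despite the OCR substituting a "1" for an "i".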
[<6>]