Now in the closing few days of its Kickstarter campaign, the Pixy camera board from Carnegie Mellon is an interesting departure from the cameras usually connected to micro-controllers like the Arduino. It isn’t just another camera; it’s a “smart” vision sensor.
Pixy has its own processor and connects to your micro-controller board through one of several interfaces: UART serial, SPI, I2C, or simply a digital or analog pin. Rather than providing raw image data to the micro-controller, it analyses the images on-board and sends more useful, directly actionable data to your micro-controller: for example, that a red ping-pong ball has been detected at x=54, y=103. Even without a micro-controller board attached, Pixy can use its digital and analog outputs to trigger switches or servos, which means you can use it to drive simple robots without any programming.
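To give a feel for what “actionable data” looks like on the wire, here is a toy parser for a stream of detected-object blocks. The framing is loosely modeled on Pixy’s documented object-block format, but the sync word, field order, and checksum rule used here are assumptions for illustration, not a verified specification.

```python
import struct

SYNC = 0xAA55  # assumed frame-sync word marking the start of a block


def parse_block(words):
    """Decode one object block from six 16-bit words:
    checksum, signature, x, y, width, height (field order assumed)."""
    checksum, sig, x, y, w, h = words
    if checksum != (sig + x + y + w + h) & 0xFFFF:
        raise ValueError("bad checksum")
    return {"signature": sig, "x": x, "y": y, "width": w, "height": h}


def parse_stream(data):
    """Scan a little-endian byte stream for sync words and decode blocks."""
    n = len(data) // 2
    words = struct.unpack("<%dH" % n, data[: n * 2])
    blocks = []
    i = 0
    while i < len(words):
        if words[i] == SYNC and i + 6 < len(words):
            blocks.append(parse_block(words[i + 1 : i + 7]))
            i += 7
        else:
            i += 1
    return blocks
```

On an Arduino you would not parse raw bytes yourself; the Pixy library hands you decoded blocks directly. The sketch above just shows how little data a “ball at x=54, y=103” report actually is compared with a full image frame.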
The Pixy uses a hue-based color filtering algorithm to detect objects: it calculates the hue and saturation of each pixel from the image sensor, which makes detection reasonably robust to lighting changes, and then runs a connected components algorithm to group matching pixels and track multiple objects at once.
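The general approach can be sketched in a few lines. This is not Pixy’s actual implementation, just a minimal illustration of the two steps it describes: mask pixels by hue and saturation, then group the surviving pixels into 4-connected regions.

```python
import colorsys


def hue_sat(rgb):
    """Hue (degrees) and saturation of an (r, g, b) pixel, channels 0-255."""
    h, s, _ = colorsys.rgb_to_hsv(rgb[0] / 255, rgb[1] / 255, rgb[2] / 255)
    return h * 360, s


def mask_by_hue(image, hue_lo, hue_hi, min_sat=0.3):
    """True where a pixel's hue falls in [hue_lo, hue_hi] with enough saturation."""
    return [[hue_lo <= hue_sat(px)[0] <= hue_hi and hue_sat(px)[1] >= min_sat
             for px in row] for row in image]


def connected_components(mask):
    """Group 4-connected True pixels; returns a list of pixel-coordinate lists."""
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    regions = []
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not seen[y][x]:
                stack, region = [(y, x)], []
                seen[y][x] = True
                while stack:
                    cy, cx = stack.pop()
                    region.append((cy, cx))
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if 0 <= ny < h and 0 <= nx < w \
                                and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                regions.append(region)
    return regions
```

Filtering on hue rather than raw RGB is what buys the robustness to lighting changes: a red ball in shadow and in sunlight has very different RGB values but roughly the same hue.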
However, the really interesting thing about Pixy is that you can easily teach it what you’re interested in sensing. If you want to track a red ball, you just hold it in front of the sensor and press the button on top of the board. While in learning mode Pixy generates a statistical model of the colors contained in the object in front of the camera, and going forward it uses that model to find similar color signatures (and objects).
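Pixy’s statistical model isn’t published in detail, but the idea of “learning” a color signature from sample pixels can be illustrated with a deliberately simple sketch: record the mean hue of the pixels seen during learning, keep a tolerance band around it, and later match any pixel whose hue falls inside the band. Everything below is a toy assumption, not Pixy’s algorithm.

```python
from statistics import mean, stdev


def learn_signature(hues):
    """Build a toy color signature from sample hues (degrees): a band of
    mean ± 2 standard deviations. Pixy's real model is more sophisticated."""
    m = mean(hues)
    spread = stdev(hues) if len(hues) > 1 else 5.0  # fallback band width
    return (m - 2 * spread, m + 2 * spread)


def matches(signature, hue):
    """True if a pixel's hue falls inside the learned band."""
    lo, hi = signature
    return lo <= hue <= hi
```

Pressing the button and showing Pixy a red ball would correspond to calling `learn_signature` on the hues of the pixels under the ball; every later frame is then searched for pixels that `matches` accepts.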
While other sensors have proliferated, cameras have been left on the shelf by makers: they’re generally too hard to use directly from a micro-controller board, and the data they produce isn’t as directly actionable as the data from other sensors. It’s possible that Pixy will change that and really reduce the friction of using computer vision to build robots.