Project MoNET will attempt to update weekly with goings-on. Here's what we're up to this week:


Our landing page is up with some basic information and a spiffy custom domain name. (We really wanted our first-choice name, but it was unfortunately taken.)

Our website is a Node.js app using the Express framework and a MongoDB database. It's all hosted via Red Hat's Platform-as-a-Service offering, OpenShift.

We are also hosting the blog that you are reading via OpenShift. It uses a Node.js blogging platform called Ghost, which is pretty sweet.

Express and a PaaS like OpenShift make it crazy easy for us newbie developers to quickly spin up a working app to serve a single webpage ... Overkill? Nooooo, it's crazy powerful and easy, that's why ...

The Colorwall

We have our first early technology demonstrator up already in the form of a crowd-sourced virtual painting generator that we call The Colorwall. The Colorwall allows users to take a picture (either from a cell-phone camera or uploaded file), extract the colors from the picture, and add them to the painting.

Want to add your mark to The Colorwall? Mosey on over and start sampling. This color sampling via cell-phone will be a core input stream for the Project MoNET robot.

Technical stuff: All image analysis and color extraction takes place on the client side (in your phone/computer/tablet), which means we never actually handle your pictures (which is awesome for both bandwidth and privacy reasons). Once a color is extracted, it is submitted to the server via real-time bidirectional messaging.
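We haven't published the actual extraction code, but the core idea can be sketched as a pure function over raw RGBA pixel data, the kind the browser hands you from a canvas via `getImageData`. The function name and the simple averaging approach here are illustrative, not necessarily our exact algorithm:

```javascript
// Compute the average color of an image from its raw RGBA pixel data.
// `data` is a flat array [r, g, b, a, r, g, b, a, ...], as returned by
// canvas.getContext('2d').getImageData(...).data in the browser.
function averageColor(data) {
  let r = 0, g = 0, b = 0;
  const pixels = data.length / 4;
  for (let i = 0; i < data.length; i += 4) {
    r += data[i];
    g += data[i + 1];
    b += data[i + 2];
  }
  return {
    r: Math.round(r / pixels),
    g: Math.round(g / pixels),
    b: Math.round(b / pixels),
  };
}
```

Because this runs entirely in the browser, only a handful of color numbers ever leave your device; the picture itself never does.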

The server (Node.js/Express) handles storing your color samples, assigning each one a position on the painting, and letting anyone else currently watching The Colorwall know that a new sample has been added.
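We won't get into the server internals here, but the position-assignment step can be sketched simply. One straightforward scheme (an assumption for illustration; we're not committing to exactly this) is to fill the wall left-to-right, then wrap to the next row, based on the order samples arrive:

```javascript
// Assign the nth color sample a cell on a wall `cols` columns wide.
// Filling proceeds left-to-right, wrapping to a new row when a row fills.
function assignPosition(sampleIndex, cols) {
  return {
    x: sampleIndex % cols,            // column within the current row
    y: Math.floor(sampleIndex / cols), // which row we're on
  };
}
```

Each stored sample then carries its color plus an (x, y) cell, which is what gets pushed out to everyone watching the wall.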

This is a work in progress. Expect to see more soon.


Most of the work on the hardware front has been centered on color sampling. Originally we had planned to use an esoteric color sensor rather than a camera. However, since the cell-phone-as-a-color-sensor approach is working so well, there's a strong chance we'll go that route instead.

But experimenting is fun:

Here's the color sensor experiment. It's an ESP8266 WiFi microcontroller (NodeMCU) pulling samples off of a TCS34725 color sensor. It actually works, but only at short ranges. The camera ended up being a more attractive option.

Aside: these ESP8266 boards are Arduino-compatible (so even we can program them) and crazy cheap. Need a remote sensing platform you can afford to shoot out of a cannon? Look no further. They may be benched for now, but they will be back powering the robot.

We've been talking up the cell-phone camera and file upload, but we always wanted a stand-alone method for taking portraits ... possibly using this Raspberry Pi and an el-cheapo webcam? Maybe we'll have a photo booth set up ...

Until next time ...