Yes We Can — My first Proof of Concept

I’m currently working on the user interface for an open source robot project. At a work session last night, someone asked if it was possible to add a live video stream to the interface. I said it absolutely was possible. The next question: “How hard is it?”

Let me check…

I found some code that relied only on basic HTML and a little JavaScript. It was empowering to be able to read the code and feel confident it wasn’t malicious in any way. I created a new file on my computer and opened the HTML document directly in the browser. It didn’t work. I remembered that when using JavaScript in the past I had used various task runners. Task runners accomplish many things, including compiling and minifying files, watching for changes, and reloading the browser. I quickly set up the files to use Gulp and ran it again.
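I didn’t keep the exact snippet I found, but the core of this kind of page is the browser’s getUserMedia API, which pipes the webcam into a `<video>` element. A minimal sketch of what such a page looks like (the element id and error handling here are my reconstruction, not the original code):

```html
<!DOCTYPE html>
<html>
<head>
  <meta charset="utf-8">
  <title>Webcam test</title>
</head>
<body>
  <video id="preview" autoplay playsinline></video>
  <script>
    // Ask the browser for camera access and show the live stream.
    navigator.mediaDevices.getUserMedia({ video: true })
      .then(function (stream) {
        document.getElementById('preview').srcObject = stream;
      })
      .catch(function (err) {
        console.error('Could not open the webcam:', err);
      });
  </script>
</body>
</html>
```

This may also explain why opening the file directly failed for me: browsers restrict camera access to secure contexts (https or localhost), so a page opened from the filesystem can be refused while the same page served from a local dev server works.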
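For anyone curious what that Gulp setup amounts to, a minimal gulpfile for this workflow might look like the following. This is a sketch of the general idea, not my actual config, and the browser-sync plugin is my assumption for the serve-and-reload piece:

```javascript
// gulpfile.js — hypothetical sketch; plugin choice (browser-sync) is assumed.
const gulp = require('gulp');
const browserSync = require('browser-sync').create();

// Serve the project directory on localhost and reload on HTML changes.
function serve() {
  browserSync.init({ server: { baseDir: './' } });
  gulp.watch('*.html').on('change', browserSync.reload);
}

exports.default = serve;
```

Running `gulp` then serves the page from localhost instead of the filesystem, which is what the webcam API needs.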

My beautiful face stared back at me from the screen.

It’s my first proof of concept!

There are many steps between the final goal and what I could achieve in a half hour on my Mac. My Mac already has a webcam and my preferred dev environment installed. Getting this working on the robot will be quite a different task, and I anticipate several learning curves. One I already know about: my text editor of choice (Sublime) won’t run on a Raspberry Pi, but I’m excited by the challenge. Additionally, I expect I’ll have to connect the webcam to the Pi and configure it from the command line, but that is speculation.