Archive for the ‘Video’ Category

Using a Video Experimenter as MIDI Controller

This simple project shows how a video signal captured by a Video Experimenter Shield can be used to send MIDI messages to a synthesizer. A small camera module is connected to the input of the Video Experimenter (the red wire in the picture is for powering the camera from the Arduino VIN pin). The video output goes to a small TV. A MIDI shield sits atop the Video Experimenter to allow the Arduino to send MIDI messages to a Synthino XM synthesizer. By the way, the Synthino XM is our new synthesizer product and you can read all about it on synthino.com. It’s awesome.

Video Experimenter as MIDI Controller


The Arduino sketch uses the Video Experimenter’s frame capture ability to capture simple monochrome low-res frames. The “on” pixels are counted to determine the pitch of the note that should be sent to the synthesizer. The code then just sends a MIDI note-on message to play a note on the synth. The brighter the image (the more “on” pixels), the higher the pitch of the note. In the video you can see me adjusting the Video Experimenter threshold knob to alter the brightness and also shining a flashlight on the camera.
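Here is a minimal sketch of the idea, just to make the flow concrete (it is not the downloadable code). The capture()/resume() calls and the 128×96 resolution are assumptions based on the enhanced TVout library’s frame-capture examples, and the MIDI shield is assumed to be on the hardware serial port:

    #include <TVout.h>

    TVout tv;
    byte lastNote = 0;

    void setup() {
      tv.begin(_NTSC, 128, 96);  // low-res monochrome frame buffer
      Serial.begin(31250);       // standard MIDI baud rate
    }

    void loop() {
      tv.capture();              // freeze a frame from the camera (enhanced TVout)
      long onPixels = 0;
      for (int y = 0; y < tv.vres(); y++) {
        for (int x = 0; x < tv.hres(); x++) {
          if (tv.get_pixel(x, y)) {
            onPixels++;          // count the "on" (bright) pixels
          }
        }
      }
      tv.resume();               // resume normal video output

      // Brighter image (more "on" pixels) means a higher pitched note
      byte note = map(onPixels, 0, (long)tv.hres() * tv.vres(), 36, 96);

      if (lastNote > 0) {
        Serial.write(0x80); Serial.write(lastNote); Serial.write((byte)0);  // note-off
      }
      Serial.write(0x90); Serial.write(note); Serial.write(100);            // note-on
      lastNote = note;

      delay(200);
    }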

You can download the code; just make sure you also install the enhanced TVout library from the Video Experimenter product page.


Published by Michael, on September 5th, 2015 at 1:24 pm. Filed under: Arduino,Audio,Video. | No Comments |

Displaying Android Phone Video on an RGB LED Matrix

Difficulty Level = 10 [What’s this?]

I bought this awesome RGB LED matrix panel from Adafruit and really wanted to see if I could make it display video from an Android phone. It was somewhat difficult, but by using my Android phone, the OpenCV computer vision library for Android, a Sparkfun IOIO board, and an Arduino, I got it working.



All of the hardware and software setup details are below, but before I explain how it works, let’s see it in action:

How It Works

This is not a beginner project, so if you don’t have experience doing any Android development, you’ll need to be patient. Just getting your Eclipse development environment set up for Android development with the OpenCV and IOIO libraries took me a couple of hours, and I’ve been using Eclipse for about 10 years.

An Android app running on the phone captures video frames and processes them down to a lower resolution suitable for the 16×32 LED matrix. OpenCV is a powerful computer vision/image processing library, and there’s a version that runs on Android. I used the OpenCV library to convert the video frames to 16×32 pixel resolution to match the LED matrix. I also constrained the color space of the frames to 12-bit color. That is, each pixel has 4 bits each for red, green, and blue, so each channel has 16 brightness levels, yielding 4096 possible colors. Importantly, all of the image processing is performed on the phone because it’s much more powerful than the Arduino.

The 16×32 12-bit image uses 1024 bytes of memory on the phone (2 bytes per pixel). The Android app then uses the IOIO library to write this data out of one of the IOIO board’s serial ports. Each frame starts with a two-byte frame marker 0xF0 0x00, then the bytes for the pixel values are written. The performance bottleneck is between the phone and the IOIO board: I can only write about 4 frames per second, even though the serial interface between the IOIO and Arduino runs at 115200 baud. Since each pixel really only needs 1.5 bytes instead of 2, I could pack the pixel data tighter to gain perhaps one more frame per second, but I didn’t think it was worth the trouble.
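To make the frame format concrete, here is a rough sketch of the downscale-and-pack step. The real app is Java on Android using OpenCV for Android and the IOIO Uart API; this plain C++ version only illustrates the same operations. The bit positions used for the packed 12-bit color are an assumption; they have to match whatever the Arduino sketch expects when it hands the 16-bit value to drawPixel().

    #include <opencv2/opencv.hpp>
    #include <cstdint>
    #include <vector>

    // Build one serial frame: 2-byte marker followed by 16x32 pixels, 2 bytes each.
    std::vector<uint8_t> buildFrame(const cv::Mat& cameraFrame) {
      cv::Mat scaled;
      cv::resize(cameraFrame, scaled, cv::Size(32, 16));  // downscale to panel resolution

      std::vector<uint8_t> out;
      out.push_back(0xF0);   // frame marker, first byte
      out.push_back(0x00);   // frame marker, second byte

      for (int y = 0; y < scaled.rows; y++) {
        for (int x = 0; x < scaled.cols; x++) {
          cv::Vec3b bgr = scaled.at<cv::Vec3b>(y, x);     // OpenCV stores pixels as BGR
          uint8_t r = bgr[2] >> 4;                        // quantize each channel to 4 bits
          uint8_t g = bgr[1] >> 4;
          uint8_t b = bgr[0] >> 4;
          // Assumed packing: 4-bit channels in 5/6/5-style positions of a 16-bit word
          uint16_t color = (r << 12) | (g << 7) | (b << 1);
          out.push_back(color >> 8);                      // high byte first on the wire
          out.push_back(color & 0xFF);
        }
      }
      return out;   // 2 + 1024 bytes, ready to write to the IOIO UART at 115200 baud
    }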

The green wire in the picture below is a serial connection from the IOIO to the Arduino. The Arduino code simply reads the pixel values, using the frame marker to know when a new frame begins. The pixel values are written to the LED matrix panel using the Adafruit library for controlling the panel. Driving this matrix is no small feat for the Arduino, since the matrix panel does not do any PWM on its own; the Arduino needs to generate the PWM. This matrix driver software could have been written for the IOIO to control the matrix directly without an Arduino, but Adafruit has really tuned this library for high performance and very precise timing, so I thought I’d better stay with the Arduino code for now. The result is video at about 4 frames per second. Not very fast, but the color rendition is pretty good.

Hardware Setup




The RGB matrix panel is wired to the Arduino just as Adafruit’s instructions describe. They have an excellent page that describes how the panel works and how to use it.

The RGB matrix and the Arduino are powered by a 5V regulated power supply that can provide 2A (also from Adafruit). The IOIO board is powered independently by a 9V supply that can provide 1A. It’s important to provide plenty of current to the IOIO board so that the phone can charge; however, you can adjust a potentiometer on the IOIO to reduce the charging current. As with any project with multiple power supplies, all the grounds must be connected. A single green wire provides the serial data feed from the IOIO to the Arduino RX pin.

I used a diffuser in front of the display to make it look much better. Without a diffuser, the LEDs are simply blinding and it’s not easy to see any image. My diffuser is a piece of acrylic with paper vellum on it. The diffuser is held about 5mm in front of the LED panel (with a little roll of duct tape as a spacer).




The phone (a Samsung Nexus S) is connected to the IOIO via USB. I mounted it above the panel by holding it very gently with a Panavise.



Software Setup

Android + IOIO + OpenCV Software
The hardest part of the software setup is preparing your development environment for Android, IOIO, and OpenCV development. The details of how to do this are beyond the scope of this article, but all of the steps are documented in various places.

  1. Set up your Android development environment: this is documented on the Android SDK website. After you have performed this step, you will be able to write simple Android programs and run them on your phone.
  2. Install the IOIO library: see this great Sparkfun tutorial which describes how to run Android apps that communicate with a connected IOIO board. After you have performed this step, you will be able to upload the HelloIOIO app to your phone and have it communicate with your IOIO board. Don’t move on to the next step until you are sure you have the IOIO working with your phone.
  3. Install the OpenCV library for Android by following these instructions. After successfully doing this, you should be able to run the OpenCV Android examples on your Android phone. Don’t proceed until you have this working successfully.
  4. Now that you have all the supporting libraries in place, download the RGBMatrixDriver Android application project, import it into your Eclipse workspace, and open it. With luck, it will compile cleanly. If not, make sure that the project is correctly pointing to the IOIO and OpenCV libraries as in the image below.
  5. You may need to customize the code a bit. I used IOIO pin 7 to send serial data to the Arduino, so you may need to change the pins specified in the call to openUart in RGBMatrixActivity.java. You may also need to change some screen dimensions specified in RGBMatrixView.java to work with your phone — just follow the comments.

Once you have the application running on your phone, this is what it looks like in action. The video image is displayed with the same resolution and colors as the RGB matrix.

Arduino Software
Now that the hard part is done, it’s easy to get the Arduino software installed.

  1. First, you’ll need Adafruit’s library for driving the panel. This project uses a slightly older version which you can find here. Install it in your Arduino sketchbook libraries folder just like any other library.
  2. Then download the RGBMatrixSerial sketch and install it in your Arduino sketchbook. Compile it and upload it onto your Arduino. The sketch is so simple, I’ll show the whole thing here:
    #include "RGBmatrixPanel.h"
    
    #define A   A0
    #define B   A1
    #define C   A2
    #define CLK 8
    #define LAT A3
    #define OE  9
    #define WIDTH 32
    
    int count = 0;          // byte counter within the current frame
    byte currentByte = 0;   // most recent byte read from serial
    byte lastByte = 0;      // previous byte (for frame marker detection and pixel pairing)
    uint16_t color;
    RGBmatrixPanel matrix(A, B, C, CLK, LAT, OE, true);  // 'true' enables double-buffering
    
    void setup()
    {
      Serial.begin(115200);
      matrix.begin();
    }
    
    void loop() {
      int index;
      while (Serial.available()) {
    
        lastByte = currentByte;
        currentByte = Serial.read();
    
        // Look for the frame marker 0xF000
        if ((lastByte == 0xF0) && (currentByte == 0x00)) {
          count = 0;
          matrix.swapBuffers(false);
        } else {
          if ((count % 2) == 1) {
            // Every second byte completes a 16-bit pixel value (high byte first)
            color = (lastByte << 8) | currentByte;
            index = (count-1)/2;
            matrix.drawPixel(index % WIDTH, index / WIDTH, color);
          }
          count++;
        }
      }
    }
    

Future Ideas

  • Increase the framerate a bit by packing 2 pixels into 3 bytes of transmitted data (only 1.5 bytes per pixel are really needed), which requires different frame marker detection; see the sketch after this list.
  • Use the 32x32 matrix panel from Adafruit.
  • Try a Bluetooth connection between the phone and the IOIO board (requires upgrading the IOIO firmware).
  • Get an Arduino Mega ADK and use it to interface with the Android phone instead of the IOIO. The framerate should be higher.
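For the first idea, the packing itself is simple once each pixel has been reduced to a contiguous 12-bit value; the harder part is choosing a frame marker that can never appear inside the packed data. A hypothetical sketch of packing two pixels into three bytes:

    // Pack two 12-bit (4:4:4) pixel values into 3 bytes instead of 4.
    // pixelA and pixelB each hold 12 significant bits (0x000-0xFFF).
    void packPair(uint16_t pixelA, uint16_t pixelB, uint8_t out[3]) {
      out[0] = pixelA >> 4;                              // top 8 bits of A
      out[1] = ((pixelA & 0x0F) << 4) | (pixelB >> 8);   // low 4 bits of A, top 4 bits of B
      out[2] = pixelB & 0xFF;                            // low 8 bits of B
    }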

Published by Michael, on January 22nd, 2012 at 6:09 am. Filed under: Android,Arduino,IOIO,Level 10,Video. | 22 Comments |

Visualizing TV Dialog Using Closed Caption Data

Difficulty Level = 8 [What’s this?]




One of the coolest things you can do with the nootropic design Video Experimenter shield for Arduino is decode the closed caption data embedded in NTSC (North American) television broadcasts. I figured out how to do this and documented it in another project, so if you want to understand all the details of how to capture and decode closed captions, refer to that project. With this project, I take it a step further and show how the spoken dialog embedded in a television show can be visualized on a computer in a “cloud” of words. This is the same type of cloud (often called a “tag cloud”) that you see on blogs, where the frequency of a particular word is reflected in the size of the word: the more frequent the word, the larger it appears.

Hardware Setup

First, here’s how the hardware is set up. It’s really simple. The Video Experimenter needs a composite video feed from a TV tuner like a DVR (e.g. Tivo) or VCR. You can also use a DVD player because DVDs usually have closed captioning data. The USB cable connects to your computer where you run a Processing sketch (program) to visualize the words as they are decoded by the Arduino. The Processing sketch dynamically builds the TV cloud as the words are extracted from the closed caption stream!

Hardware setup


 

Demo Video

Here’s a video where I show a TV cloud of spoken dialog being created dynamically. I superimposed a video of the television broadcast so you can correlate the broadcast with the Processing application, but note that the Processing application doesn’t actually display the video. Words spoken with higher frequency are larger.

Example TV Clouds

I’ve always noticed that whenever I happen to see a US national news broadcast, all the commercials are for drugs. I guess only old people watch the news on TV anymore. Here’s a TV cloud of the commercials shown during NBC Nightly News. Can you guess which drugs are being advertised? Can you guess which maladies they claim to cure? Look at all those nasty side effects!

TV cloud made from drug commercials. Click to enlarge.


 

Here is a TV cloud made while watching a baseball game. For US readers familiar with baseball, can you guess which teams were playing? Answer is at the end of this post.

TV cloud built from part of a baseball game broadcast. Click to enlarge.


 

The Software

The Arduino sketch is fairly simple, and for details on how it works, please see the in-depth article about decoding closed captions.
Download the Arduino sketch

The Processing sketch reads words from the serial line and filters out any word shorter than 3 letters as well as some very common words like “the”, “and”, “for”, etc. This application relies on the very nice OpenCloud Java library, so you’ll need to download that and use it in your Processing environment. Create this structure in your Processing sketchbook libraries directory: opencloud/library/opencloud.jar
Download the Processing sketch

Answer

Answer to the baseball broadcast question: Kansas City Royals vs. Minnesota Twins (go Twins!)


Published by Michael, on July 18th, 2011 at 7:17 pm. Filed under: Arduino,Level 8,Processing,Video. | 13 Comments |

Video Experimenter on the Seeeduino Mega

The nootropic design Video Experimenter shield uses some pretty advanced features of the Arduino’s ATmega328 microcontroller. One downside of this is that you can’t use the Video Experimenter shield on an Arduino Mega. Why? Well, the designers of the Arduino Mega didn’t connect many of the ATmega1280/ATmega2560 pins to headers on the board, so you can’t use them. And, as it turns out, the pins with the key features needed by the Video Experimenter aren’t connected to anything!

To perform video overlay, the Video Experimenter relies on an input capture pin (to capture the exact time at which the pin changes state). Even though the ATmega1280/ATmega2560 has 4 input capture pins, none of them are connected!

Important pins not even connected!




And to capture video images in the Arduino’s memory, Video Experimenter uses the analog comparator in the chip. But the AIN0 pin for the analog comparator is not connected! What were the Arduino Mega designers thinking?

Fortunately, there is the Seeeduino Mega. The guys at Seeed Studio broke out nearly all the pins on the ATmega1280 so that you can use them. I love the Seeeduino Mega because it provides so many pins on a rather small board.

Simply make 5 connections with jumper wires and you can use the Video Experimenter on the Seeeduino Mega. No code changes necessary!

By connecting 5 jumper wires, you can use the Seeeduino Mega


 
Here are the connections to make:
11 to 9 (white wire in picture above)
7 to 29 on the Seeeduino Mega (yellow wire)
INPUT pin on the Video Experimenter to PE2 on the Seeeduino Mega (green wire)
SYNCOUT pin on the Video Experimenter to PD4 on the Seeeduino Mega (gray wire)
VSYNC pin on the Video Experimenter to 21 on the Seeeduino Mega (brown wire)

Now you can use the Video Experimenter with an Arduino that has a more powerful processor. It really helps to have 8K of SRAM instead of 2K: you can do text and graphics overlay at higher resolutions, like 192×128 (see the sketch below). Have fun!
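For example, with the extra SRAM you can initialize the enhanced TVout library at the higher resolution directly. This is a minimal sketch, assuming the library accepts the larger frame buffer in begin(); the exact usable resolution depends on what else your sketch keeps in memory.

    #include <TVout.h>
    #include <fontALL.h>

    TVout tv;

    void setup() {
      // A 192x128 1-bit frame buffer needs 3072 bytes: too big for the
      // ATmega328's 2K of SRAM, but comfortable in the ATmega1280's 8K.
      tv.begin(_NTSC, 192, 128);
      tv.select_font(font6x8);
      tv.print(0, 0, "192x128 overlay");
    }

    void loop() {
    }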


Published by Michael, on July 13th, 2011 at 2:02 pm. Filed under: Arduino,Level 1,Video. | 14 Comments |