Wii Nunchuk and VE at the same time


    #620
    alecw
    Member

    Hello,

    The HackVision board supports a Nunchuk controller via I2C. Is it at all feasible to try the same thing with the VE board? I know that the VE’s use of interrupts makes I2C tricky. If it’s feasible, am I best using the HackVision’s Controllers library rather than any other Nunchuk library?

    Failing that, I could swap the Nunchuk for a 2-axis joystick potentiometer. If I’m only doing video overlay (no input processing), am I able to read from the analog pins as normal?

    Many thanks,
    alec

    #1679
    Michael
    Keymaster

    You are correct about I2C and video. I re-implemented the I2C lib for Hackvision mostly to cut down on memory consumption (the original Wire library uses big 32 byte buffers). Look at this project http://nootropicdesign.com/projectlab/2011/03/20/tv-blaster/ to see how I only do I2C communication during the vertical blanking interval. Timing your I2C communication during this time slot is the key. If your I2C communication is quick, then you may be able to make it work. It is tricky, though.

    Yes, you can do normal analogRead while using the Video Experimenter. Just don’t use A2, as that is used by the VE board.
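
    Roughly, the pattern looks like this (just a minimal sketch using the Nunchuk calls from the TV Blaster code; the include names, the video mode, and the A0 joystick pin are only placeholders):

    #include <TVout.h>
    #include <fontALL.h>
    #include <nunchuk.h>      // whichever header declares the Nunchuk object

    TVout tv;

    void setup() {
      tv.begin(NTSC, 128, 96);   // or PAL
      tv.select_font(font6x8);
      Nunchuk.init(tv, 4);
      // add the initOverlay() code from the overlay examples if you're overlaying video
    }

    void loop() {
      // Wait for the start of the vertical blanking interval so the I2C
      // transaction doesn't collide with active video lines.
      tv.delay_frame(1);
      Nunchuk.getData();              // keep this transaction short

      // Normal analogRead is fine -- just stay off A2, which the VE uses.
      int pot = analogRead(A0);

      char buf[32];
      sprintf(buf, "%d %d %d", Nunchuk.getJoystickX(), Nunchuk.getJoystickY(), pot);
      tv.print(0, 0, buf);
    }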

    #1680
    alecw
    Member

    Hi Michael,

    Thanks for the reply.

    In terms of “if your I2C communication is quick”, all I want to do is read values from the Nunchuk and send them via pollserial. The idea is to make a wireless joystick controller for a bot that is equipped with a 900MHz wireless camera. The camera receiver is attached to the VE, and I’m going to overlay a heads-up display onto the feed with the VE. Also attached to the controller/VE are a Nunchuk and an XBee on TX/RX. I want to poll the Nunchuk and use pollserial to send the joystick position to the XBee, which will relay it to the bot to act on.

    Aside from the potential I2C timing issues, it sounds feasible to me. I hope that’s right?

    If I get it working I’ll write it all up 🙂

    alec

    #1683
    Michael
    Keymaster

    Yes, I think you can get that to work. The nunchuk timing is the hardest part. Pollserial should work fine with the XBee radios. Good luck!
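
    For what it’s worth, the pollserial side might look roughly like this (a sketch only; 9600 baud is just the XBee default, and the message format and include names are placeholders):

    #include <TVout.h>
    #include <pollserial.h>
    #include <fontALL.h>
    #include <nunchuk.h>      // whichever header declares the Nunchuk object

    TVout tv;
    pollserial pserial;

    void setup() {
      tv.begin(PAL, 128, 96);
      tv.select_font(font6x8);
      Nunchuk.init(tv, 4);
      // pollserial does its serial I/O from TVout's horizontal blanking hook
      tv.set_hbi_hook(pserial.begin(9600));
    }

    void loop() {
      tv.delay_frame(1);      // do the I2C read during the vertical blank
      Nunchuk.getData();

      // Send the joystick position out TX/RX to the XBee, which relays it to the bot.
      pserial.print(Nunchuk.getJoystickX());
      pserial.print(',');
      pserial.println(Nunchuk.getJoystickY());
    }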

    #1681
    alecw
    Member

    Hi Michael,

    Do I have to do anything special to make the TVBlaster sample work on a PAL TV, aside from changing the TV.begin() call? Either the crosshair zooms off to the top right (and won’t come back), or it doesn’t move at all.

    Many thanks,
    alec

    #1684
    alecw
    Member

    OK, this is strange. Here’s what I hope to be some minimal code:

    #include <TVout.h>
    #include <fontALL.h>
    #include <nunchuk.h>  // whichever header declares the Nunchuk object (the original include names were stripped by the forum)

    TVout tv;

    void initOverlay() {
      TCCR1A = 0;
      // Enable timer1. ICES0 is set to 0 for falling edge detection on input capture pin.
      TCCR1B = _BV(CS10);

      // Enable input capture interrupt
      TIMSK1 |= _BV(ICIE1);

      // Enable external interrupt INT0 on pin 2 with falling edge.
      EIMSK = _BV(INT0);
      EICRA = _BV(ISC11);
    }

    ISR(INT0_vect) {
      display.scanLine = 0;
    }

    void setup() {
      tv.begin(PAL, 128, 96);
      Nunchuk.init(tv, 4);
      initOverlay();
      tv.select_font(font6x8);
      tv.fill(0);
      tv.draw_circle(64, 48, 15, 1);
      tv.draw_rect(0, 0, 127, 95, 1);
      tv.set_pixel(64, 48, 1);
    }

    void loop() {
      Nunchuk.getData();

      char joy[32];
      sprintf(joy, "= %d = %d =", Nunchuk.getJoystickX(), Nunchuk.getJoystickY());
      tv.print(0, 0, joy);
    }

    This seems to work fine, and it puts the X and Y coords on the screen. However, if I point the camera somewhere else (or put my hand up to the lens, which causes it to turn on its IR LEDs and show a very bright image of my hand), the VE stops updating the Nunchuk coords. If I remove my hand or move the camera back to where it was pointing originally, it starts reading coordinates again.

    Any idea why this could be?

    Many thanks,
    alec

    #1685
    Michael
    Keymaster

    Alec,

    Basically, using an I2C device with the Video Experimenter is tricky, and sometimes the timing doesn’t work out. Are you using the Hackvision Controllers library for your I2C nunchuk communication? I remember seeing the same phenomenon where the nunchuk would drift up, and I remember fiddling with the order of the code until it became stable. Sorry, it’s just really hard to use an interrupt-based protocol while generating video. The Arduino isn’t too powerful in this regard.

    #1686
    alecw
    Member

    Yes, I’m using the Hackvision Controllers library.

    It all works fine unless I change what the camera is pointing at. Why would what the camera sees affect what I can do with the Nunchuk? Does my code do more than it should? I have no need to capture or process any input, just to overlay onto the feed.

    Many thanks,
    alec

    #1687
    Michael
    Keymaster

    I think you are asking the nunchuk for data way too fast. Slow the whole process down by adding a delay with tv.delay_frame(1) or delay(100). Look at the TVBlaster code and see how it only asks for more nunchuk data on even-numbered trips through loop(). Just slow way down and give the nunchuk a chance to process your request.
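
    For example, replacing the loop() in the sketch you posted with something like this (just a sketch of the pacing, not the exact TV Blaster code):

    byte frameCount = 0;

    void loop() {
      tv.delay_frame(1);      // wait for the vertical blank each time through

      // Only ask the nunchuk for data on every other pass so it gets a
      // full frame to process the previous request.
      if ((frameCount & 1) == 0) {
        Nunchuk.getData();
        char joy[32];
        sprintf(joy, "= %d = %d =", Nunchuk.getJoystickX(), Nunchuk.getJoystickY());
        tv.print(0, 0, joy);
      }
      frameCount++;
    }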

    #1688
    alecw
    Member

    Hi Michael,

    Thanks for the tip – I’ll try to make it do less work.

    I’m still confused about why whether it works or not depends on what the camera sees rather than anything else. Could it be something in my initOverlay() function? I just copied it from one of the other examples, and I don’t really know what it’s doing.

    void initOverlay() {
      TCCR1A = 0;
      // Enable timer1. ICES0 is set to 0 for falling edge detection on input capture pin.
      TCCR1B = _BV(CS10);

      // Enable input capture interrupt
      TIMSK1 |= _BV(ICIE1);

      // Enable external interrupt INT0 on pin 2 with falling edge.
      EIMSK = _BV(INT0);
      EICRA = _BV(ISC11);
    }

    ISR(INT0_vect) {
      display.scanLine = 0;
    }

    Do I need to do all of this? I’m not interested in capturing anything, so do I need to do things like “Enable input capture interrupt”?

    Many thanks,
    alec

    #1689
    Michael
    Keymaster

    Yes, you need all that code to enable overlay capability. I don’t know why your program behaves differently depending on the camera signal. That should not affect the timing.
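
    For reference, here’s roughly what each line does (a commented sketch, so treat the hardware details as approximate):

    void initOverlay() {
      // Put timer1 in normal mode.
      TCCR1A = 0;
      // Run timer1 at the full clock rate (no prescaler). Leaving ICES1 at 0
      // selects falling-edge detection on the input capture pin, which is fed
      // by the LM1881 sync separator's sync output.
      TCCR1B = _BV(CS10);

      // Fire the input capture interrupt on each sync pulse so the enhanced
      // TVout can keep the overlay lined up with the incoming video.
      TIMSK1 |= _BV(ICIE1);

      // Enable external interrupt INT0 on pin 2, which carries the LM1881's
      // vertical sync output; the sense bits are as in the library example.
      EIMSK = _BV(INT0);
      EICRA = _BV(ISC11);
    }

    // At the start of each incoming frame, reset TVout's line counter so the
    // overlay stays locked to the external video.
    ISR(INT0_vect) {
      display.scanLine = 0;
    }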

    #1690
    alecw
    Member

    It’s really quite odd. The Nunchuk also stops responding if I flick the output select switch to Sync Only.

    #1698
    alecw
    Member

    Hi Michael,

    Should the TVBlaster sample work when the VE’s output select switch is set to Sync Only? It works fine when overlaying a video stream, but the minute I flick the switch to Sync Only, the crosshair stops moving.

    Do you know why that might be?

    Many thanks,
    alec

    #1700
    alecw
    Member

    OK, I’ve turned R4 all the way to the right, and now the Nunchuk works regardless of what’s on the screen. It also works when Output Select is set to Sync Only.

    Does this make any sense at all?

    Many thanks,
    alec

    #1701
    Michael
    Keymaster

    Hmm. That’s odd. The purpose of R4 is to “tune” the precise 680K resistance needed in the circuit for the LM1881 chip. Turning it all the way up adds 100K, for a total of 780K.

    The LM1881 is a very simple, old, and cheap chip, and it’s sensitive to noise. Maybe the nunchuk is making it act funny, but since you could “tune” away the problem, that’s great.
