Tuesday, April 12, 2011



It has been a long time since I posted any of my creations. That's because for the past few days I have been working on possibly my most entertaining project to date: an NXT cyclops that can track faces! Before I get started, here are some pics.

Front View, base and cam

Side view, cam holder in detail

Gearing to turn the camera

I see you!

So to make this creation, I used the OpenCV wrapper for Java (JavaCV), leJOS, and Java SE. The creation consists of two motors. One motor drives a set of gears that tilts the camera up or down. The other drives a set of gears that rotates a geared base clockwise or counter-clockwise, which in turn rotates the camera. The camera itself is wired to the computer, which is the only wired connection in the whole project; you could use a Bluetooth webcam and do away with wires entirely. The geared base, the rotating motor, and the NXT brick all sit on a pretty sturdy base, which rests on four wheels. The whole project is about 20x15x20 cm. So the camera has two degrees of freedom: pan and tilt.

So what does this project achieve, and how does it do it? For a long time I had wanted to build a robot that could track objects in real time, and this robot does just that. The camera captures video of the environment and sends it to the computer for processing. A Java/JavaCV program running on the computer processes the video and draws a rectangle around each detected face (though the program currently handles only one face).
       Now a method in the program computes the distances from the edges of the box around the face to the edges of the video frame. If the distance from the left edge of the box to the left edge of the frame differs by more than 80 px from the distance between the right edges, the JavaCV program sends a command to the NXT brick to turn left or right, so as to bring the face back to the center of the frame. The same logic applies along the y axis, where the distances from the top and bottom edges of the box to the top and bottom edges of the frame are compared; if they differ by more than 60 px, the program tells the NXT to tilt the camera up or down as appropriate.
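The centering logic above boils down to two small comparisons. Here is a minimal sketch of it in plain Java; the class, method, and enum names are my own illustration (the real code is linked below), and only the 80 px and 60 px thresholds come from the description:

```java
// Hypothetical sketch of the face-centering decision logic.
// Only the 80 px / 60 px thresholds come from the post; all
// names here are illustrative, not the original code.
public class FaceCentering {

    public enum Pan  { LEFT, RIGHT, NONE }
    public enum Tilt { UP, DOWN, NONE }

    // faceX: left edge of the detected face box; faceW: its width.
    public static Pan panCommand(int faceX, int faceW, int frameW) {
        int leftGap  = faceX;                     // box to left edge of frame
        int rightGap = frameW - (faceX + faceW);  // box to right edge of frame
        if (Math.abs(leftGap - rightGap) <= 80) return Pan.NONE;
        // Bigger gap on the left means the face sits to the right,
        // so the camera should turn right (and vice versa).
        return leftGap > rightGap ? Pan.RIGHT : Pan.LEFT;
    }

    public static Tilt tiltCommand(int faceY, int faceH, int frameH) {
        int topGap    = faceY;                    // box to top edge of frame
        int bottomGap = frameH - (faceY + faceH); // box to bottom edge of frame
        if (Math.abs(topGap - bottomGap) <= 60) return Tilt.NONE;
        return topGap > bottomGap ? Tilt.DOWN : Tilt.UP;
    }
}
```

For example, a face box at x = 10 in a 640 px wide frame has a much bigger gap on the right, so the camera would be told to pan left.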
       So that is pretty much how the program works. I have posted links to the code below, for both the NXT and the computer. It is annotated and licensed under the MIT License. Feel free to go over the code and edit it; hopefully it is educational. If you have problems building the robot itself, send me a message and I can help you out. The pics above can guide you as well. Oh, and a huge thanks to Nashruddin, whose blog taught me how to program face detection in OpenCV! And finally, here is a video showing the robot in action.
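To give an idea of how the two programs talk to each other: one simple approach is a single-byte command protocol, where the PC writes a byte over the leJOS Bluetooth/USB stream and the brick reads it back and drives the motors. The sketch below shows such an encoding; the byte values and names are my own assumptions, and the linked code may use a different scheme:

```java
// Hypothetical single-byte command protocol between PC and NXT brick.
// On the PC side the byte would be written to a DataOutputStream over
// the leJOS connection; on the NXT side it would be read from the
// matching DataInputStream and dispatched to the motors.
public class Commands {
    public static final byte STOP = 0, LEFT = 1, RIGHT = 2, UP = 3, DOWN = 4;

    // PC side: pick the byte to send for a given tracking decision.
    public static byte encode(String action) {
        switch (action) {
            case "left":  return LEFT;
            case "right": return RIGHT;
            case "up":    return UP;
            case "down":  return DOWN;
            default:      return STOP;
        }
    }

    // NXT side: map a received byte back to an action name
    // (the real program would call the leJOS Motor API here instead).
    public static String decode(byte b) {
        switch (b) {
            case LEFT:  return "left";
            case RIGHT: return "right";
            case UP:    return "up";
            case DOWN:  return "down";
            default:    return "stop";
        }
    }
}
```

Keeping commands to one byte makes the Bluetooth round trip cheap, which matters when you are sending a correction on nearly every frame.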

Code on the NXT
Code on the computer
Classifier to detect faces; it must be in the folder from which comp.class is run.

This is not the end...


  1. Oh good, I'm trying to do the same as you, but recognizing hands to manipulate my software; JavaCV is a little hard to handle.

  2. Really interested in this NXT project.
    Can you upload the building instructions for it?
    Also, which ports on the NXT brick should the motors, sensors, etc. be plugged into?

  3. But would you mind explaining how, technically, JavaCV controls the robot? Does the Mindstorms need any specific program installed on it?

  4. Good, nice. I made the same thing, but with the EV3, the new one from 2013.

  5. Wow, it is such a cool project! I am making a webcam observation system for my hamster with my daughters, so this could be really helpful to me. I have a Mindstorms NXT and a Logitech webcam, and I am designing a tilt-and-turn (up/down, left/right) webcam system.
    But I am a real beginner. If you could give me some advice or share your work with me, I would really appreciate it. kimyongim@naver.com