The AVRcam is a camera module sold by JRobot. It uses an ATmega8 for the controller and an OV6620 image sensor for the camera. It comes as a kit and must be assembled; moderate soldering skills are required.

The AVRcam handles the communication with the OV6620 and processes the raw image data to provide bounding-box data for blobs of color, as defined in a color map. This offloads the image processing from your main microcontroller. All communication with the AVRcam is via a serial connection.

To use the AVRcam with the Axon microcontroller, I have written a module that handles the low-level communication and provides you with an API to simplify coding. This module was written specifically for the Axon, but could be used with minor modification on any ATmega processor.


Connection to the Axon requires 3 steps:

  1. Remove jumper 2 on the AVRcam to allow TTL level serial communication
  2. Connect the AVRcam's TTL serial to the Axon's UART2. Make sure TX on the AVRcam connects to RX on the Axon, and vice versa.
  3. Provide power to the AVRcam. I just connected it to one of the 5V headers on the Axon, but it could be anything from 5-20V. Just make sure that the AVRcam and the Axon have a common ground.


Below is a description of the API. You can download the code at the bottom of this page.

  • avrCamInit() - Must be called at the start of the program to set up the communications with the camera
  • avrCamPing() - Pings the camera and returns 1 if it gets an ACK, 0 if not
  • avrCamVersion() - Returns the version string from the camera or a descriptive error string if something goes wrong.
  • avrCamSetCameraReg(reg,val) - Sets the OV6620 registers. This should be used carefully since it can make the camera unresponsive.
  • avrCamSetColorMap(slot,rL,rH,gL,gH,bL,bH) - Stores color values in a local copy of the color map. The first parameter is the slot # in the color map that you want to change. The remaining parameters are the low and high values for red, green, and blue that define the color range that will be matched by this entry. This function does not change the color map in the camera until the avrCamSendColorMap() function is called.
  • avrCamGetCenterColor(slot,size) - Dumps a frame from the camera, then samples the center of the image to get the color values to store in the defined slot of the local color map. The second parameter is the size of the center square to sample in pixels. This function does not change the color map in the camera until the avrCamSendColorMap() function is called.
  • avrCamSendColorMap() - Sends the local copy of the color map to the camera module.
  • avrCamGetPanDelta(minSize) - Returns the number of degrees of pan required to center the largest blob. If there is no blob larger than the minimum size passed, then it will return 0. The number of degrees returned is not true degrees, but rather a value that works well with a specific pan/tilt unit. The fudge factor can be adjusted with the avrCamPixPerDegX definition. This function uses the last stored data, so you must call avrCamGetTrackLine() first to process the latest tracking data.
  • avrCamGetTiltDelta(minSize) - Same as avrCamGetPanDelta, but returns the number of degrees of tilt required, and the fudge factor can be adjusted with the avrCamPixPerDegY definition.
  • avrCamTrackStart() - Put camera in tracking mode
  • avrCamTrackStop() - Returns camera to idle mode
  • avrCamGetTrackLine() - Processes last tracking line. The data is stored in the avrCamBlob[] and avrCamBlobInfo structures.

Example Code

#define panMaxDeg 90
#define tiltMaxDeg 30

int8_t pan = 0;
int8_t tilt = 0;
uint8_t blobs;

avrCamInit();
avrCamSetColorMap(0,180,255,180,255,180,255); // Set up color map 0 to pick up light areas
avrCamSendColorMap();
avrCamTrackStart();

while (1) {
    blobs = avrCamGetTrackLine();
    if (blobs > 0) {
        // Pan
        pan += avrCamGetPanDelta(500);   // Get delta for Pan
        if (pan > panMaxDeg) {           // Check for max value
            pan = panMaxDeg;
        } else if (pan < -panMaxDeg) {   // Check for min value
            pan = -panMaxDeg;
        }
        //
        // Your code to drive the pan servo goes here...
        //
        aimPan(pan);
        //
        // Tilt
        tilt += avrCamGetTiltDelta(500); // Get delta for Tilt
        if (tilt > tiltMaxDeg) {         // Check for max value
            tilt = tiltMaxDeg;
        } else if (tilt < -tiltMaxDeg) { // Check for min value
            tilt = -tiltMaxDeg;
        }
        //
        // Your code to drive the tilt servo goes here...
        //
        aimTilt(tilt);
        //
    }
}



  • Using the standard baud rate code on the Axon does not work for communicating with the AVRcam. At 115200, it has too much skew error to work. Setting the baud rate with the double clock rate (U2X) mode works properly. I don't know if this is a problem with the Axon, the AVRcam, or just a quirk of how they work together; I just know that you have to do this for it to work. This is done for you in the avrCamInit() function.
  • The AVRcam will sometimes hang or fail to answer a ping on start-up. This seems to mostly happen when you try to communicate with it too soon. I need to look into this further.
  • This module uses TIMER2, which is configured by default on the Axon for /64 from the system clock. I then just use the overflow counter. Since this is an 8-bit counter, the overflow is effectively a further /256. This makes each unit on the overflow counter approximately 1.024 ms on a 16 MHz Axon. You must call timer2Init() somewhere before calling this API. The API uses this timer to provide timeout functions. If you use TIMER2 for anything other than running free at the default rate without resets, then you will need to modify the getNow() and getTimeDiff(...) functions.
  • If you want to connect the AVRcam to a different UART, you can change that in the code with the appropriate defines.
  • The API assumes that your standard output UART is the USB on UART1. If that is not the case, you can change it in the code with a define. You must have this set properly if you have any other output in your code, because rprintf must be redirected to the camera's UART to send commands, then returned to the correct UART for STDOUT.
  • It turns out that the output from the camera gets sent to both serial connections. Jumper 2 on the AVRcam is used to select which connector is used for input. I attached a switch to jumper 2 so I can easily switch back and forth between RS232 and TTL serial, and I connect the camera module to both the Axon and my PC for easy debugging. To see what the camera is doing, power up the AVRcam, switch to RS232, start the Java console, connect, and put it into Passive Tracking mode. Then you can switch back to TTL and start up your Axon code. This will let you see the blobs on the Java console at the same time that your code is processing them.


Download Axon Code