QuickCam Assembly Language Code
This article was reviewed by: Steven Kaehler, Neil Sablatzky, and Dafydd Walters. My thanks to you all!
I have successfully interfaced a PC black & white parallel-port version of the QuickCam to a 68HC11 microcontroller. The QuickCam was made by Connectix; new QuickCams are offered by Logitech. The QuickCam used here is of 1997 vintage. When I first started playing with this setup, the parallel-port QuickCam was the current offering; it is now obsolete, and the current models (such as the VC version) use a USB connection. I have seen these older BW parallel-port versions selling on eBay and other places. If you find a good source for them, please let me know and I will post it here.
Specifically, I have interfaced the BW parallel-port QuickCam with a BOTBoard2 designed by Marvin Green. I have modified the BOTBoard2 (BB2) to have an 8K EEPROM as well as a 32K static RAM, which gives us plenty of program space (the EEPROM) and lots of RAM for the image. The assembly language program included with this article takes about 1130 bytes of memory; it could be tightened up a lot depending on the specifics you want. If you use an 'E2 device in the BOTBoard2, the internal 2K of EEPROM would be plenty of program space.
Figure 1. Modified BOTBoard2 (top) with the PC-parallel-port board underneath. The DB-25 is the interface to the QuickCam and the 5-pin DIN connector is the power for the QuickCam. Notice the stacked memory on the BOTBoard2; a 32K RAM on the bottom and an 8K EEPROM on the top.
The first step in interfacing the HC11 with the QuickCam is to build the PC parallel-port capability for the HC11. I have focused on using this HC11 parallel-port circuit for the QuickCam interface, but you could use it for other PC-type parallel-port interfacing as well. The table below describes the pin-outs of the PC parallel port. There are also lots of web sites detailing the PC parallel port if you look around.
Figure 2. Schematic diagram of the PC-parallel-port circuit for the 68HC11.
Figure 3. Underside of the BOTBoard2 showing the male 2x8 connector.
The PC parallel-port interface to the BB2 uses the 2x8 pin header on the BB2 (S1/S2), which provides a very nice expansion capability for memory-mapping devices at $4000, $5000 and $6000. I used a 2x8 male connector on the BB2 with the pins accessible on the bottom side of the BB2. I built up the parallel-port board (PPB) using a Radio Shack pre-drilled circuit board cut to the same size as the BB2, with holes that line up for stand-off connection. The 2x8 female connector on the PPB is mounted on the top so the two boards can be sandwiched together via the connector. The two connectors for the QuickCam are mounted on the PPB: a DB-25 female parallel-port connector and a female 5-pin DIN keyboard connector (the same connector used for MIDI).
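To make the memory mapping concrete, here is a minimal sketch of how HC11 code might talk to the PPB through that expansion space. The real definitions are in the code download; the specific address assignments and latch directions below are my assumptions for illustration only.

* Hypothetical equates -- the 2x8 header decodes devices at $4000, $5000
* and $6000; which address is the output latch and which is the input
* buffer depends on your wiring (these assignments are assumptions).
PP_DATA  equ   $4000          ; write: latch driving DB0-DB7 (Cmd0-Cmd7)
PP_STAT  equ   $5000          ; read:  buffer on the nybble/status inputs
PP_CTRL  equ   $6000          ; write: control latch (Reset_N, PCAck)

* Drive a command byte onto the parallel-port data lines:
         ldaa  #$07           ; example value
         staa  PP_DATA
* Read the four nybble/status input lines:
         ldaa  PP_STAT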
The chips used on the PPB are all 74HCxxx devices, which I placed in sockets. Be sure to use a power capacitor (e.g. 10uF to 100uF) on the board, and also use decoupling capacitors (0.1uF) for about every two 74HCxxx devices. My schematic above specifies a 10uF power cap, while the board I built uses a 100uF power cap (it must have been the first cap that jumped out of my drawer). Either will work just fine.
Figure 4. PC-parallel-port board layout.
I should point out that I have an extra 74HC373 on my board above in Figure 4. I used this, along with the jumper just under my thumb, to experiment with inverting a line or two. The schematic (Figure 2) reflects my final design and does not include this extra chip.
The CCD is very sensitive to IR light. The camera comes with an IR filter. If you remove this filter you can get very good IR images.
The QuickCam uses a CCD which is 336 x 243 pixels. The left 12 columns of this are not exposed to light, so they are always black. This gives us an active pixel area of 324 x 243. These black pixels (12x243 = 2,916 total) can be used as a calibration reference.
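As one illustration of using those always-black columns, here is a small sketch (not taken from the article's code) that averages the 12 black pixels at the start of a row to get a black-level reference. It assumes the row has already been unpacked into RAM as one 4-bit pixel value per byte, with X pointing at the first (black) pixel.

* Sketch: average the 12 always-black pixels at the start of a row.
Black_Ref
         clrb               ; B = running sum (12 * 15 = 180 max, fits in 8 bits)
         ldaa  #12          ; loop counter: 12 calibration pixels
BR_Loop  addb  0,X          ; add this black pixel's 4-bit value
         inx
         deca
         bne   BR_Loop
         clra               ; D = 16-bit sum (A:B)
         ldx   #12
         idiv               ; X = sum / 12 = black-level reference, D = remainder
         rts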
The camera can take images which are either 4 bits per pixel or 6 bits per pixel (16 or 64 levels of gray, respectively). When downloading an image from the camera, you can specify a sub-region to download, and you can also specify whether you want every pixel (1:1 mode), every other pixel in both X and Y (2:1 mode), or every 4th pixel (4:1 mode). In 2:1 mode the image will have 1/4 the number of pixels, and in 4:1 mode 1/16 the number of pixels, of the 1:1 image.
These options for image downloading give you a lot of flexibility to specify the type of image which you want to process. Let's consider some of these options and how they work with our 68HC11 and 32 K of RAM. I'm just running raw calculations here (and dropping fractions), so treat these figures as estimates to give you a feel for the sizes involved.
Image Size | Transfer Mode | Bits per Pixel | Total Pixels | Packed Bytes |
---|---|---|---|---|
324 x 243 | 1:1 | 6 | 78,732 | 59,049 |
324 x 243 | 1:1 | 4 | 78,732 | 39,366 |
324 x 243 | 2:1 | 6 | 19,683 | 14,762 |
324 x 243 | 2:1 | 4 | 19,683 | 9,841 |
324 x 243 | 4:1 | 6 | 4,920 | 3,690 |
324 x 243 | 4:1 | 4 | 4,920 | 2,460 |
317 x 13 | 4:1 | 4 | 320 | 160 |
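For example, the first row of the table works out as 324 x 243 = 78,732 pixels, and at 6 bits per pixel that is 78,732 x 6 / 8 = 59,049 packed bytes. The 2:1 and 4:1 rows divide the pixel count by 4 and by 16 before the same byte calculation, dropping the fractions.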
Notice that in the 1:1 transfer mode, our little BOTBoard2 cannot hold all of the pixels in our 32K RAM, even in 4-bits-per-pixel mode with two of these 4-bit pixels packed into a single byte of RAM.
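Here is a minimal sketch (not the article's code) of that packing step. It assumes the first pixel is in accumulator A and the second in B, each already masked to four bits, with X pointing at the next free byte of the image buffer; which nybble holds which pixel is my own choice for illustration.

* Pack two 4-bit pixels into one byte of RAM.
Pack2    asla               ; shift the first pixel into the high nybble
         asla
         asla
         asla
         andb  #$0F         ; keep only the low nybble of the second pixel
         aba                ; A = (pixel1 << 4) | pixel2 (nybbles don't overlap)
         staa  0,X          ; store the packed byte
         inx                ; advance the buffer pointer
         rts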
The way I use the camera is to get a sub-image region, possibly in 2:1 or 4:1 mode, to give me a workable image to process. The other consideration to balance is that the more pixels per image you read in, the more processing time it takes to examine the image, and thus the fewer frames per second you can process. I would rather process smaller and coarser images and be able to process them quickly. For example, the last entry in the list is a 317 x 13 image at 4:1, which gives us an 80 x 4 pixel rectangle which would be great for line-following. Perhaps an 80 x 1 image would be even better, and would only be 80 pixels.
The QuickCam interface is defined in a very full 45-page document, and to get access to the interface specification I signed a non-disclosure agreement, so I cannot share it. However, as a developer I can share working code and enough information for you to be able to use my software. I will provide a rather abbreviated version of the interface, specifications and protocol, but with my working code you should be able to get a working system up and running and start experimenting.
The QuickCam defines its own, rather non-standard, use of the PC parallel port. It also defines two different interface modes: a Nybble Mode and a Byte Mode. The Nybble Mode is always used to communicate with the QuickCam. To download images you can use either the Nybble Mode or the Byte Mode; the advantage of the Byte Mode for downloads is that you are downloading 12 bits at a time in parallel as opposed to 4 bits at a time in Nybble Mode.
DB-25 Pin Number | PC Name | Nybble Name | Nybble Direction (Output is a PC output) | Byte Name | Byte Direction (Output is a PC output) |
---|---|---|---|---|---|
1 | Strobe | -na- | - | -na- | - |
2 | DB0 | Cmd0 | Output | CamRdy2 | Input |
3 | DB1 | Cmd1 | Output | Data0 | Input |
4 | DB2 | Cmd2 | Output | Data1 | Input |
5 | DB3 | Cmd3 | Output | Data2 | Input |
6 | DB4 | Cmd4 | Output | Data3 | Input |
7 | DB5 | Cmd5 | Output | Data4 | Input |
8 | DB6 | Cmd6 | Output | Data5 | Input |
9 | DB7 | Cmd7 | Output | Data6 | Input |
10 | ACKIN | Nybble2 | Input | Data10/Nybble2 | Input |
11 | BUSY | Nybble3 | Input | Data11/Nybble3 | Input |
12 | Paper End | Nybble1 | Input | Data9/Nybble1 | Input |
13 | SELECT | Nybble0 | Input | Data8/Nybble0 | Input |
14 | Autofeed | -na- | - | -na- | - |
15 | Error | CamRdy1 | Input | Data7 | Input |
16 | Initialize | /Reset_N | Output | Reset_N | Output |
17 | SELECTIN | PCAck | Output | PCAck | Output |
18-25 | Ground | Ground | - | Ground | - |
Commands are sent to the QuickCam through the PC parallel port. This has to be via the
Nybble Mode. The sequence to send a command is as follows:
Note that the PCAck line is inverted in hardware, so these levels are from the software's point of view.
There are two commands (GetOffset and SendVersion) which GET parameters from the QuickCam. Here is the sequence for these two commands:
Here is a list of the QuickCam commands. Most of them take an input parameter byte.
I have set up an HC11 subroutine "Send_Command" to handle the low-level handshaking to the QuickCam. Two parameters are passed into Send_Command: the command number (register A) and the command parameter (register B).
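A call to it looks like the sketch below. Command number 7 is the Get Video Frame command discussed later; the parameter value shown is purely illustrative, not an actual setting from my code.

* Usage sketch for Send_Command (parameter value is a placeholder).
         ldaa  #7             ; A = command number (Get Video Frame)
         ldab  #$60           ; B = command parameter (illustrative value only)
         jsr   Send_Command   ; handles the low-level handshaking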
If the Get Video Frame command specifies Byte mode (12-bit bytes) for the transfer, the HC11 must execute a "Port Turn-Around Handshake". This changes the data lines DB0-DB7 from outputs to inputs.
After this turnaround, the CamRdy2 line is used as the handshaking signal.
After the video data has been sent by the QuickCam, an End of Frame MUST be sent before anything else is done.
After this, the QuickCam should be ready to accept commands.
The sequence to take an image is:
The details of the parameter for the Get Video Frame command (#7) are:
There are many bit-level handshaking interactions which need to occur. The Reset_QCam and Send_Command subroutines interact with the QuickCam, controlling the required bits on the parallel-port and using the needed timing or handshaking. The subroutine Port_Turnaround puts the QuickCam into a send-data mode for the images.
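Putting the pieces together, a single frame grab follows the ordering described above. The outline below is only a sketch: Reset_QCam, Send_Command and Port_Turnaround are the subroutines named in this article, while FRAME_PARAM, the read loop and the Send_EOF routine are placeholders I have assumed for illustration.

* Outline of one frame grab (ordering per the text; details are placeholders).
Grab_Frame
         jsr   Reset_QCam       ; put the camera into a known state
         ldaa  #7               ; Get Video Frame command
         ldab  #FRAME_PARAM     ; parameter byte (placeholder equate)
         jsr   Send_Command
         jsr   Port_Turnaround  ; Byte mode: switch DB0-DB7 to inputs
* ...loop here reading pixel data, handshaking with CamRdy2,
*    packing pixels into the 32K RAM...
         jsr   Send_EOF         ; End of Frame -- required before anything else
         rts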
Here is a resulting image taken using the included assembly language code and the BASIC code given below running on a PC. The image is of 3 washers on white paper. If you squint your eyes you can clearly see the washers, including the sides and the holes.
..............................................................................
......................0BB0....................................................
....................0I=|I!HO..................................................
...................O* |H!!O...........0BOOB.................................
..................Ol +IHO!O........BHl|***|lO0..............................
.................0!|:=;=HBH!!.......!*++-++++=*!0.............................
.................OHIIllIOOl|!.....B|=--;;;+-++=*H.............................
.................OHHHBOB0H|+!....O*+--;;-==+++-=|0............................
.................BHHOB00H-+=....0*+-;;=IOOH|=+++*O............................
..................OHHH!l+-=0....I=---|B....H*+++=O............................
...................BOI|-;!.....B|=++|B.....H=---*O............................
.....................B!O.......H*=+=I.....0|+--+|0............................
............0HOHB0.............!|===l0...0l;;;++!.............................
..........BI+*IHHOB............!*=+==l!Hl=: :-+|B.............................
.........O* :lHH!O............O|==++++;: :;-*H..............................
........BI :lO!IH.............I*==+-;: :;--*H...............................
........O|;-;=!HI*!.............Bl*==+-;-;-+|B................................
........O!I!|I!H|lH..............0!l*=+;-=lO0.................................
.......0O!HOOBB!==B................0O!l+lO....................................
........BOOBBOl*+I............................................................
.........BO!I*;:!.............................................................
..........O!=-+B..............................................................
...........0OB................................................................
..............................................................................
The image is transferred from the HC11 to the PC as an ASCII character for each pixel. In the HC11 code, the 4-bit pixel value is associated with an ASCII value using a 16-character lookup table:
fcc '.,:;-+=*|lI!HOB0'
I selected characters whose coverage of the character cell increases progressively, giving a light-to-dark effect. To display larger images, you would write a program on the PC to accept direct pixel data from the HC11 (with a header specifying the X and Y dimensions as well as the bit depth, to be quite general). The PC program would then create an image using typical graphics routines. This is just to see what the HC11 sees. The real challenge is to have the HC11, as a robotic brain, look at the image, decide what it sees, and make intelligent behavioral changes.
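For reference, here is a minimal sketch (not lifted from the article's code) of how that lookup can be done on the HC11, assuming the 4-bit pixel value is in accumulator B:

* Convert a 4-bit pixel value in B to its ASCII gray character in A.
GrayTab  fcc   '.,:;-+=*|lI!HOB0'
Pix2Asc  ldx   #GrayTab      ; X -> start of the 16-character table
         andb  #$0F          ; use only the low 4 bits as the index
         abx                 ; X = table base + pixel value
         ldaa  0,X           ; A = ASCII character for this gray level
         rts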
Here is a very simple BASIC program to run on the PC which was used to display the 3-washer image above using ASCII characters sent directly to it from the HC11. This is about as simple a display program as one can write. I've kept, as comments, some lines of code which I experimented with (REM stands for remark). Notice that I expect exactly 110 characters to be sent from the HC11. Also, the HC11 sends end-of-line characters to align the rows within the image.
This BASIC program uses channel COM1 for communication with the HC11. It also writes the received image to a file called "qcam.txt".
CLOSE #1
CLOSE #2
CLOSE
OPEN "COM1:9600,N,8,1,cd0,cs0,ds0,op0,rs" FOR INPUT AS #1
REM open options, CD......
OPEN "qcam.txt" FOR OUTPUT AS #2
I = 0
REM WHILE (I < 24 * 78 * 3)
WHILE (I < 110)
  S$ = "tpd"
  INPUT #1, S$
  WRITE #2, S$
  REM j = 65
  REM j = INP(&H3FC)
  REM WRITE #2, CHR$(j)
  I = I + 1
WEND
CLOSE #1
CLOSE #2
QuickCam Assembly Language Code
I do recall that, when looking at the resulting images, the background sometimes flipped from "white" to "black". This is caused by the ordering of the pixel values: the "white" value ends up at the "black" end of the scale, and my code did not adjust for this.
I hope this is helpful. Please e-mail me with questions, problems, and hopefully
success stories...
Tom Dickens: thomas.p.dickens@boeing.com