
Pixycam 2 and FIRST Robotics, LabView


Our FRC team wants to use the Pixycam 2 on our robot for the 2019 competition, but we are new to vision and the LabVIEW dashboard isn’t easy to adapt to (at least not for us).
Suggestions welcome.
Charlie Affel,
mentor, FRC 423


Hello Charlie,
I must admit that I don’t understand the different FIRST competitions. Some teams/competitions are using Java, some are using LabVIEW. You’re using LabVIEW with RoboRIO hardware, is this correct?



Hi Charlie,
I found this:

But it’s Java. Is it true that you have a choice between Java and LabVIEW?


Yes, exactly.

You can see more about our newest contest at:

Our challenge is that we use LabVIEW, not Java or one of the lower-level languages.
Adaptations, interfaces, and subroutines by National Instruments and Worcester Polytechnic for LabVIEW sometimes fall behind those of the more popular languages.

Right now, we have images from the Pixycam 2 on our Windows screen, but they are not yet visible through the required LabVIEW ‘Dashboard’.

We have just three weeks to complete the machine, including driver testing.

Charlie Affel, mentor FRC team 423.
c 215 605 3924


Thanks for the suggestion.
FIRST FRC (Sr. High School) teams can choose Java, LabVIEW, or some C variants.
We have just three weeks left to complete the machine and driver testing, so it’s too late to change the SW design paradigm to Java for this season.


Hi Charlie,

How are you connecting RoboRIO to the Pixycam 2? I2C, SPI, UART?

I can probably help get data from Pixycam into LabVIEW. Are you using FRC 2019? (http://www.ni.com/download/labview-for-frc-18.0/7841/en/)


Thanks for your interest.
We are using the latest LabVIEW version allowed, 2018, with updates as recent as mid-January.

Right now (tonight) we plan to connect the Pixy2 to the RIO using the USB port (for the live video), which must appear on the LabVIEW ‘Dashboard’ for #1 below.
I do have our Pixy2 showing live video as expected on a laptop screen (using the PixyMon program on Windows, connected via USB).
We also have a working Microsoft webcam to use in testing.

The RoboRIO’s USB port has some limitations and the LabVIEW Dashboard itself can be a hurdle, so it’s not a done deal for those steps.
#1: Robot-mounted cameras are essential to teleoperated driving during the initial 15-second blackout of driver vision.
#2: Next, we hope to have the Pixy do some image finding to direct the robot to a colored target on the wall (reflective tape illuminated by our green ring light).
#3: There are also wide lines of tape on the floor to guide the machine the last two feet to an object (a scoring opportunity).
I plan to connect the Pixy2 I2C outputs to the I2C inputs on the RoboRIO. We have LabVIEW code snippets (VIs) that were designed (two years ago) to read x-y data directly from the Pixy into LabVIEW.
We are aware of much Arduino and Java work for the Pixy out there. I consider these to be either out of reach or a bit late, given our commitment to LabVIEW for the 2019 contest year.
My hope is to live within the Pixy’s built-in image-following capabilities.
The machine must be completed by Presidents’ Day, and each subteam, including programming, is currently maxed out…

Charlie Affel, mentor FRC 423


Thanks for the link.
Assuming the notes are correct, we are using LabVIEW 2019. This is an improvement over prior years, when we were licensed with an earlier (frozen?) version of LabVIEW for the whole contest year.
Again, we have downloaded, installed, and updated the entire current LabVIEW suite for FIRST Robotics as recently as two weeks ago, then re-imaged the RoboRIO after all of this was completed.
It was encouraging that these numerous steps went more smoothly than last year (it’s a two- to three-hour process). The error messages are confusing, and sometimes multiple tries have been required; no rhyme or reason, just dumb persistence.



For #1, I don’t believe that Pixycam 2 will work as a USB web camera… someone can correct me if I’m wrong on that. (See: http://www.cmucam.org/boards/9/topics/7409)

For #2 and #3, Pixycam 2 is perfect for this… I got my hands on a RoboRIO. Let me get some code in place and submitted to the git repository…


Regarding #1, I’m not sure if this is a UVC kind of thing or possible through some kind of (custom) LabVIEW driver. Pixy2 has a libusb-based C++ library called libpixy2usb that includes a get raw frame function that could be used to rapidly get frames for streaming and/or displaying live video. But I don’t know if libusb and C-code can be used on the hardware in question.

mightysorensen, be sure to check out our porting guide, which includes protocol reference.
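For anyone working at the byte level from a generic I2C/SPI/UART VI, the porting guide’s request framing can be sketched roughly as below. This is plain Python only for illustration, the helper names are my own, and the type codes and block-record layout should be double-checked against the protocol reference:

```python
# Sketch of Pixy2 serial-protocol framing, per my reading of the porting
# guide's protocol reference. Helper names are made up for illustration;
# verify the type codes and field layout against the official docs.
import struct

SYNC_NO_CHECKSUM = 0xC1AE  # request sync word (0xAE 0xC1 on the wire)

def make_request(packet_type: int, payload: bytes = b"") -> bytes:
    """Build an un-checksummed request frame: sync, type, length, payload."""
    return struct.pack("<HBB", SYNC_NO_CHECKSUM, packet_type, len(payload)) + payload

# getBlocks (CCC mode): type 0x20, payload = [signature map, max blocks].
# The signature map is a bitmap: bit 0 = signature 1, bit 1 = signature 2,
# ... so 0x01 requests signature 1 only and 0xFF requests all seven.
get_blocks = make_request(0x20, bytes([0x01, 0x08]))

def parse_block(data: bytes) -> dict:
    """Unpack one 14-byte block record from a getBlocks response payload."""
    sig, x, y, w, h, angle, index, age = struct.unpack("<5HhBB", data[:14])
    return {"signature": sig, "x": x, "y": y, "width": w, "height": h,
            "angle": angle, "index": index, "age": age}
```

The same framing pattern should apply to the other request types in the reference; only the type byte and payload change.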


Thanks for your help!



I’ll check it out.

I know that RoboRIO is using Linux (http://www.ni.com/white-paper/14627/en/), so it should be possible to get some frames out.



Pixy2 support was added here: https://github.com/charmedlabs/Pixy2LabVIEW

We are initially supporting I2C only. SPI and UART are in the works.

Let us know if this meets your needs and how we can improve Pixy2 support for you!



Thank you so much, David, for this.
I will work on getting our programming team to try it out and will spread the word to other LabVIEW teams.

Charlie FRC 423


You’re welcome! Please let us know what is missing so we can improve!



I got the Pixy2 camera and I have spent time reading the wiki and I have been playing with it, but I still don’t know how to use it for what I need.

I am also helping with my daughter’s FRC robotics team, and I am connecting the Pixy2 to a National Instruments RoboRIO using the I2C interface. I installed the VIPM package support for LabVIEW that David (mightysorensen) uploaded to GitHub (by the way, the examples don’t install). I created a new robot project using LabVIEW, initialized using the I2C Init.vi, selected “I2C On-Board” for the I2C bus, and used “0x54” for the device address. To get the block information I need, I am using the CCC.vi, but I don’t know what to enter for the Signature Map and Max Blocks inputs.

I am also not sure how to teach the camera to recognize the targets that I need for the competition. I am enclosing some sample images of the targets provided with the LabVIEW example from National Instruments.
Any advice will be greatly appreciated. (Attached: CargoLine24in, CargoLine60in, LoadingStraight36in, LoadingStraight108in, RocketPanelStraight48in, RocketPanelStraight84in)


I am able to talk to the Pixy2 with the example uploaded by David. Now I just need some help getting the Pixy2 to learn the targets. When I try to teach the Pixy2 a target, it learns (grabs) a big rectangle; I would like it to learn and recognize just the reflective tape. Any advice?


Hi Nelson,

I’m glad the example worked for you. Please let me know how I can make the examples better and easier to find.

Here are some hints that may help:

I think you’ll want to use Pixy2 in two different modes:

  • CCC mode to target the green LED arrays or the red balls or maybe even the yellow circular target.
  • Line tracking mode to find the white lines on the floor.

In CCC mode, Pixy returns the x,y position of the target in the image along with width and height. You can use the width and height (area) to determine how close the targets are and prioritize nearer targets depending on your strategy.
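That prioritization step can be sketched as follows (plain Python standing in for the LabVIEW wiring, which is graphical; the `Block` fields mirror what CCC mode returns, and the names are just illustrative):

```python
# Pick the "nearest" CCC block by pixel area (width x height), on the
# assumption that a larger apparent size means a closer target.
from typing import List, NamedTuple, Optional

class Block(NamedTuple):
    signature: int
    x: int        # block center, image x coordinate
    y: int        # block center, image y coordinate
    width: int
    height: int

def nearest_block(blocks: List[Block], wanted_signature: int) -> Optional[Block]:
    """Return the largest (presumably closest) block with the wanted signature."""
    candidates = [b for b in blocks if b.signature == wanted_signature]
    if not candidates:
        return None
    return max(candidates, key=lambda b: b.width * b.height)
```

In a dashboard loop, you would call this once per frame on the array that CCC.vi returns and steer toward the winner’s x offset from image center.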

I think training Pixy signatures is most easily done using PixyMon. You should make sure that your target is being recognized well using PixyMon. Remember that you can connect both PixyMon and roboRIO at the same time.

Line tracking mode will work better for finding and following the lines on the floor. You’ll have to tell Pixy to track white lines on a dark background. You can do this through Set Mode.vi (and use mode = 0x80).
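At the wire level, that mode byte is just the one-byte payload of a small setMode request, which is presumably what Set Mode.vi sends under the hood. A sketch, with the 0x36 type code taken from my reading of the protocol reference (verify it there before depending on it):

```python
# Build a Pixy2 setMode request frame: sync (0xAE 0xC1), type 0x36,
# payload length 1, then the mode byte. Type code per the published
# protocol reference; double-check it there.
import struct

LINE_MODE_WHITE_LINE = 0x80  # track light lines on a dark background

def set_mode_frame(mode: int) -> bytes:
    """Return the raw bytes of a setMode request for the given mode byte."""
    return struct.pack("<HBBB", 0xC1AE, 0x36, 1, mode)
```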

You may also need to adjust where Pixy is looking. You can use the pan-tilt kit: https://pixycam.com/pixy2-pan-tilt-kit/ and the Set Servos.vi to do this.

Let me know what questions you have. I’ll do my best to help!



Hey Nelson,

additional info on teaching Pixy an object can be found here: https://docs.pixycam.com/wiki/doku.php?id=wiki:v2:teach_pixy_an_object_2

You might especially look at the “Signature Tuning” sliders: https://docs.pixycam.com/wiki/doku.php?id=wiki:v2:teach_pixy_an_object_2#signature-tuning

Hope this helps!