
Track Following Robot


Hey guys,

I am currently doing a project where we have to race around a track whilst avoiding obstacles (some of them moving). The robot has two motors (one left and one right) and a servo motor attached to the Pixy (so we can look at the different lanes/obstacles).

So the track is configured as follows:
3 lanes, left, center and right (all different colors and set to different signatures)
Obstacles (same color so set to only one other signature)

I can easily follow each of those lanes, I can stop when I see the obstacle (avoidance not perfect yet) but am having trouble with:

  • Taking all lanes into account, with the center lane used to generate the trajectory, how can I get the robot to actually follow the racing line (i.e. identify the apex and use it to reduce lap times)?

  • Since it’s a circuit, the camera can pick up things that are a bit in the distance; how should I compensate for that?

Thanks in advance for the help!



I’m having trouble understanding your question. I don’t know what you mean by “racing line” and “apex”. Are you using Pixy2?



Hey Edward,

I’m using the Pixy2 to track the lanes.

What I mean by racing line is taking the line that, within the limits, takes the shortest path or the most optimal line. The line makes use of the entire width of the track to lengthen the radius of a turn: entering at the outside edge, touching the “apex”—a point on the inside edge—then exiting the turn by returning outside.

So, essentially, you want to almost “cut the corner” around a track. The corner can be either a left or a right turn (the edge lanes), so in this case we have to choose how to optimally track either signature 2 or 3 (the side lanes, since signature 1 is the center line, which gives the general trajectory).
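One way to approximate corner-cutting is to blend the center-lane error with the error toward the inside (apex-side) lane, weighting the inside lane more heavily as the turn gets sharper. A minimal sketch, where the function name, the blend rule, and the `turn_sharpness` input are my own assumptions (not part of the Pixy API):

```python
def racing_line_error(center_err, inside_err, turn_sharpness, max_blend=0.7):
    """Blend the center-lane angular error toward the inside-lane error.

    center_err: angular error to the center lane (degrees)
    inside_err: angular error to the inside (apex-side) lane (degrees)
    turn_sharpness: 0.0 on a straight, up to 1.0 in the tightest turn
    max_blend: cap on how far the target moves toward the inside lane
    """
    # Clamp sharpness to [0, 1], then scale the blend weight by it
    w = max_blend * max(0.0, min(1.0, turn_sharpness))
    return (1.0 - w) * center_err + w * inside_err
```

On a straight (`turn_sharpness == 0`) this tracks the center lane exactly; in a turn it steers the target toward the inside lane, which is roughly what touching the apex does. The sharpness estimate could come from the rate of change of the center-line error, but that choice is also an assumption.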


Note that the line tracking algorithm is a greyscale (no color) algorithm. The color connected components algorithm uses color cues (aka signatures). The line tracking and color connected components algorithms can’t be run simultaneously, so when you refer to signatures in the context of line tracking, it sounds like you are mixing the two algorithms.



In fact, what we do is:

self.bot.setServoPosition(0)  # set servo to centre
self.bot.setMotorSpeeds(0, 0)  # set racer to stop
while True:
	smallestCenterLine = -1
	# go through all the blocks from the Pixy to find the center line block
	for i in range(0, self.newCount):
		if self.newBlocks[i].m_signature == self.centerLineID:
			smallestCenterLine = i
	if smallestCenterLine < 0:
		# no center line visible: stop the racer and wait for new blocks
		self.bot.setMotorSpeeds(0, 0)
	else:
		# drive while we see a center line:
		# compute the center line angular error as derived from the pixel error
		centerLineError = (self.newBlocks[smallestCenterLine].m_x - self.pixyCenterX) * self.pixyX_FoV / self.pixyMaxX
		racerError = centerLineError + self.bot.servoPosition
		self.drive(speed, racerError / self.pixyX_FoV)
		### Come up with a steering command to send to self.drive(speed, steering) function

So we’re following the signature as long as we can see it.
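For the `###` placeholder, a proportional steering command is a common starting point: normalize the angular error by the field of view, clamp it, and map it to differential wheel speeds. A sketch under my own assumptions (the gain, the clamping, and the differential mapping are illustrative, not the Pixy library's API):

```python
def proportional_steering(racer_error, fov, gain=1.0):
    """Map an angular error (degrees) to a steering command in [-1, 1]."""
    steering = gain * racer_error / fov
    return max(-1.0, min(1.0, steering))

def differential_speeds(speed, steering):
    """Turn a base speed and a steering command into (left, right) motor speeds.

    Positive steering slows the right wheel and speeds up the left,
    turning the racer to the right.
    """
    left = speed * (1.0 + steering)
    right = speed * (1.0 - steering)
    return left, right
```

Increasing the gain makes the racer snap back to the line faster but risks oscillation; adding a derivative term on the error is the usual next step if it wobbles.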