Updated Blocks Sample Op Mode for TensorFlow Object Detection (markdown)
Let's take a look at the initial blocks in the op mode. The first block in the op mode initializes the Vuforia and TensorFlow libraries.
<p align="center">[[https://raw.githubusercontent.com/wiki/WestsideRobotics/FTC-training/Images/010_Blocks_TFOD_webcam_init.png]]<br/>Initialize the Vuforia and TensorFlow libraries.<p>
You can initialize both the Vuforia and the TensorFlow libraries in the same op mode. This is useful, for example, if you would like to use the TensorFlow library to recognize the Duck and then use the Vuforia library to help the robot autonomously navigate on the game field.
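
If you program in Java rather than Blocks, the same initialization pattern looks roughly like the sketch below. This is a minimal, hedged example modeled on the FTC SDK's Vuforia/TFOD-era Java webcam sample; the op mode class name, the webcam configuration name "Webcam 1", the license-key placeholder, and the Freight Frenzy model asset and labels are assumptions you would replace with your own values.

```java
// Minimal sketch: initialize Vuforia and TensorFlow in one linear op mode.
// Assumes the Vuforia/TFOD-era FTC SDK; "Webcam 1", the license key, and the
// model asset/labels are placeholders for your own configuration.
package org.firstinspires.ftc.teamcode;

import com.qualcomm.robotcore.eventloop.opmode.Autonomous;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;

import org.firstinspires.ftc.robotcore.external.ClassFactory;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.firstinspires.ftc.robotcore.external.navigation.VuforiaLocalizer;
import org.firstinspires.ftc.robotcore.external.tfod.TFObjectDetector;

@Autonomous(name = "TFOD Webcam Sketch")
public class TfodWebcamSketch extends LinearOpMode {

    private static final String VUFORIA_KEY = "-- YOUR KEY HERE --";

    private VuforiaLocalizer vuforia;
    private TFObjectDetector tfod;

    @Override
    public void runOpMode() {
        // Initialize Vuforia first; TensorFlow uses its camera frames.
        VuforiaLocalizer.Parameters vuforiaParams = new VuforiaLocalizer.Parameters();
        vuforiaParams.vuforiaLicenseKey = VUFORIA_KEY;
        vuforiaParams.cameraName = hardwareMap.get(WebcamName.class, "Webcam 1");
        vuforia = ClassFactory.getInstance().createVuforia(vuforiaParams);

        // Initialize TensorFlow, passing the Vuforia instance in as the frame source.
        int monitorViewId = hardwareMap.appContext.getResources().getIdentifier(
                "tfodMonitorViewId", "id", hardwareMap.appContext.getPackageName());
        TFObjectDetector.Parameters tfodParams = new TFObjectDetector.Parameters(monitorViewId);
        tfodParams.useObjectTracker = true;   // also track detections between TensorFlow passes
        tfod = ClassFactory.getInstance().createTFObjectDetector(tfodParams, vuforia);
        tfod.loadModelFromAsset("FreightFrenzy_BCDM.tflite",
                "Ball", "Cube", "Duck", "Marker");

        tfod.activate();
        waitForStart();
        // ... detection loop and/or Vuforia navigation would go here ...
        tfod.shutdown();
    }
}
```

Note that the TensorFlow detector is created from the Vuforia instance, which is why both libraries end up initialized in the same op mode.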
Note that in this example the ObjectTracker parameter is set to true for this block, so an _object tracker_ will be used, in addition to the TensorFlow interpreter, to keep track of the locations of detected objects. The object tracker _interpolates_ object recognitions so that results are smoother than they would be if the system were to solely rely on the TensorFlow interpreter.
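
In the Java sketch above, this corresponds to the `tfodParams.useObjectTracker = true;` line (an assumed field name from the Vuforia/TFOD-era SDK). The smoothing is most noticeable when the op mode polls for recognitions each time through its loop, roughly as in the sketch below; the loop structure and telemetry captions are illustrative, not the only way to do it.

```java
// Sketch of a detection loop for the op mode above (it would replace the
// "detection loop" placeholder). Requires:
//   import java.util.List;
//   import org.firstinspires.ftc.robotcore.external.tfod.Recognition;
while (opModeIsActive()) {
    // Returns the latest recognitions if they changed since the last call; otherwise null.
    List<Recognition> recognitions = tfod.getUpdatedRecognitions();
    if (recognitions != null) {
        for (Recognition r : recognitions) {
            telemetry.addData(r.getLabel(), "conf=%.2f, left=%.0f, top=%.0f",
                    r.getConfidence(), r.getLeft(), r.getTop());
        }
        telemetry.update();
    }
}
```

Because the object tracker keeps updating positions between full TensorFlow passes, the bounding boxes reported here are the smoother, interpolated results described above.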