diff --git a/Using-TensorFlow-Lite-for-the-Ultimate-Goal-Challenge.md b/Using-TensorFlow-Lite-for-the-Ultimate-Goal-Challenge.md
index 2943e73..63e92c1 100644
--- a/Using-TensorFlow-Lite-for-the-Ultimate-Goal-Challenge.md
+++ b/Using-TensorFlow-Lite-for-the-Ultimate-Goal-Challenge.md
@@ -23,4 +23,12 @@
 Click on the following links to learn more about these sample Op Modes.
 
 * [Blocks Tensor Flow Object Detection Example](Blocks-Sample-TensorFlow-Object-Detection-Op-Mode)
 * [Java Tensor Flow Object Detection Example](Java-Sample-TensorFlow-Object-Detection-Op-Mode)
+### Using a Custom Inference Model
+Teams have the option of using a custom inference model with the FIRST Tech Challenge software. For example, some teams prefer to use the [TensorFlow Object Detection API](https://github.com/tensorflow/models/tree/master/research/object_detection) to create an enhanced model of the game elements, or they might want to create a custom model to detect entirely different objects. Other teams might also want to use an available pre-trained model to build a robot that can detect common everyday objects (for demo or outreach purposes, for example).
+
+The FTC software includes sample Op Modes (Blocks and Java versions) that demonstrate how to use a custom inference model:
+
+* [Blocks Tensor Flow Object Detection Example](Blocks-Sample-TensorFlow-Object-Detection-Op-Mode)
+* [Java Tensor Flow Object Detection Example](Java-Sample-TensorFlow-Object-Detection-Op-Mode)
+
\ No newline at end of file
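
Below is a minimal Java sketch of the custom-model pattern that the linked Java sample Op Mode demonstrates, assuming the FTC SDK's `TFObjectDetector` API and its `loadModelFromFile()` method. The model path, label names, webcam configuration name, and Vuforia license key shown here are placeholders, not values from the samples.

```java
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import org.firstinspires.ftc.robotcore.external.ClassFactory;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.firstinspires.ftc.robotcore.external.navigation.VuforiaLocalizer;
import org.firstinspires.ftc.robotcore.external.tfod.Recognition;
import org.firstinspires.ftc.robotcore.external.tfod.TFObjectDetector;

import java.util.List;

@TeleOp(name = "Custom TFOD Sketch")
public class CustomTfodSketch extends LinearOpMode {

    // Placeholder path and labels: copy your own .tflite model to the Robot Controller
    // and list the labels in the same order they were used during training.
    private static final String CUSTOM_MODEL_FILE = "/sdcard/FIRST/tflitemodels/custom_model.tflite";
    private static final String[] LABELS = { "LabelA", "LabelB" };

    @Override
    public void runOpMode() {
        // Vuforia supplies camera frames to the TensorFlow object detector.
        VuforiaLocalizer.Parameters vuforiaParams = new VuforiaLocalizer.Parameters();
        vuforiaParams.vuforiaLicenseKey = "-- YOUR VUFORIA KEY --";   // placeholder
        vuforiaParams.cameraName = hardwareMap.get(WebcamName.class, "Webcam 1");
        VuforiaLocalizer vuforia = ClassFactory.getInstance().createVuforia(vuforiaParams);

        // Create the detector, then load the custom model from a file instead of
        // loading the built-in game model from an asset.
        int monitorViewId = hardwareMap.appContext.getResources().getIdentifier(
                "tfodMonitorViewId", "id", hardwareMap.appContext.getPackageName());
        TFObjectDetector.Parameters tfodParams = new TFObjectDetector.Parameters(monitorViewId);
        tfodParams.minResultConfidence = 0.75f;
        TFObjectDetector tfod = ClassFactory.getInstance().createTFObjectDetector(tfodParams, vuforia);
        tfod.loadModelFromFile(CUSTOM_MODEL_FILE, LABELS);
        tfod.activate();

        waitForStart();

        while (opModeIsActive()) {
            // Report any newly updated recognitions on the Driver Station.
            List<Recognition> recognitions = tfod.getUpdatedRecognitions();
            if (recognitions != null) {
                for (Recognition recognition : recognitions) {
                    telemetry.addData(recognition.getLabel(), "%.2f", recognition.getConfidence());
                }
                telemetry.update();
            }
        }

        tfod.shutdown();
    }
}
```

The intended difference from the standard game-element samples is only the call that loads the model: a custom model is read from a file on the Robot Controller rather than from the bundled game asset.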