98 Commits

SHA1 Message Date
315e3b4240 Add sample file to debug on-board IMU 2024-09-13 13:40:21 -07:00
fb570d6be8 Static-ization of variables that are modified constantly 2024-09-13 13:36:51 -07:00
a0126ba1a2 A bit of cleanup 2024-09-13 13:21:47 -07:00
fb7f160a9f Updated motor/encoder names with that of SCDS Robot configuration 2024-08-29 12:24:10 -07:00
af672c7807 Bump RR 2024-07-21 21:55:34 -07:00
d7f145300f Merge FTC SDK v9.2 2024-07-21 21:14:41 -07:00
6ce588fc79 Merge pull request #990 from FIRST-Tech-Challenge/20240701-085519-release-candidate
FtcRobotController v9.2
2024-07-11 12:56:07 -07:00
48019d026b FtcRobotController v9.2 2024-07-11 12:31:10 -07:00
416ba0d1ad Bump RR 2024-03-16 08:58:51 -07:00
6b35895532 Use new params 2024-03-02 22:56:53 -08:00
03d95d791e Rename LazyImu drive members
Also makes sure the mecanum drive localizer fetches the IMU
on construction.
2024-02-20 22:07:37 -08:00
9d2cd48287 Stop reading localizer inputs on construction
This ensures that any motion between construction and first update
will be ignored (this is usually the period between init and start
of an op mode). If teams want to track pose during that period,
they can call `updatePoseEstimate()` explicitly. This matches the
behavior of the 0.5.x localizers (without the annoying reset on
every pose estimate set).

The localizers also now log much more data to help troubleshoot
localization issues in the future.
2024-02-19 14:23:43 -08:00
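For teams that do want tracking during the init-to-start window, here is a minimal sketch of the explicit call mentioned above. It assumes the quickstart's MecanumDrive with its public pose field and updatePoseEstimate() method; class, package, and member names may differ in your copy.

package org.firstinspires.ftc.teamcode;

import com.acmerobotics.roadrunner.Pose2d;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;

// Sketch only: explicitly opt back in to pose tracking between init and start.
// Assumes the quickstart's MecanumDrive (same package) exposes a public `pose`
// field and an `updatePoseEstimate()` method.
@TeleOp(name = "InitPoseWatch (sketch)")
public class InitPoseWatch extends LinearOpMode {
    @Override
    public void runOpMode() {
        MecanumDrive drive = new MecanumDrive(hardwareMap, new Pose2d(0, 0, 0));

        while (opModeInInit()) {
            drive.updatePoseEstimate(); // explicit call; localizers no longer read on construction
            telemetry.addData("x", drive.pose.position.x);
            telemetry.addData("y", drive.pose.position.y);
            telemetry.addData("heading (deg)", Math.toDegrees(drive.pose.heading.toDouble()));
            telemetry.update();
        }

        waitForStart();
        // ... normal op mode logic from here ...
    }
}

The op mode name above is made up; the point is only that pre-start motion is ignored unless updatePoseEstimate() is called during init.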
ecc49aa778 Merge SDK v9.1 2024-02-17 10:51:28 -08:00
1da45a36c6 Merge pull request #941 from FIRST-Tech-Challenge/20240215-115542-release-candidate
FtcRobotController v9.1
2024-02-16 14:13:01 -08:00
c303962469 FtcRobotController v9.1 2024-02-15 13:10:49 -08:00
8e84495f1a Add drawing class 2024-01-31 17:52:54 -08:00
01d55d76f4 Draw robot in LocalizationTest (fixes #360) 2024-01-30 22:10:27 -08:00
c6ef075c48 Update to lazy IMU (fixes #346) 2024-01-25 22:27:07 -08:00
5cf1596910 Set better default lateralInPerTick 2024-01-14 09:09:14 -08:00
7381cb6b23 Bump RR FTC 2023-12-24 14:40:51 -08:00
a01ac85ef7 Better spline test 2023-12-24 13:30:20 -08:00
a890105110 Add encoder reverse directions to dead wheel localizers
Direction doesn't matter for position regression, but it does matter
for `inPerTick`, `lateralInPerTick`. And everything makes more sense
if the directions are correct.
2023-12-23 13:23:43 -08:00
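For reference, the reversal this commit describes normally lives in the dead-wheel localizer's constructor. The following is a hedged sketch of that pattern: the device names "par0"/"par1"/"perp" and the OverflowEncoder/RawEncoder wrappers follow the quickstart's general conventions, but check your own ThreeDeadWheelLocalizer for the exact members.

package org.firstinspires.ftc.teamcode;

import com.acmerobotics.roadrunner.ftc.Encoder;
import com.acmerobotics.roadrunner.ftc.OverflowEncoder;
import com.acmerobotics.roadrunner.ftc.RawEncoder;
import com.qualcomm.robotcore.hardware.DcMotorEx;
import com.qualcomm.robotcore.hardware.DcMotorSimple;
import com.qualcomm.robotcore.hardware.HardwareMap;

// Hypothetical setup fragment, not the quickstart's actual ThreeDeadWheelLocalizer.
class DeadWheelEncoders {
    final Encoder par0, par1, perp;

    DeadWheelEncoders(HardwareMap hardwareMap) {
        par0 = new OverflowEncoder(new RawEncoder(hardwareMap.get(DcMotorEx.class, "par0")));
        par1 = new OverflowEncoder(new RawEncoder(hardwareMap.get(DcMotorEx.class, "par1")));
        perp = new OverflowEncoder(new RawEncoder(hardwareMap.get(DcMotorEx.class, "perp")));

        // Reverse any pod that counts down when the robot moves forward (parallel pods)
        // or left (perpendicular pod); with correct directions, inPerTick and
        // lateralInPerTick keep their expected signs.
        par0.setDirection(DcMotorSimple.Direction.REVERSE);
    }
}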
ce080b0b51 Always throw RuntimeException
The FTC SDK only catches `Exception` thrown by user code,
so switch all throwables to `RuntimeException`.
2023-12-20 18:10:52 -08:00
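A hedged illustration of that rule (the helper below is hypothetical, not part of the quickstart):

// The FTC SDK only catches Exception thrown by user code, so fatal misconfiguration
// should surface as a RuntimeException rather than an Error-derived throwable that
// would bypass the Driver Station error report.
final class Checks { // hypothetical helper
    private Checks() {}

    static void require(boolean condition, String message) {
        if (!condition) {
            throw new RuntimeException(message);
        }
    }
}

A call site might look like Checks.require(inPerTick != 0, "inPerTick must be tuned first"); the parameter name is only an example.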
e4cff7367d Bump RR FTC 2023-12-17 12:45:19 -08:00
263100de3b Bump RR 2023-12-12 09:48:38 -08:00
02d8ea7615 Move drive encoder reversal before first get
Changing the direction of an encoder after reading it the first time
will create a potentially massive delta in the first odometry update.
2023-12-09 09:17:20 -08:00
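Concretely, the safe ordering inside a drive-encoder localizer constructor looks roughly like the sketch below (same RR FTC wrapper types and imports as the dead-wheel sketch above; member names and the exact read call are illustrative).

// Hypothetical localizer fragment; only the construction order matters here.
class DriveEncoderBaseline {
    final Encoder leftFront;
    int lastLeftFrontPos;

    DriveEncoderBaseline(HardwareMap hardwareMap) {
        leftFront = new OverflowEncoder(new RawEncoder(hardwareMap.get(DcMotorEx.class, "leftFront")));

        // 1) Reverse BEFORE the first read. Odometry differences successive readings,
        //    so flipping the sign after a baseline read would make the next update see
        //    a spurious delta of roughly twice the current tick count.
        leftFront.setDirection(DcMotorSimple.Direction.REVERSE);

        // 2) Only then take the baseline reading.
        lastLeftFrontPos = leftFront.getPositionAndVelocity().position;
    }
}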
01453c24d4 Add TODO comments 2023-12-05 23:18:58 -08:00
d586fc3e74 Bump RR FTC 2023-11-26 14:28:01 -08:00
416cab6f51 Log power commands 2023-11-26 13:54:10 -08:00
7ad781b8ac Replace log() with toDouble() 2023-11-26 11:46:46 -08:00
79d6297a46 Add more debug messages 2023-11-20 22:59:36 -08:00
0e66ff3715 Add missing imports 2023-11-20 20:32:35 -08:00
ddfb994c78 Localization test degrees heading 2023-11-20 20:31:08 -08:00
df17f93fd3 Fix yaml 2023-11-20 00:21:34 -08:00
a6ed85a5c4 Merge branch 'master' of github.com:acmerobotics/road-runner-quickstart 2023-11-20 00:13:42 -08:00
d955adf47c Add bug template 2023-11-20 00:13:30 -08:00
071e405f92 Require odometry wheel positions to run ManualFeedbackTuner (#292) 2023-11-12 21:55:25 -08:00
e69e475565 Bump dash 2023-11-05 16:56:40 -08:00
e291d590a6 Ensure 2 samples when drawing paths 2023-10-21 14:19:36 -07:00
a6b4dbebc4 Replace rear with back 2023-10-17 22:48:23 -07:00
42246e0296 Drive view use feedforward factory (fixes #258) 2023-10-15 09:10:55 -07:00
57db0dc66b Pipe feedforward into drive view factory 2023-10-10 14:49:06 -07:00
1b1197db0c Re-create feedforward objects 2023-10-09 22:07:21 -07:00
2584ce8d45 Add more dash tunables 2023-10-08 15:43:48 -07:00
6115d0311f Bump RR FTC 2023-10-08 15:35:35 -07:00
fc9b93aab4 Fix feedback tuner start 2023-10-07 09:58:00 -07:00
4ff75136aa Remove outdated TODO 2023-10-07 09:39:27 -07:00
dd6d43d9c0 Change default par1YTicks
This avoids populating `NaN` in odometry-derived values.
2023-10-01 15:54:38 -07:00
219e07813d Merge SDK v9.0.1 2023-09-30 17:08:15 -07:00
7cce1c43a9 Apply heading velocity correction in localizer 2023-09-24 15:29:43 -07:00
629c103d77 Bump dash 2023-09-22 18:26:54 -07:00
973ca84a46 Bump RR FTC 2023-09-19 22:20:41 -07:00
5f2fcaf51f Dead wheel overflow encoders by default 2023-09-18 22:51:19 -07:00
334252c421 Bump RR FTC 2023-09-17 16:12:05 -07:00
3c2d54523a Bump deps 2023-09-17 09:36:46 -07:00
2f2c6e6295 Rename dead wheel params 2023-09-15 18:45:36 -07:00
dc1b43e13f Remove mecanum forward kinematics rotation factors 2023-09-10 09:12:50 -07:00
b12b121447 Add more unit comments 2023-09-10 09:12:35 -07:00
5cd5bdd04e Finish tank params refactor 2023-09-10 09:12:20 -07:00
a4747329ef Merge FTC SDK v9.0 2023-09-10 07:47:04 -07:00
76e5f3e4b9 Bump RR FTC 2023-09-06 20:34:57 -07:00
1486131931 Bump RR FTC 2023-09-05 07:17:22 -07:00
7c43265fd6 Bump RR FTC 2023-09-03 21:54:12 -07:00
5c0fd0b519 Update README.md 2023-09-03 21:49:49 -07:00
e002d64689 Use RR FTC module 2023-08-26 10:48:26 -07:00
9fa28159da Remove tuning web files 2023-08-26 10:09:43 -07:00
3fc724ee39 Merge SDK v8.2 2023-08-19 19:30:45 -07:00
9db94c089d Move quickstart files to separate module 2023-08-19 19:29:18 -07:00
0c14630d40 Bump RR 2023-07-19 23:13:58 -07:00
1ae2be3dcf Add localization test
Also shorten some longer method names.
2023-05-20 22:59:50 -07:00
491bb42123 Fix tick units 2023-04-05 22:22:16 -07:00
5fbb278d09 Add link to quickstart2 docs (fixes #217) 2023-03-27 22:25:28 -07:00
1c857f2f95 Bump dash 2023-03-10 01:22:48 -08:00
db869f3e01 Adapt regression to signals 2023-03-06 22:50:07 -08:00
332704b536 Record signals on Java side 2023-03-05 21:03:45 -08:00
47151cbebd Correct velocity in ManualFeedforwardTuner 2023-03-05 18:32:16 -08:00
98cbe1e950 Fix drive powers 2023-03-02 00:23:39 -08:00
80ea431b29 Regression improvements 2023-03-02 00:16:35 -08:00
9651735df3 No intercept regression 2023-03-01 23:32:02 -08:00
b4af81739a Use IMU heading in mecanum drive localizer (fixes #215) 2023-03-01 22:17:29 -08:00
77dbd7f5e2 Move logging stuff 2023-02-02 19:49:41 -08:00
b617f87bac Add logging behind the scenes 2023-02-02 01:59:44 -08:00
8a88605057 Add setDirection() to Encoder interface (fixes #203) 2023-02-02 01:59:24 -08:00
970625f1c0 Migrate to IMU interface 2023-01-19 09:29:24 -08:00
b38e9e6153 Bump dash 2023-01-05 18:04:04 -08:00
4fc5d6465b Bump RR 2023-01-02 16:59:15 -08:00
d0b06e2d59 Update for new actions 2023-01-02 00:44:13 -08:00
2d9c57a373 Fix action drawing 2022-12-13 23:18:46 -08:00
45d77b8b35 Bump dash 2022-12-07 22:12:43 -08:00
4a3417ca49 Merge SDK v8.1.1 2022-12-07 22:11:56 -08:00
1b7f3ca0d0 Fix DriveView setDrivePowers() 2022-12-06 21:16:31 -08:00
7c9fd6c18a Hopefully fix track width regression 2022-12-06 21:15:01 -08:00
b44d5e0ca5 Fix CW turns 2022-12-05 22:00:23 -08:00
6f6a0c9771 Merge SDK v8.1 2022-11-25 10:13:20 -08:00
b7211333e6 Make Data fields in Logger OpModes public (#179) 2022-11-01 19:22:06 -07:00
57cc92bc47 Remove AccelLogger from TuningOpModes (#177) 2022-10-30 14:08:01 -07:00
05056c07b8 Bump dash 2022-09-28 22:33:54 -07:00
67b0165c32 Add initial quickstart files 2022-09-26 00:04:23 -07:00
41 changed files with 3582 additions and 2316 deletions

.github/ISSUE_TEMPLATE/bug.yml

@@ -0,0 +1,33 @@
name: Bug Report
description: File a bug report
body:
- type: input
id: version
attributes:
label: RR FTC Version
description: Open `TeamCode/build.gradle` and read the version from the
`implementation "com.acmerobotics.roadrunner:ftc:LIBRARY_VERSION_HERE"`
line
placeholder: 0.1.8
validations:
required: true
- type: textarea
id: repro
attributes:
label: Observed Behavior
validations:
required: true
- type: textarea
id: tuning
attributes:
label: Tuning Files
description: Click the "Download" button on the tuning page and attach the file
(if applicable)
- type: textarea
id: logs
attributes:
label: Robot Logs
description: Download the relevant logs from `http://192.168.43.1:8080/logs` and
attach them here

.gitignore

@@ -78,4 +78,7 @@ lint/intermediates/
lint/generated/
lint/outputs/
lint/tmp/
# lint/reports/
# lint/reports/
.DS_Store
/.idea

AndroidManifest.xml

@@ -1,8 +1,8 @@
<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:tools="http://schemas.android.com/tools"
android:versionCode="52"
android:versionName="9.0.1">
android:versionCode="54"
android:versionName="9.2">
<uses-permission android:name="android.permission.RECEIVE_BOOT_COMPLETED" />

View File

@@ -29,6 +29,7 @@
package org.firstinspires.ftc.robotcontroller.external.samples;
import android.util.Size;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;

ConceptAprilTagMultiPortal.java

@@ -0,0 +1,104 @@
/* Copyright (c) 2024 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import org.firstinspires.ftc.robotcore.external.hardware.camera.BuiltinCameraDirection;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.firstinspires.ftc.vision.VisionPortal;
import org.firstinspires.ftc.vision.apriltag.AprilTagProcessor;
/**
* This OpMode demonstrates the basics of using multiple vision portals simultaneously
*/
@TeleOp(name = "Concept: AprilTagMultiPortal", group = "Concept")
@Disabled
public class ConceptAprilTagMultiPortal extends LinearOpMode
{
VisionPortal portal1;
VisionPortal portal2;
AprilTagProcessor aprilTagProcessor1;
AprilTagProcessor aprilTagProcessor2;
@Override
public void runOpMode() throws InterruptedException
{
// Because we want to show two camera feeds simultaneously, we need to inform
// the SDK that we want it to split the camera monitor area into two smaller
// areas for us. It will then give us View IDs which we can pass to the individual
// vision portals to allow them to properly hook into the UI in tandem.
int[] viewIds = VisionPortal.makeMultiPortalView(2, VisionPortal.MultiPortalLayout.VERTICAL);
// We extract the two view IDs from the array to make our lives a little easier later.
// NB: the array is 2 long because we asked for 2 portals up above.
int portal1ViewId = viewIds[0];
int portal2ViewId = viewIds[1];
// If we want to run AprilTag detection on two portals simultaneously,
// we need to create two distinct instances of the AprilTag processor,
// one for each portal. If you want to see more detail about different
// options that you have when creating these processors, go check out
// the ConceptAprilTag OpMode.
aprilTagProcessor1 = AprilTagProcessor.easyCreateWithDefaults();
aprilTagProcessor2 = AprilTagProcessor.easyCreateWithDefaults();
// Now we build both portals. The CRITICAL thing to notice here is the call to
// setLiveViewContainerId(), where we pass in the IDs we received earlier from
// makeMultiPortalView().
portal1 = new VisionPortal.Builder()
.setCamera(hardwareMap.get(WebcamName.class, "Webcam 1"))
.setLiveViewContainerId(portal1ViewId)
.addProcessor(aprilTagProcessor1)
.build();
portal2 = new VisionPortal.Builder()
.setCamera(hardwareMap.get(WebcamName.class, "Webcam 2"))
.setLiveViewContainerId(portal2ViewId)
.addProcessor(aprilTagProcessor2)
.build();
waitForStart();
// Main Loop
while (opModeIsActive())
{
// Just show some basic telemetry to demonstrate both processors are working in parallel
// on their respective cameras. If you want to see more detail about the information you
// can get back from the processor, you should look at ConceptAprilTag.
telemetry.addData("Number of tags in Camera 1", aprilTagProcessor1.getDetections().size());
telemetry.addData("Number of tags in Camera 2", aprilTagProcessor2.getDetections().size());
telemetry.update();
sleep(20);
}
}
}

ConceptDoubleVision.java

@@ -1,202 +0,0 @@
/* Copyright (c) 2023 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import org.firstinspires.ftc.robotcore.external.hardware.camera.BuiltinCameraDirection;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.firstinspires.ftc.robotcore.external.tfod.Recognition;
import org.firstinspires.ftc.vision.VisionPortal;
import org.firstinspires.ftc.vision.apriltag.AprilTagDetection;
import org.firstinspires.ftc.vision.apriltag.AprilTagProcessor;
import org.firstinspires.ftc.vision.tfod.TfodProcessor;
import java.util.List;
/*
* This OpMode illustrates the basics of using both AprilTag recognition and TensorFlow
* Object Detection.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*/
@TeleOp(name = "Concept: Double Vision", group = "Concept")
@Disabled
public class ConceptDoubleVision extends LinearOpMode {
private static final boolean USE_WEBCAM = true; // true for webcam, false for phone camera
/**
* The variable to store our instance of the AprilTag processor.
*/
private AprilTagProcessor aprilTag;
/**
* The variable to store our instance of the TensorFlow Object Detection processor.
*/
private TfodProcessor tfod;
/**
* The variable to store our instance of the vision portal.
*/
private VisionPortal myVisionPortal;
@Override
public void runOpMode() {
initDoubleVision();
// This OpMode loops continuously, allowing the user to switch between
// AprilTag and TensorFlow Object Detection (TFOD) image processors.
while (!isStopRequested()) {
if (opModeInInit()) {
telemetry.addData("DS preview on/off","3 dots, Camera Stream");
telemetry.addLine();
telemetry.addLine("----------------------------------------");
}
if (myVisionPortal.getProcessorEnabled(aprilTag)) {
// User instructions: Dpad left or Dpad right.
telemetry.addLine("Dpad Left to disable AprilTag");
telemetry.addLine();
telemetryAprilTag();
} else {
telemetry.addLine("Dpad Right to enable AprilTag");
}
telemetry.addLine();
telemetry.addLine("----------------------------------------");
if (myVisionPortal.getProcessorEnabled(tfod)) {
telemetry.addLine("Dpad Down to disable TFOD");
telemetry.addLine();
telemetryTfod();
} else {
telemetry.addLine("Dpad Up to enable TFOD");
}
// Push telemetry to the Driver Station.
telemetry.update();
if (gamepad1.dpad_left) {
myVisionPortal.setProcessorEnabled(aprilTag, false);
} else if (gamepad1.dpad_right) {
myVisionPortal.setProcessorEnabled(aprilTag, true);
}
if (gamepad1.dpad_down) {
myVisionPortal.setProcessorEnabled(tfod, false);
} else if (gamepad1.dpad_up) {
myVisionPortal.setProcessorEnabled(tfod, true);
}
sleep(20);
} // end while loop
} // end method runOpMode()
/**
* Initialize AprilTag and TFOD.
*/
private void initDoubleVision() {
// -----------------------------------------------------------------------------------------
// AprilTag Configuration
// -----------------------------------------------------------------------------------------
aprilTag = new AprilTagProcessor.Builder()
.build();
// -----------------------------------------------------------------------------------------
// TFOD Configuration
// -----------------------------------------------------------------------------------------
tfod = new TfodProcessor.Builder()
.build();
// -----------------------------------------------------------------------------------------
// Camera Configuration
// -----------------------------------------------------------------------------------------
if (USE_WEBCAM) {
myVisionPortal = new VisionPortal.Builder()
.setCamera(hardwareMap.get(WebcamName.class, "Webcam 1"))
.addProcessors(tfod, aprilTag)
.build();
} else {
myVisionPortal = new VisionPortal.Builder()
.setCamera(BuiltinCameraDirection.BACK)
.addProcessors(tfod, aprilTag)
.build();
}
} // end initDoubleVision()
/**
* Add telemetry about AprilTag detections.
*/
private void telemetryAprilTag() {
List<AprilTagDetection> currentDetections = aprilTag.getDetections();
telemetry.addData("# AprilTags Detected", currentDetections.size());
// Step through the list of detections and display info for each one.
for (AprilTagDetection detection : currentDetections) {
if (detection.metadata != null) {
telemetry.addLine(String.format("\n==== (ID %d) %s", detection.id, detection.metadata.name));
telemetry.addLine(String.format("XYZ %6.1f %6.1f %6.1f (inch)", detection.ftcPose.x, detection.ftcPose.y, detection.ftcPose.z));
telemetry.addLine(String.format("PRY %6.1f %6.1f %6.1f (deg)", detection.ftcPose.pitch, detection.ftcPose.roll, detection.ftcPose.yaw));
telemetry.addLine(String.format("RBE %6.1f %6.1f %6.1f (inch, deg, deg)", detection.ftcPose.range, detection.ftcPose.bearing, detection.ftcPose.elevation));
} else {
telemetry.addLine(String.format("\n==== (ID %d) Unknown", detection.id));
telemetry.addLine(String.format("Center %6.0f %6.0f (pixels)", detection.center.x, detection.center.y));
}
} // end for() loop
} // end method telemetryAprilTag()
/**
* Add telemetry about TensorFlow Object Detection (TFOD) recognitions.
*/
private void telemetryTfod() {
List<Recognition> currentRecognitions = tfod.getRecognitions();
telemetry.addData("# Objects Detected", currentRecognitions.size());
// Step through the list of recognitions and display info for each one.
for (Recognition recognition : currentRecognitions) {
double x = (recognition.getLeft() + recognition.getRight()) / 2 ;
double y = (recognition.getTop() + recognition.getBottom()) / 2 ;
telemetry.addData(""," ");
telemetry.addData("Image", "%s (%.0f %% Conf.)", recognition.getLabel(), recognition.getConfidence() * 100);
telemetry.addData("- Position", "%.0f / %.0f", x, y);
telemetry.addData("- Size", "%.0f x %.0f", recognition.getWidth(), recognition.getHeight());
} // end for() loop
} // end method telemetryTfod()
} // end class

ConceptTensorFlowObjectDetection.java

@@ -1,198 +0,0 @@
/* Copyright (c) 2019 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import org.firstinspires.ftc.robotcore.external.hardware.camera.BuiltinCameraDirection;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.firstinspires.ftc.robotcore.external.tfod.Recognition;
import org.firstinspires.ftc.vision.VisionPortal;
import org.firstinspires.ftc.vision.tfod.TfodProcessor;
import java.util.List;
/*
* This OpMode illustrates the basics of TensorFlow Object Detection,
* including Java Builder structures for specifying Vision parameters.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*/
@TeleOp(name = "Concept: TensorFlow Object Detection", group = "Concept")
@Disabled
public class ConceptTensorFlowObjectDetection extends LinearOpMode {
private static final boolean USE_WEBCAM = true; // true for webcam, false for phone camera
// TFOD_MODEL_ASSET points to a model file stored in the project Asset location,
// this is only used for Android Studio when using models in Assets.
private static final String TFOD_MODEL_ASSET = "MyModelStoredAsAsset.tflite";
// TFOD_MODEL_FILE points to a model file stored onboard the Robot Controller's storage,
// this is used when uploading models directly to the RC using the model upload interface.
private static final String TFOD_MODEL_FILE = "/sdcard/FIRST/tflitemodels/myCustomModel.tflite";
// Define the labels recognized in the model for TFOD (must be in training order!)
private static final String[] LABELS = {
"Pixel",
};
/**
* The variable to store our instance of the TensorFlow Object Detection processor.
*/
private TfodProcessor tfod;
/**
* The variable to store our instance of the vision portal.
*/
private VisionPortal visionPortal;
@Override
public void runOpMode() {
initTfod();
// Wait for the DS start button to be touched.
telemetry.addData("DS preview on/off", "3 dots, Camera Stream");
telemetry.addData(">", "Touch Play to start OpMode");
telemetry.update();
waitForStart();
if (opModeIsActive()) {
while (opModeIsActive()) {
telemetryTfod();
// Push telemetry to the Driver Station.
telemetry.update();
// Save CPU resources; can resume streaming when needed.
if (gamepad1.dpad_down) {
visionPortal.stopStreaming();
} else if (gamepad1.dpad_up) {
visionPortal.resumeStreaming();
}
// Share the CPU.
sleep(20);
}
}
// Save more CPU resources when camera is no longer needed.
visionPortal.close();
} // end runOpMode()
/**
* Initialize the TensorFlow Object Detection processor.
*/
private void initTfod() {
// Create the TensorFlow processor by using a builder.
tfod = new TfodProcessor.Builder()
// With the following lines commented out, the default TfodProcessor Builder
// will load the default model for the season. To define a custom model to load,
// choose one of the following:
// Use setModelAssetName() if the custom TF Model is built in as an asset (AS only).
// Use setModelFileName() if you have downloaded a custom team model to the Robot Controller.
//.setModelAssetName(TFOD_MODEL_ASSET)
//.setModelFileName(TFOD_MODEL_FILE)
// The following default settings are available to un-comment and edit as needed to
// set parameters for custom models.
//.setModelLabels(LABELS)
//.setIsModelTensorFlow2(true)
//.setIsModelQuantized(true)
//.setModelInputSize(300)
//.setModelAspectRatio(16.0 / 9.0)
.build();
// Create the vision portal by using a builder.
VisionPortal.Builder builder = new VisionPortal.Builder();
// Set the camera (webcam vs. built-in RC phone camera).
if (USE_WEBCAM) {
builder.setCamera(hardwareMap.get(WebcamName.class, "Webcam 1"));
} else {
builder.setCamera(BuiltinCameraDirection.BACK);
}
// Choose a camera resolution. Not all cameras support all resolutions.
//builder.setCameraResolution(new Size(640, 480));
// Enable the RC preview (LiveView). Set "false" to omit camera monitoring.
//builder.enableLiveView(true);
// Set the stream format; MJPEG uses less bandwidth than default YUY2.
//builder.setStreamFormat(VisionPortal.StreamFormat.YUY2);
// Choose whether or not LiveView stops if no processors are enabled.
// If set "true", monitor shows solid orange screen if no processors enabled.
// If set "false", monitor shows camera view without annotations.
//builder.setAutoStopLiveView(false);
// Set and enable the processor.
builder.addProcessor(tfod);
// Build the Vision Portal, using the above settings.
visionPortal = builder.build();
// Set confidence threshold for TFOD recognitions, at any time.
//tfod.setMinResultConfidence(0.75f);
// Disable or re-enable the TFOD processor at any time.
//visionPortal.setProcessorEnabled(tfod, true);
} // end method initTfod()
/**
* Add telemetry about TensorFlow Object Detection (TFOD) recognitions.
*/
private void telemetryTfod() {
List<Recognition> currentRecognitions = tfod.getRecognitions();
telemetry.addData("# Objects Detected", currentRecognitions.size());
// Step through the list of recognitions and display info for each one.
for (Recognition recognition : currentRecognitions) {
double x = (recognition.getLeft() + recognition.getRight()) / 2 ;
double y = (recognition.getTop() + recognition.getBottom()) / 2 ;
telemetry.addData(""," ");
telemetry.addData("Image", "%s (%.0f %% Conf.)", recognition.getLabel(), recognition.getConfidence() * 100);
telemetry.addData("- Position", "%.0f / %.0f", x, y);
telemetry.addData("- Size", "%.0f x %.0f", recognition.getWidth(), recognition.getHeight());
} // end for() loop
} // end method telemetryTfod()
} // end class

ConceptTensorFlowObjectDetectionEasy.java

@@ -1,142 +0,0 @@
/* Copyright (c) 2019 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import org.firstinspires.ftc.robotcore.external.hardware.camera.BuiltinCameraDirection;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.firstinspires.ftc.robotcore.external.tfod.Recognition;
import org.firstinspires.ftc.vision.VisionPortal;
import org.firstinspires.ftc.vision.tfod.TfodProcessor;
import java.util.List;
/*
* This OpMode illustrates the basics of TensorFlow Object Detection, using
* the easiest way.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*/
@TeleOp(name = "Concept: TensorFlow Object Detection Easy", group = "Concept")
@Disabled
public class ConceptTensorFlowObjectDetectionEasy extends LinearOpMode {
private static final boolean USE_WEBCAM = true; // true for webcam, false for phone camera
/**
* The variable to store our instance of the TensorFlow Object Detection processor.
*/
private TfodProcessor tfod;
/**
* The variable to store our instance of the vision portal.
*/
private VisionPortal visionPortal;
@Override
public void runOpMode() {
initTfod();
// Wait for the DS start button to be touched.
telemetry.addData("DS preview on/off", "3 dots, Camera Stream");
telemetry.addData(">", "Touch Play to start OpMode");
telemetry.update();
waitForStart();
if (opModeIsActive()) {
while (opModeIsActive()) {
telemetryTfod();
// Push telemetry to the Driver Station.
telemetry.update();
// Save CPU resources; can resume streaming when needed.
if (gamepad1.dpad_down) {
visionPortal.stopStreaming();
} else if (gamepad1.dpad_up) {
visionPortal.resumeStreaming();
}
// Share the CPU.
sleep(20);
}
}
// Save more CPU resources when camera is no longer needed.
visionPortal.close();
} // end runOpMode()
/**
* Initialize the TensorFlow Object Detection processor.
*/
private void initTfod() {
// Create the TensorFlow processor the easy way.
tfod = TfodProcessor.easyCreateWithDefaults();
// Create the vision portal the easy way.
if (USE_WEBCAM) {
visionPortal = VisionPortal.easyCreateWithDefaults(
hardwareMap.get(WebcamName.class, "Webcam 1"), tfod);
} else {
visionPortal = VisionPortal.easyCreateWithDefaults(
BuiltinCameraDirection.BACK, tfod);
}
} // end method initTfod()
/**
* Add telemetry about TensorFlow Object Detection (TFOD) recognitions.
*/
private void telemetryTfod() {
List<Recognition> currentRecognitions = tfod.getRecognitions();
telemetry.addData("# Objects Detected", currentRecognitions.size());
// Step through the list of recognitions and display info for each one.
for (Recognition recognition : currentRecognitions) {
double x = (recognition.getLeft() + recognition.getRight()) / 2 ;
double y = (recognition.getTop() + recognition.getBottom()) / 2 ;
telemetry.addData(""," ");
telemetry.addData("Image", "%s (%.0f %% Conf.)", recognition.getLabel(), recognition.getConfidence() * 100);
telemetry.addData("- Position", "%.0f / %.0f", x, y);
telemetry.addData("- Size", "%.0f x %.0f", recognition.getWidth(), recognition.getHeight());
} // end for() loop
} // end method telemetryTfod()
} // end class

ConceptTensorFlowObjectDetectionSwitchableCameras.java

@@ -1,186 +0,0 @@
/* Copyright (c) 2020 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import org.firstinspires.ftc.robotcore.external.ClassFactory;
import org.firstinspires.ftc.robotcore.external.hardware.camera.CameraName;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.firstinspires.ftc.robotcore.external.tfod.Recognition;
import org.firstinspires.ftc.vision.VisionPortal;
import org.firstinspires.ftc.vision.VisionPortal.CameraState;
import org.firstinspires.ftc.vision.tfod.TfodProcessor;
import java.util.List;
/*
* This OpMode illustrates the basics of TensorFlow Object Detection, using
* two webcams.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*/
@TeleOp(name = "Concept: TensorFlow Object Detection Switchable Cameras", group = "Concept")
@Disabled
public class ConceptTensorFlowObjectDetectionSwitchableCameras extends LinearOpMode {
/**
* Variables used for switching cameras.
*/
private WebcamName webcam1, webcam2;
private boolean oldLeftBumper;
private boolean oldRightBumper;
/**
* The variable to store our instance of the TensorFlow Object Detection processor.
*/
private TfodProcessor tfod;
/**
* The variable to store our instance of the vision portal.
*/
private VisionPortal visionPortal;
@Override
public void runOpMode() {
initTfod();
// Wait for the DS start button to be touched.
telemetry.addData("DS preview on/off", "3 dots, Camera Stream");
telemetry.addData(">", "Touch Play to start OpMode");
telemetry.update();
waitForStart();
if (opModeIsActive()) {
while (opModeIsActive()) {
telemetryCameraSwitching();
telemetryTfod();
// Push telemetry to the Driver Station.
telemetry.update();
// Save CPU resources; can resume streaming when needed.
if (gamepad1.dpad_down) {
visionPortal.stopStreaming();
} else if (gamepad1.dpad_up) {
visionPortal.resumeStreaming();
}
doCameraSwitching();
// Share the CPU.
sleep(20);
}
}
// Save more CPU resources when camera is no longer needed.
visionPortal.close();
} // end runOpMode()
/**
* Initialize the TensorFlow Object Detection processor.
*/
private void initTfod() {
// Create the TensorFlow processor by using a builder.
tfod = new TfodProcessor.Builder().build();
webcam1 = hardwareMap.get(WebcamName.class, "Webcam 1");
webcam2 = hardwareMap.get(WebcamName.class, "Webcam 2");
CameraName switchableCamera = ClassFactory.getInstance()
.getCameraManager().nameForSwitchableCamera(webcam1, webcam2);
// Create the vision portal by using a builder.
visionPortal = new VisionPortal.Builder()
.setCamera(switchableCamera)
.addProcessor(tfod)
.build();
} // end method initTfod()
/**
* Add telemetry about camera switching.
*/
private void telemetryCameraSwitching() {
if (visionPortal.getActiveCamera().equals(webcam1)) {
telemetry.addData("activeCamera", "Webcam 1");
telemetry.addData("Press RightBumper", "to switch to Webcam 2");
} else {
telemetry.addData("activeCamera", "Webcam 2");
telemetry.addData("Press LeftBumper", "to switch to Webcam 1");
}
} // end method telemetryCameraSwitching()
/**
* Add telemetry about TensorFlow Object Detection (TFOD) recognitions.
*/
private void telemetryTfod() {
List<Recognition> currentRecognitions = tfod.getRecognitions();
telemetry.addData("# Objects Detected", currentRecognitions.size());
// Step through the list of recognitions and display info for each one.
for (Recognition recognition : currentRecognitions) {
double x = (recognition.getLeft() + recognition.getRight()) / 2 ;
double y = (recognition.getTop() + recognition.getBottom()) / 2 ;
telemetry.addData(""," ");
telemetry.addData("Image", "%s (%.0f %% Conf.)", recognition.getLabel(), recognition.getConfidence() * 100);
telemetry.addData("- Position", "%.0f / %.0f", x, y);
telemetry.addData("- Size", "%.0f x %.0f", recognition.getWidth(), recognition.getHeight());
} // end for() loop
} // end method telemetryTfod()
/**
* Set the active camera according to input from the gamepad.
*/
private void doCameraSwitching() {
if (visionPortal.getCameraState() == CameraState.STREAMING) {
// If the left bumper is pressed, use Webcam 1.
// If the right bumper is pressed, use Webcam 2.
boolean newLeftBumper = gamepad1.left_bumper;
boolean newRightBumper = gamepad1.right_bumper;
if (newLeftBumper && !oldLeftBumper) {
visionPortal.setActiveCamera(webcam1);
} else if (newRightBumper && !oldRightBumper) {
visionPortal.setActiveCamera(webcam2);
}
oldLeftBumper = newLeftBumper;
oldRightBumper = newRightBumper;
}
} // end method doCameraSwitching()
} // end class

SensorDigitalTouch.java

@@ -0,0 +1,78 @@
/* Copyright (c) 2024 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.DigitalChannel;
/*
* This OpMode demonstrates how to use a digital channel.
*
* The OpMode assumes that the digital channel is configured with a name of "digitalTouch".
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*/
@TeleOp(name = "Sensor: digital channel", group = "Sensor")
@Disabled
public class SensorDigitalTouch extends LinearOpMode {
DigitalChannel digitalTouch; // Digital channel Object
@Override
public void runOpMode() {
// get a reference to our touchSensor object.
digitalTouch = hardwareMap.get(DigitalChannel.class, "digitalTouch");
digitalTouch.setMode(DigitalChannel.Mode.INPUT);
telemetry.addData("DigitalTouchSensorExample", "Press start to continue...");
telemetry.update();
// wait for the start button to be pressed.
waitForStart();
// while the OpMode is active, loop and read the digital channel.
// Note we use opModeIsActive() as our loop condition because it is an interruptible method.
while (opModeIsActive()) {
// button is pressed if value returned is LOW or false.
// send the info back to driver station using telemetry function.
if (digitalTouch.getState() == false) {
telemetry.addData("Button", "PRESSED");
} else {
telemetry.addData("Button", "NOT PRESSED");
}
telemetry.update();
}
}
}

SensorOctoQuad.java

@@ -0,0 +1,141 @@
/*
* Copyright (c) 2024 DigitalChickenLabs
*
* Permission is hereby granted, free of charge, to any person obtaining a copy
* of this software and associated documentation files (the "Software"), to deal
* in the Software without restriction, including without limitation the rights
* to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
* copies of the Software, and to permit persons to whom the Software is
* furnished to do so, subject to the following conditions:
*
* The above copyright notice and this permission notice shall be included in all
* copies or substantial portions of the Software.
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
* IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
* FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
* AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
* LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
* OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
* SOFTWARE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.hardware.digitalchickenlabs.OctoQuad;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import org.firstinspires.ftc.robotcore.external.Telemetry;
/*
* This OpMode illustrates how to use the DigitalChickenLabs OctoQuad Quadrature Encoder & Pulse Width Interface Module
*
* The OctoQuad has 8 input channels that can be used to read either Relative Quadrature Encoders or Pulse-Width Absolute Encoder inputs.
* Relative Quadrature encoders are found on most FTC motors, and some stand-alone position sensors like the REV Thru-Bore encoder.
* Pulse-Width encoders are less common. The REV Thru-Bore encoder can provide its absolute position via a variable pulse width,
* as can several sonar rangefinders such as the MaxBotix MB1000 series.
*
* This basic sample shows how an OctoQuad can be used to read the positions of three Odometry pods fitted
* with REV Thru-Bore encoders. For a more advanced example showing additional OctoQuad capabilities, see the SensorOctoQuadAdv sample.
*
* This OpMode assumes that the OctoQuad is attached to an I2C interface named "octoquad" in the robot configuration.
*
* The code assumes the first three OctoQuad inputs are connected as follows
* - Chan 0: for measuring forward motion on the left side of the robot.
* - Chan 1: for measuring forward motion on the right side of the robot.
* - Chan 2: for measuring Lateral (strafing) motion.
*
* The encoder values may be reset to zero by pressing the X (left most) button on Gamepad 1.
*
* This sample does not show how to interpret these readings, just how to obtain and display them.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*
* See the sensor's product page: https://www.tindie.com/products/digitalchickenlabs/octoquad-8ch-quadrature-pulse-width-decoder/
*/
@TeleOp(name = "OctoQuad Basic", group="OctoQuad")
@Disabled
public class SensorOctoQuad extends LinearOpMode {
// Identify which encoder OctoQuad inputs are connected to each odometry pod.
private final int ODO_LEFT = 0; // Facing forward direction on left side of robot (Axial motion)
private final int ODO_RIGHT = 1; // Facing forward direction on right side of robot (Axial motion)
private final int ODO_PERP = 2; // Facing perpendicular direction at the center of the robot (Lateral motion)
// Declare the OctoQuad object and members to store encoder positions and velocities
private OctoQuad octoquad;
private int posLeft;
private int posRight;
private int posPerp;
/**
* This function is executed when this OpMode is selected from the Driver Station.
*/
@Override
public void runOpMode() {
// Connect to OctoQuad by referring to its name in the Robot Configuration.
octoquad = hardwareMap.get(OctoQuad.class, "octoquad");
// Read the Firmware Revision number from the OctoQuad and display it as telemetry.
telemetry.addData("OctoQuad Firmware Version ", octoquad.getFirmwareVersion());
// Reverse the count-direction of any encoder that is not what you require.
// Eg: if you push the robot forward and the left encoder counts down, then reverse it so it counts up.
octoquad.setSingleEncoderDirection(ODO_LEFT, OctoQuad.EncoderDirection.REVERSE);
octoquad.setSingleEncoderDirection(ODO_RIGHT, OctoQuad.EncoderDirection.FORWARD);
octoquad.setSingleEncoderDirection(ODO_PERP, OctoQuad.EncoderDirection.FORWARD);
// Any changes that are made should be saved in FLASH just in case there is a sensor power glitch.
octoquad.saveParametersToFlash();
telemetry.addLine("\nPress Play to read encoder values");
telemetry.update();
waitForStart();
// Configure the telemetry for optimal display of data.
telemetry.setDisplayFormat(Telemetry.DisplayFormat.MONOSPACE);
telemetry.setMsTransmissionInterval(50);
// Set all the encoder inputs to zero
octoquad.resetAllPositions();
// Loop while displaying the odometry pod positions.
while (opModeIsActive()) {
telemetry.addData(">", "Press X to Reset Encoders\n");
// Check for X button to reset encoders
if (gamepad1.x) {
// Reset the position of all encoders to zero.
octoquad.resetAllPositions();
}
// Read all the encoder data. Load into local members
readOdometryPods();
// Display the values
telemetry.addData("Left ", "%8d counts", posLeft);
telemetry.addData("Right", "%8d counts", posRight);
telemetry.addData("Perp ", "%8d counts", posPerp);
telemetry.update();
}
}
private void readOdometryPods() {
// For best performance, we should only perform ONE transaction with the OctoQuad each cycle.
// Since this example only needs to read positions from a few channels, we could use either
// readPositionRange(idxFirst, idxLast) to get a select number of sequential channels
// or
// readAllPositions() to get all 8 encoder readings
//
// Since both calls take almost the same amount of time, and the actual channels may not end up being sequential,
// we will read all of the encoder positions, and then pick out the ones we need.
int[] positions = octoquad.readAllPositions();
posLeft = positions[ODO_LEFT];
posRight = positions[ODO_RIGHT];
posPerp = positions[ODO_PERP];
}
}

SensorOctoQuadAdv.java

@@ -0,0 +1,278 @@
/*
* Copyright (c) 2024 DigitalChickenLabs
*
* Permission is hereby granted, free of charge, to any person obtaining a copy
* of this software and associated documentation files (the "Software"), to deal
* in the Software without restriction, including without limitation the rights
* to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
* copies of the Software, and to permit persons to whom the Software is
* furnished to do so, subject to the following conditions:
*
* The above copyright notice and this permission notice shall be included in all
* copies or substantial portions of the Software.
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
* IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
* FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
* AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
* LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
* OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
* SOFTWARE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.hardware.digitalchickenlabs.OctoQuad;
import com.qualcomm.hardware.digitalchickenlabs.OctoQuadBase;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.util.ElapsedTime;
import com.qualcomm.robotcore.util.MovingStatistics;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import org.firstinspires.ftc.robotcore.external.Telemetry;
import org.firstinspires.ftc.robotcore.external.navigation.AngleUnit;
import java.util.ArrayList;
import java.util.List;
/*
* This OpMode illustrates how to use advanced features of the DigitalChickenLabs OctoQuad Quadrature Encoder & Pulse Width Interface Module
*
* The OctoQuad has 8 input channels that can be used to read either Quadrature Encoder signals (like with most FTC motors)
* or Pulse-Width style Absolute Encoder inputs (eg: REV Through Bore encoder pulse width output).
*
* This OpMode illustrates several of the more advanced features of an OctoQuad, including Pulse Width measurement and velocity measurement.
* For a more basic OpMode just showing how to read encoder inputs, see the SensorOctoQuad sample.
*
* This OpMode assumes that the OctoQuad is attached to an I2C interface named "octoquad" in the robot configuration.
*
* One system that requires a lot of encoder inputs is a Swerve Drive system.
* Such a drive requires two encoders per drive module: one for position & velocity of the Drive motor/wheel, and one for position/angle of the Steering motor.
* The Drive motor usually requires a quadrature encoder, and the Steering motor requires an Absolute encoder.
* This quantity and combination of encoder inputs is a challenge, but a single OctoQuad could be used to read them all.
*
* This sample will configure an OctoQuad for a swerve drive, as follows
* - Configure 4 Relative Quadrature Encoder inputs and 4 Absolute Pulse-Width Encoder inputs
* - Configure a velocity sample interval of 25 mSec
* - Configure a pulse width input range of 1-1024 uSec for a REV Through Bore Encoder's Absolute Pulse output.
*
* An OctoSwerveDrive class will be created to initialize the OctoQuad, and manage the 4 swerve modules.
* An OctoSwerveModule class will be created to configure and read a single swerve module.
*
* Wiring:
* The OctoQuad will be configured to accept Quadrature encoders on the first four channels and Absolute (pulse width) encoders on the last four channels.
*
* The standard 4-pin to 4-pin cable can be used to connect each Driver Motor encoder to the OctoQuad. (channels 0-3)
*
* A modified version of the REV 6-4 pin cable (shipped with the encoder) connects the steering encoder to the OctoQuad. (channels 4-7)
* To connect the Absolute position signal from a REV Thru-Bore encoder to the OctoQuad, the Provided 6-pin to 4-pin cable will need to be modified.
* At the 6-pin connector end, move the yellow wire from its initial pin-3 position to the ABS pin-5 position. This can be done easily
* by using a small flathead screwdriver to lift the small white tab holding the metal wire crimp in place and gently pulling the wire out.
* See the OctoSwerveDrive() constructor below for the correct wheel/channel assignment.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*
* Note: If you prefer, you can move the two support classes from this file, and place them in their own files.
* But leaving them in place is simpler for this example.
*
* See the sensor's product page: https://www.tindie.com/products/digitalchickenlabs/octoquad-8ch-quadrature-pulse-width-decoder/
*/
@TeleOp(name="OctoQuad Advanced", group="OctoQuad")
@Disabled
public class SensorOctoQuadAdv extends LinearOpMode {
@Override
public void runOpMode() throws InterruptedException {
// Connect to the OctoQuad by looking up its name in the hardwareMap.
OctoQuad octoquad = hardwareMap.get(OctoQuad.class, "octoquad");
// Create the interface for the Swerve Drive Encoders
OctoSwerveDrive octoSwerveDrive = new OctoSwerveDrive(octoquad);
// Display the OctoQuad firmware revision
telemetry.addLine("OctoQuad Firmware v" + octoquad.getFirmwareVersion());
telemetry.addLine("\nPress Play to read encoder values");
telemetry.update();
waitForStart();
// Configure the telemetry for optimal display of data.
telemetry.setDisplayFormat(Telemetry.DisplayFormat.MONOSPACE);
telemetry.setMsTransmissionInterval(50);
// Run stats to determine cycle times.
MovingStatistics avgTime = new MovingStatistics(100);
ElapsedTime elapsedTime = new ElapsedTime();
while (opModeIsActive()) {
telemetry.addData(">", "Press X to Reset Encoders\n");
if(gamepad1.x) {
octoquad.resetAllPositions();
}
// read all the swerve drive encoders and update positions and velocities.
octoSwerveDrive.updateModules();
octoSwerveDrive.show(telemetry);
// Update cycle time stats
avgTime.add(elapsedTime.nanoseconds());
elapsedTime.reset();
telemetry.addData("Loop time", "%.1f mS", avgTime.getMean()/1000000);
telemetry.update();
}
}
}
// ============================ Internal (Inner) Classes =============================
/***
* OctoSwerveDrive class manages 4 Swerve Modules
* - Performs general OctoQuad initialization
* - Creates 4 module classes, one for each swerve module
* - Updates swerve drive status by reading all data from OctoQuad and Updating each module
* - Displays all swerve drive data as telemetry
*/
class OctoSwerveDrive {
private final OctoQuad octoquad;
private final List<OctoSwerveModule> allModules = new ArrayList<>();
// members to hold encoder data for each module.
public final OctoSwerveModule LeftFront;
public final OctoSwerveModule RightFront;
public final OctoSwerveModule LeftBack;
public final OctoSwerveModule RightBack;
// Prepare an object to hold an entire OctoQuad encoder readable register bank (pos and vel)
private final OctoQuad.EncoderDataBlock encoderDataBlock = new OctoQuad.EncoderDataBlock();
public OctoSwerveDrive(OctoQuad octoquad) {
this.octoquad = octoquad;
// Clear out all prior settings and encoder data before setting up desired configuration
octoquad.resetEverything();
// Assume first 4 channels are relative encoders and the next 4 are absolute encoders
octoquad.setChannelBankConfig(OctoQuad.ChannelBankConfig.BANK1_QUADRATURE_BANK2_PULSE_WIDTH);
// Create the four OctoSwerveModules, and add them to a 'list' for easier reference.
// Note: The wheel/channel order is set here. Ensure your encoder cables match.
//
// Note: The angleOffset must be set for each module so a forward facing wheel shows a steer angle of 0 degrees.
// The process for determining the correct angleOffsets is as follows:
// 1) Set all the angleValues (below) to zero then build and deploy the code.
// 2) Physically rotate all the swerve drives so the wheels are facing forward in the desired 0 degree position
// 3) Run the OpMode, view the telemetry and record the "Degrees" value for each module.
// 4) Update the code by entering the recorded Degrees value for each module as the angleOffset (last) parameter in the lines below.
//
// Rebuild and deploy the new code. Verify that the telemetry now indicates 0 degrees when the wheels are facing forward.
// Also verify that the correct module values change appropriately when you manually spin (drive) and rotate (steer) a wheel.
allModules.add(LeftFront = new OctoSwerveModule(octoquad, "LF ",0,0));// Drive = 0, Steer = 4
allModules.add(RightFront = new OctoSwerveModule(octoquad, "RF ",1,0));// Drive = 1, Steer = 5
allModules.add(LeftBack = new OctoSwerveModule(octoquad, "LB ",2,0));// Drive = 2, Steer = 6
allModules.add(RightBack = new OctoSwerveModule(octoquad, "RB ",3,0));// Drive = 3, Steer = 7
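// Hypothetical example: if step 3 above showed the LF module reading 17 degrees and the RF
// module reading -123 degrees, the updated lines would become:
//   allModules.add(LeftFront = new OctoSwerveModule(octoquad, "LF ",0,17));    // Drive = 0, Steer = 4
//   allModules.add(RightFront = new OctoSwerveModule(octoquad, "RF ",1,-123)); // Drive = 1, Steer = 5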
// now make sure the settings persist through any power glitches.
octoquad.saveParametersToFlash();
}
/**
* Updates all 4 swerve modules
*/
public void updateModules() {
// Read full OctoQuad data block and then pass DataBlock to each swerve module to update themselves.
octoquad.readAllEncoderData(encoderDataBlock);
for (OctoSwerveModule module : allModules) {
module.updateModule(encoderDataBlock);
}
}
/**
* Generate telemetry data for all modules.
* @param telemetry OpMode Telemetry object
*/
public void show(Telemetry telemetry) {
// create general header block and then have each module add its own telemetry
telemetry.addData("pos", " Count CPS Degree DPS");
for (OctoSwerveModule module : allModules) {
module.show(telemetry);
}
}
}
/***
* The OctoSwerveModule class manages a single swerve module
*/
class OctoSwerveModule {
public double driveCounts;
public double driveCountsPerSec;
public double steerDegrees;
public double steerDegreesPerSec;
private final String name;
private final int channel;
private final double angleOffset;
private final double steerDirMult;
private static final int VELOCITY_SAMPLE_INTERVAL_MS = 25; // To provide 40 updates per second.
private static final double DEGREES_PER_US = (360.0 / 1024.0); // based on REV Through Bore Encoder
private static final double VELOCITY_SAMPLES_PER_S = (1000.0 / VELOCITY_SAMPLE_INTERVAL_MS);
// The correct drive and turn directions must be set for the Swerve Module based on the specific hardware geometry.
// Forward motion must generate an increasing drive count.
// Counter Clockwise steer rotation must generate an increasing Steer Angle (degrees)
private static final boolean INVERT_DRIVE_ENCODER = false; // Set true if forward motion decreases drive "Count"
private static final boolean INVERT_STEER_ENCODER = false; // Set true if counter clockwise steer action decreases steer "Degree"
/***
* @param octoquad provide access to configure OctoQuad
* @param name name used for telemetry display
* @param quadChannel Quadrature encoder channel. Pulse Width channel is this + 4
* @param angleOffset Angle to subtract from absolute encoder to calibrate zero position. (see comments above)
*/
public OctoSwerveModule (OctoQuad octoquad, String name, int quadChannel, double angleOffset) {
this.name = name;
this.channel = quadChannel;
this.angleOffset = angleOffset;
this.steerDirMult = INVERT_STEER_ENCODER ? -1 : 1 ; // create a multiplier to flip the steer angle.
// Set the drive encoder direction. Note the absolute encoder does not have built-in direction inversion.
octoquad.setSingleEncoderDirection(channel, INVERT_DRIVE_ENCODER ? OctoQuad.EncoderDirection.REVERSE : OctoQuad.EncoderDirection.FORWARD);
// Set the velocity sample interval on both encoders
octoquad.setSingleVelocitySampleInterval(channel, VELOCITY_SAMPLE_INTERVAL_MS);
octoquad.setSingleVelocitySampleInterval(channel + 4, VELOCITY_SAMPLE_INTERVAL_MS);
// Set up the Absolute encoder pulse range to match the REV Through Bore encoder.
octoquad.setSingleChannelPulseWidthParams(channel + 4, new OctoQuad.ChannelPulseWidthParams(1,1024));
}
/***
* Calculate the Swerve module's position and velocity values
* @param encoderDataBlock most recent full data block read from OctoQuad.
*/
public void updateModule(OctoQuad.EncoderDataBlock encoderDataBlock) {
driveCounts = encoderDataBlock.positions[channel]; // get Counts.
driveCountsPerSec = encoderDataBlock.velocities[channel] * VELOCITY_SAMPLES_PER_S; // convert counts/interval to counts/sec
// convert uS to degrees. Add in any possible direction flip.
steerDegrees = AngleUnit.normalizeDegrees((encoderDataBlock.positions[channel+ 4] * DEGREES_PER_US * steerDirMult) - angleOffset);
// convert uS/interval to deg per sec. Add in any possible direction flip.
steerDegreesPerSec = encoderDataBlock.velocities[channel + 4] * DEGREES_PER_US * steerDirMult * VELOCITY_SAMPLES_PER_S;
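// Worked example (hypothetical reading): with the 25 mSec sample interval (40 samples/sec),
// a steer velocity of 10 uS per interval converts to 10 * (360/1024) * 40 = ~140.6 deg/sec.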
}
/**
* Display the Swerve module's state as telemetry
* @param telemetry OpMode Telemetry object
*/
public void show(Telemetry telemetry) {
telemetry.addData(name, "%8.0f %7.0f %7.0f %6.0f", driveCounts, driveCountsPerSec, steerDegrees, steerDegreesPerSec);
}
}

View File

@ -0,0 +1,156 @@
/*
SPDX-License-Identifier: MIT
Copyright (c) 2024 SparkFun Electronics
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.hardware.sparkfun.SparkFunOTOS;
import org.firstinspires.ftc.robotcore.external.navigation.AngleUnit;
import org.firstinspires.ftc.robotcore.external.navigation.DistanceUnit;
/*
* This OpMode illustrates how to use the SparkFun Qwiic Optical Tracking Odometry Sensor (OTOS)
*
* The OpMode assumes that the sensor is configured with a name of "sensor_otos".
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*
* See the sensor's product page: https://www.sparkfun.com/products/24904
*/
@TeleOp(name = "Sensor: SparkFun OTOS", group = "Sensor")
@Disabled
public class SensorSparkFunOTOS extends LinearOpMode {
// Create an instance of the sensor
SparkFunOTOS myOtos;
@Override
public void runOpMode() throws InterruptedException {
// Get a reference to the sensor
myOtos = hardwareMap.get(SparkFunOTOS.class, "sensor_otos");
// All the configuration for the OTOS is done in this helper method, check it out!
configureOtos();
// Wait for the start button to be pressed
waitForStart();
// Loop until the OpMode ends
while (opModeIsActive()) {
// Get the latest position, which includes the x and y coordinates, plus the
// heading angle
SparkFunOTOS.Pose2D pos = myOtos.getPosition();
// Reset the tracking if the user requests it
if (gamepad1.y) {
myOtos.resetTracking();
}
// Re-calibrate the IMU if the user requests it
if (gamepad1.x) {
myOtos.calibrateImu();
}
// Inform user of available controls
telemetry.addLine("Press Y (triangle) on Gamepad to reset tracking");
telemetry.addLine("Press X (square) on Gamepad to calibrate the IMU");
telemetry.addLine();
// Log the position to the telemetry
telemetry.addData("X coordinate", pos.x);
telemetry.addData("Y coordinate", pos.y);
telemetry.addData("Heading angle", pos.h);
// Update the telemetry on the driver station
telemetry.update();
}
}
private void configureOtos() {
telemetry.addLine("Configuring OTOS...");
telemetry.update();
// Set the desired units for linear and angular measurements. Can be either
// meters or inches for linear, and radians or degrees for angular. If not
// set, the default is inches and degrees. Note that this setting is not
// persisted in the sensor, so you need to set at the start of all your
// OpModes if using the non-default value.
// myOtos.setLinearUnit(DistanceUnit.METER);
myOtos.setLinearUnit(DistanceUnit.INCH);
// myOtos.setAngularUnit(AngleUnit.RADIANS);
myOtos.setAngularUnit(AngleUnit.DEGREES);
// Assuming you've mounted your sensor to a robot and it's not centered,
// you can specify the offset for the sensor relative to the center of the
// robot. The units default to inches and degrees, but if you want to use
// different units, specify them before setting the offset! Note that as of
// firmware version 1.0, these values will be lost after a power cycle, so
// you will need to set them each time you power up the sensor. For example, if
// the sensor is mounted 5 inches to the left (negative X) and 10 inches
// forward (positive Y) of the center of the robot, and mounted 90 degrees
// clockwise (negative rotation) from the robot's orientation, the offset
// would be {-5, 10, -90}. These can be any value, even the angle can be
// tweaked slightly to compensate for imperfect mounting (eg. 1.3 degrees).
SparkFunOTOS.Pose2D offset = new SparkFunOTOS.Pose2D(0, 0, 0);
myOtos.setOffset(offset);
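// For the hypothetical mounting described above (5 inches left, 10 inches forward,
// 90 degrees clockwise), the call would instead be:
// myOtos.setOffset(new SparkFunOTOS.Pose2D(-5, 10, -90));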
// Here we can set the linear and angular scalars, which can compensate for
// scaling issues with the sensor measurements. Note that as of firmware
// version 1.0, these values will be lost after a power cycle, so you will
// need to set them each time you power up the sensor. They can be any value
// from 0.872 to 1.127 in increments of 0.001 (0.1%). It is recommended to
// first set both scalars to 1.0, then calibrate the angular scalar, then
// the linear scalar. To calibrate the angular scalar, spin the robot by
// multiple rotations (eg. 10) to get a precise error, then set the scalar
// to the inverse of the error. Remember that the angle wraps from -180 to
// 180 degrees, so for example, if after 10 rotations counterclockwise
// (positive rotation), the sensor reports -15 degrees, the required scalar
// would be 3600/3585 = 1.004. To calibrate the linear scalar, move the
// robot a known distance and measure the error; do this multiple times at
// multiple speeds to get an average, then set the linear scalar to the
// inverse of the error. For example, if you move the robot 100 inches and
// the sensor reports 103 inches, set the linear scalar to 100/103 = 0.971
myOtos.setLinearScalar(1.0);
myOtos.setAngularScalar(1.0);
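// Hypothetical calibrated values, using the worked numbers from the comments above:
// myOtos.setAngularScalar(1.004); // 3600/3585 after the 10-rotation spin test
// myOtos.setLinearScalar(0.971);  // 100/103 after the measured-distance test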
// The IMU on the OTOS includes a gyroscope and accelerometer, which could
// have an offset. Note that as of firmware version 1.0, the calibration
// will be lost after a power cycle; the OTOS performs a quick calibration
// when it powers up, but it is recommended to perform a more thorough
// calibration at the start of all your OpModes. Note that the sensor must
// be completely stationary and flat during calibration! When calling
// calibrateImu(), you can specify the number of samples to take and whether
// to wait until the calibration is complete. If no parameters are provided,
// it will take 255 samples and wait until done; each sample takes about
// 2.4ms, so about 612ms total
myOtos.calibrateImu();
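// As described above, the sample count and blocking behavior can also be specified.
// For example (hypothetical values), a quicker non-blocking calibration could be requested with:
// myOtos.calibrateImu(100, false);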
// Reset the tracking algorithm - this resets the position to the origin,
// but can also be used to recover from some rare tracking errors
myOtos.resetTracking();
// After resetting the tracking, the OTOS will report that the robot is at
// the origin. If your robot does not start at the origin, or you have
// another source of location information (eg. vision odometry), you can set
// the OTOS location to match and it will continue to track from there.
SparkFunOTOS.Pose2D currentPosition = new SparkFunOTOS.Pose2D(0, 0, 0);
myOtos.setPosition(currentPosition);
// Get the hardware and firmware version
SparkFunOTOS.Version hwVersion = new SparkFunOTOS.Version();
SparkFunOTOS.Version fwVersion = new SparkFunOTOS.Version();
myOtos.getVersionInfo(hwVersion, fwVersion);
telemetry.addLine("OTOS configured! Press start to get position data!");
telemetry.addLine();
telemetry.addLine(String.format("OTOS Hardware Version: v%d.%d", hwVersion.major, hwVersion.minor));
telemetry.addLine(String.format("OTOS Firmware Version: v%d.%d", fwVersion.major, fwVersion.minor));
telemetry.update();
}
}

View File

@ -0,0 +1,812 @@
/*
* Copyright (c) 2024 DigitalChickenLabs
*
* Permission is hereby granted, free of charge, to any person obtaining a copy
* of this software and associated documentation files (the "Software"), to deal
* in the Software without restriction, including without limitation the rights
* to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
* copies of the Software, and to permit persons to whom the Software is
* furnished to do so, subject to the following conditions:
*
* The above copyright notice and this permission notice shall be included in all
* copies or substantial portions of the Software.
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
* IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
* FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
* AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
* LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
* OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
* SOFTWARE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.hardware.digitalchickenlabs.OctoQuad;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.Gamepad;
import org.firstinspires.ftc.robotcore.external.Telemetry;
import java.util.ArrayList;
import java.util.Stack;
/*
* This OpMode illustrates how to use the DigitalChickenLabs OctoQuad Quadrature Encoder & Pulse Width Interface Module.
*
* The OctoQuad has 8 input channels that can be used to read either Relative Quadrature Encoders or Pulse-Width Absolute Encoder inputs.
* Relative Quadrature encoders are found on most FTC motors, and some stand-alone position sensors like the REV Thru-Bore encoder.
* Pulse-Width encoders are less common. The REV Thru-Bore encoder can provide its absolute position via a variable pulse width,
* as can several sonar rangefinders such as the MaxBotix MB1000 series.
*
* This OpMode assumes that the OctoQuad is attached to an I2C interface named "octoquad" in the robot configuration.
*
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*
* Select, Init and run the "OctoQuad Configuration Tool" OpMode
* Read the blue User-Interface tips at the top of the telemetry screen.
* Use the UI buttons to navigate the menu and make any desired changes to the OctoQuad configuration.
* Use the Program Settings To FLASH option to make any changes permanent.
*
* See the sensor's product page: https://www.tindie.com/products/digitalchickenlabs/octoquad-8ch-quadrature-pulse-width-decoder/
*/
@TeleOp(name = "OctoQuad Configuration Tool", group="OctoQuad")
@Disabled
public class UtilityOctoQuadConfigMenu extends LinearOpMode
{
TelemetryMenu.MenuElement rootMenu = new TelemetryMenu.MenuElement("OctoQuad Config Menu", true);
TelemetryMenu.MenuElement menuHwInfo = new TelemetryMenu.MenuElement("Hardware Information", false);
TelemetryMenu.EnumOption optionI2cResetMode;
TelemetryMenu.EnumOption optionChannelBankConfig;
TelemetryMenu.MenuElement menuEncoderDirections = new TelemetryMenu.MenuElement("Set Encoder Directions", false);
TelemetryMenu.BooleanOption[] optionsEncoderDirections = new TelemetryMenu.BooleanOption[OctoQuad.NUM_ENCODERS];
TelemetryMenu.MenuElement menuVelocityIntervals = new TelemetryMenu.MenuElement("Velocity Measurement Intervals", false);
TelemetryMenu.IntegerOption[] optionsVelocityIntervals = new TelemetryMenu.IntegerOption[OctoQuad.NUM_ENCODERS];
TelemetryMenu.MenuElement menuAbsParams = new TelemetryMenu.MenuElement("Abs. Encoder Pulse Width Params", false);
TelemetryMenu.IntegerOption[] optionsAbsParamsMax = new TelemetryMenu.IntegerOption[OctoQuad.NUM_ENCODERS];
TelemetryMenu.IntegerOption[] optionsAbsParamsMin = new TelemetryMenu.IntegerOption[OctoQuad.NUM_ENCODERS];
TelemetryMenu.OptionElement optionProgramToFlash;
TelemetryMenu.OptionElement optionSendToRAM;
TelemetryMenu.StaticClickableOption optionExit;
OctoQuad octoquad;
boolean error = false;
@Override
public void runOpMode()
{
octoquad = hardwareMap.getAll(OctoQuad.class).get(0);
if(octoquad.getChipId() != OctoQuad.OCTOQUAD_CHIP_ID)
{
telemetry.addLine("Error: cannot communicate with OctoQuad. Check your wiring and configuration and try again");
telemetry.update();
error = true;
}
else
{
if(octoquad.getFirmwareVersion().maj != OctoQuad.SUPPORTED_FW_VERSION_MAJ)
{
telemetry.addLine("Error: The OctoQuad is running a different major firmware version than this driver was built for. Cannot run configuration tool");
telemetry.update();
error = true;
}
}
if(error)
{
waitForStart();
return;
}
telemetry.addLine("Retrieving current configuration from OctoQuad");
telemetry.update();
optionExit = new TelemetryMenu.StaticClickableOption("Exit configuration menu")
{
@Override
void onClick() // called on OpMode thread
{
requestOpModeStop();
}
};
optionI2cResetMode = new TelemetryMenu.EnumOption("I2C Reset Mode", OctoQuad.I2cRecoveryMode.values(), octoquad.getI2cRecoveryMode());
optionChannelBankConfig = new TelemetryMenu.EnumOption("Channel Bank Modes", OctoQuad.ChannelBankConfig.values(), octoquad.getChannelBankConfig());
menuHwInfo.addChild(new TelemetryMenu.StaticItem("Board Firmware: v" + octoquad.getFirmwareVersion()));
//menuHwInfo.addChild(new TelemetryMenu.StaticItem("Board unique ID: FIXME"));
for(int i = 0; i < OctoQuad.NUM_ENCODERS; i++)
{
optionsEncoderDirections[i] = new TelemetryMenu.BooleanOption(
String.format("Encoder %d direction", i),
octoquad.getSingleEncoderDirection(i) == OctoQuad.EncoderDirection.REVERSE,
"-",
"+");
}
menuEncoderDirections.addChildren(optionsEncoderDirections);
for(int i = 0; i < OctoQuad.NUM_ENCODERS; i++)
{
optionsVelocityIntervals[i] = new TelemetryMenu.IntegerOption(
String.format("Chan %d velocity intvl", i),
OctoQuad.MIN_VELOCITY_MEASUREMENT_INTERVAL_MS,
OctoQuad.MAX_VELOCITY_MEASUREMENT_INTERVAL_MS,
octoquad.getSingleVelocitySampleInterval(i));
}
menuVelocityIntervals.addChildren(optionsVelocityIntervals);
for(int i = 0; i < OctoQuad.NUM_ENCODERS; i++)
{
OctoQuad.ChannelPulseWidthParams params = octoquad.getSingleChannelPulseWidthParams(i);
optionsAbsParamsMax[i] = new TelemetryMenu.IntegerOption(
String.format("Chan %d max pulse length", i),
OctoQuad.MIN_PULSE_WIDTH_US,
OctoQuad.MAX_PULSE_WIDTH_US,
params.max_length_us);
optionsAbsParamsMin[i] = new TelemetryMenu.IntegerOption(
String.format("Chan %d min pulse length", i),
OctoQuad.MIN_PULSE_WIDTH_US,
OctoQuad.MAX_PULSE_WIDTH_US,
params.min_length_us);
}
menuAbsParams.addChildren(optionsAbsParamsMin);
menuAbsParams.addChildren(optionsAbsParamsMax);
optionProgramToFlash = new TelemetryMenu.OptionElement()
{
String name = "Program Settings to FLASH";
long lastClickTime = 0;
@Override
protected String getDisplayText()
{
if(lastClickTime == 0)
{
return name;
}
else
{
if(System.currentTimeMillis() - lastClickTime < 1000)
{
return name + " **OK**";
}
else
{
lastClickTime = 0;
return name;
}
}
}
@Override
void onClick()
{
sendSettingsToRam();
octoquad.saveParametersToFlash();
lastClickTime = System.currentTimeMillis();
}
};
optionSendToRAM = new TelemetryMenu.OptionElement()
{
String name = "Send Settings to RAM";
long lastClickTime = 0;
@Override
protected String getDisplayText()
{
if(lastClickTime == 0)
{
return name;
}
else
{
if(System.currentTimeMillis() - lastClickTime < 1000)
{
return name + " **OK**";
}
else
{
lastClickTime = 0;
return name;
}
}
}
@Override
void onClick()
{
sendSettingsToRam();
lastClickTime = System.currentTimeMillis();
}
};
rootMenu.addChild(menuHwInfo);
rootMenu.addChild(optionI2cResetMode);
rootMenu.addChild(optionChannelBankConfig);
rootMenu.addChild(menuEncoderDirections);
rootMenu.addChild(menuVelocityIntervals);
rootMenu.addChild(menuAbsParams);
rootMenu.addChild(optionProgramToFlash);
rootMenu.addChild(optionSendToRAM);
rootMenu.addChild(optionExit);
TelemetryMenu menu = new TelemetryMenu(telemetry, rootMenu);
while (!isStopRequested())
{
menu.loop(gamepad1);
telemetry.update();
sleep(20);
}
}
void sendSettingsToRam()
{
for(int i = 0; i < OctoQuad.NUM_ENCODERS; i++)
{
octoquad.setSingleEncoderDirection(i, optionsEncoderDirections[i].getValue() ? OctoQuad.EncoderDirection.REVERSE : OctoQuad.EncoderDirection.FORWARD);
octoquad.setSingleVelocitySampleInterval(i, optionsVelocityIntervals[i].getValue());
OctoQuad.ChannelPulseWidthParams params = new OctoQuad.ChannelPulseWidthParams();
params.max_length_us = optionsAbsParamsMax[i].getValue();
params.min_length_us = optionsAbsParamsMin[i].getValue();
octoquad.setSingleChannelPulseWidthParams(i, params);
}
octoquad.setI2cRecoveryMode((OctoQuad.I2cRecoveryMode) optionI2cResetMode.getValue());
octoquad.setChannelBankConfig((OctoQuad.ChannelBankConfig) optionChannelBankConfig.getValue());
}
/*
* Copyright (c) 2023 OpenFTC Team
*
* Permission is hereby granted, free of charge, to any person obtaining a copy
* of this software and associated documentation files (the "Software"), to deal
* in the Software without restriction, including without limitation the rights
* to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
* copies of the Software, and to permit persons to whom the Software is
* furnished to do so, subject to the following conditions:
*
* The above copyright notice and this permission notice shall be included in all
* copies or substantial portions of the Software.
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
* IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
* FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
* AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
* LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
* OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
* SOFTWARE.
*/
public static class TelemetryMenu
{
private final MenuElement root;
private MenuElement currentLevel;
private boolean dpadUpPrev;
private boolean dpadDnPrev;
private boolean dpadRightPrev;
private boolean dpadLeftPrev;
private boolean xPrev;
private boolean lbPrev;
private int selectedIdx = 0;
private Stack<Integer> selectedIdxStack = new Stack<>();
private final Telemetry telemetry;
/**
* TelemetryMenu constructor
* @param telemetry pass in 'telemetry' from your OpMode
* @param root the root menu element
*/
public TelemetryMenu(Telemetry telemetry, MenuElement root)
{
this.root = root;
this.currentLevel = root;
this.telemetry = telemetry;
telemetry.setDisplayFormat(Telemetry.DisplayFormat.HTML);
telemetry.setMsTransmissionInterval(50);
}
/**
* Call this from inside your loop to put the current menu state into
* telemetry, and process gamepad inputs for navigating the menu
* @param gamepad the gamepad you want to use to navigate the menu
*/
public void loop(Gamepad gamepad)
{
// Capture current state of the gamepad buttons we care about;
// We can only look once or we risk a race condition
boolean dpadUp = gamepad.dpad_up;
boolean dpadDn = gamepad.dpad_down;
boolean dpadRight = gamepad.dpad_right;
boolean dpadLeft = gamepad.dpad_left;
boolean x = gamepad.x;
boolean lb = gamepad.left_bumper;
// Figure out who our children are at this level
// and figure out which item is currently highlighted
// with the selection pointer
ArrayList<Element> children = currentLevel.children();
Element currentSelection = children.get(selectedIdx);
// Left and right are inputs to the selected item (if it's an Option)
if (currentSelection instanceof OptionElement)
{
if (dpadRight && !dpadRightPrev) // rising edge
{
((OptionElement) currentSelection).onRightInput();
}
else if (dpadLeft && !dpadLeftPrev) // rising edge
{
((OptionElement) currentSelection).onLeftInput();
}
}
// Up and down navigate the current selection pointer
if (dpadUp && !dpadUpPrev) // rising edge
{
selectedIdx--; // Move selection pointer up
}
else if (dpadDn && !dpadDnPrev) // rising edge
{
selectedIdx++; // Move selection pointer down
}
// Make selected index sane (don't let it go out of bounds) :eyes:
if (selectedIdx >= children.size())
{
selectedIdx = children.size()-1;
}
else if (selectedIdx < 0)
{
selectedIdx = 0;
}
// Select: either enter submenu or input to option
else if (x && !xPrev) // rising edge
{
// Select up element
if (currentSelection instanceof SpecialUpElement)
{
// We can only go up if we're not at the root level
if (currentLevel != root)
{
// Restore selection pointer to where it was before
selectedIdx = selectedIdxStack.pop();
// Change to the parent level
currentLevel = currentLevel.parent();
}
}
// Input to option
else if (currentSelection instanceof OptionElement)
{
((OptionElement) currentSelection).onClick();
}
// Enter submenu
else if (currentSelection instanceof MenuElement)
{
// Save our current selection pointer so we can restore it
// later if the user navigates back up a level
selectedIdxStack.push(selectedIdx);
// We have no idea what's in the submenu :monkey: so best to
// just set the selection pointer to the first element
selectedIdx = 0;
// Now the current level becomes the submenu that the selection
// pointer was on
currentLevel = (MenuElement) currentSelection;
}
}
// Go up a level
else if (lb && !lbPrev)
{
// We can only go up if we're not at the root level
if (currentLevel != root)
{
// Restore selection pointer to where it was before
selectedIdx = selectedIdxStack.pop();
// Change to the parent level
currentLevel = currentLevel.parent();
}
}
// Save the current button states so that we can look for
// the rising edge the next time around the loop :)
dpadUpPrev = dpadUp;
dpadDnPrev = dpadDn;
dpadRightPrev = dpadRight;
dpadLeftPrev = dpadLeft;
xPrev = x;
lbPrev = lb;
// Start building the text display.
// First, we add the static directions for gamepad operation
StringBuilder builder = new StringBuilder();
builder.append("<font color='#119af5' face=monospace>");
builder.append("Navigate items.....dpad up/down\n")
.append("Select.............X\n")
.append("Edit option........dpad left/right\n")
.append("Up one level.......left bumper\n");
builder.append("</font>");
builder.append("\n");
// Now actually add the menu options. We start by adding the name of the current menu level.
builder.append("<font face=monospace>");
builder.append("Current Menu: ").append(currentLevel.name).append("\n");
// Now we loop through all the child elements of this level and add them
for (int i = 0; i < children.size(); i++)
{
// If the selection pointer is at this index, put a green dot in the box :)
if (selectedIdx == i)
{
builder.append("[<font color=green face=monospace>•</font>] ");
}
// Otherwise, just put an empty box
else
{
builder.append("[ ] ");
}
// Figure out who the selection pointer is pointing at :eyes:
Element e = children.get(i);
// If it's pointing at a submenu, indicate that it's a submenu to the user
// by prefixing "> " to the name.
if (e instanceof MenuElement)
{
builder.append("> ");
}
// Finally, add the element's name
builder.append(e.getDisplayText());
// We mustn't forget the newline
builder.append("\n");
}
// Don't forget to close the font tag either
builder.append("</font>");
// Build the string!!!! :nerd:
String menu = builder.toString();
// Add it to telemetry
telemetry.addLine(menu);
}
public static class MenuElement extends Element
{
private String name;
private ArrayList<Element> children = new ArrayList<>();
/**
* Create a new MenuElement; may either be the root menu, or a submenu (set isRoot accordingly)
* @param name the name for this menu
* @param isRoot whether this is a root menu, or a submenu
*/
public MenuElement(String name, boolean isRoot)
{
this.name = name;
// If it's not the root menu, we add the up one level option as the first element
if (!isRoot)
{
children.add(new SpecialUpElement());
}
}
/**
* Add a child element to this menu (may either be an Option or another menu)
* @param child the child element to add
*/
public void addChild(Element child)
{
child.setParent(this);
children.add(child);
}
/**
* Add multiple child elements to this menu (may either be option, or another menu)
* @param children the children to add
*/
public void addChildren(Element[] children)
{
for (Element e : children)
{
e.setParent(this);
this.children.add(e);
}
}
@Override
protected String getDisplayText()
{
return name;
}
private ArrayList<Element> children()
{
return children;
}
}
public static abstract class OptionElement extends Element
{
/**
* Override this to get notified when the element is clicked
*/
void onClick() {}
/**
* Override this to get notified when the element gets a "left edit" input
*/
protected void onLeftInput() {}
/**
* Override this to get notified when the element gets a "right edit" input
*/
protected void onRightInput() {}
}
public static class EnumOption extends OptionElement
{
protected int idx = 0;
protected Enum[] e;
protected String name;
public EnumOption(String name, Enum[] e)
{
this.e = e;
this.name = name;
}
public EnumOption(String name, Enum[] e, Enum def)
{
this(name, e);
idx = def.ordinal();
}
@Override
public void onLeftInput()
{
idx++;
if(idx > e.length-1)
{
idx = 0;
}
}
@Override
public void onRightInput()
{
idx--;
if(idx < 0)
{
idx = e.length-1;
}
}
@Override
public void onClick()
{
onRightInput();
}
@Override
protected String getDisplayText()
{
return String.format("%s: <font color='#e37c07' face=monospace>%s</font>", name, e[idx].name());
}
public Enum getValue()
{
return e[idx];
}
}
public static class IntegerOption extends OptionElement
{
protected int i;
protected int min;
protected int max;
protected String name;
public IntegerOption(String name, int min, int max, int def)
{
this.name = name;
this.min = min;
this.max = max;
this.i = def;
}
@Override
public void onLeftInput()
{
i--;
if(i < min)
{
i = max;
}
}
@Override
public void onRightInput()
{
i++;
if(i > max)
{
i = min;
}
}
@Override
public void onClick()
{
onRightInput();
}
@Override
protected String getDisplayText()
{
return String.format("%s: <font color='#e37c07' face=monospace>%d</font>", name, i);
}
public int getValue()
{
return i;
}
}
static class BooleanOption extends OptionElement
{
private String name;
private boolean val = true;
private String customTrue;
private String customFalse;
BooleanOption(String name, boolean def)
{
this.name = name;
this.val = def;
}
BooleanOption(String name, boolean def, String customTrue, String customFalse)
{
this(name, def);
this.customTrue = customTrue;
this.customFalse = customFalse;
}
@Override
public void onLeftInput()
{
val = !val;
}
@Override
public void onRightInput()
{
val = !val;
}
@Override
public void onClick()
{
val = !val;
}
@Override
protected String getDisplayText()
{
String valStr;
if(customTrue != null && customFalse != null)
{
valStr = val ? customTrue : customFalse;
}
else
{
valStr = val ? "true" : "false";
}
return String.format("%s: <font color='#e37c07' face=monospace>%s</font>", name, valStr);
}
public boolean getValue()
{
return val;
}
}
/**
* A simple menu item that displays static, non-interactive text.
*/
public static class StaticItem extends OptionElement
{
private String name;
public StaticItem(String name)
{
this.name = name;
}
@Override
protected String getDisplayText()
{
return name;
}
}
public static abstract class StaticClickableOption extends OptionElement
{
private String name;
public StaticClickableOption(String name)
{
this.name = name;
}
abstract void onClick();
@Override
protected String getDisplayText()
{
return name;
}
}
private static abstract class Element
{
private MenuElement parent;
protected void setParent(MenuElement parent)
{
this.parent = parent;
}
protected MenuElement parent()
{
return parent;
}
protected abstract String getDisplayText();
}
private static class SpecialUpElement extends Element
{
@Override
protected String getDisplayText()
{
return "<font color='#119af5' face=monospace>.. ↰ Up One Level</font>";
}
}
}
}

View File

@ -29,6 +29,11 @@ Concept: This is a sample OpMode that illustrates performing a specific function
Each OpMode should try to only demonstrate a single concept so they are easy to
locate based on their name. These OpModes may not produce a drivable robot.
Utility: This sample type is provided as a useful tool, or aide, to perform some specific development task.
It is not expected to be something you would include in your own robot code.
To use the tool, comment out the @Disabled annotation and build the App.
Read the comments found in the sample for an explanation of its intended use.
After the prefix, other conventions will apply:
* Sensor class names should be constructed as: Sensor - Company - Type

View File

@ -34,8 +34,6 @@ package org.firstinspires.ftc.robotcontroller.internal;
import com.qualcomm.robotcore.eventloop.opmode.OpModeManager;
import com.qualcomm.robotcore.eventloop.opmode.OpModeRegister;
import org.firstinspires.ftc.robotcontroller.external.samples.ConceptNullOp;
/**
* {@link FtcOpModeRegister} is responsible for registering OpModes for use in an FTC game.
* @see #register(OpModeManager)
@ -49,7 +47,7 @@ public class FtcOpModeRegister implements OpModeRegister {
* There are two mechanisms by which an OpMode may be registered.
*
* 1) The preferred method is by means of class annotations in the OpMode itself.
* See, for example the class annotations in {@link ConceptNullOp}.
* See, for example the class annotations in {@link org.firstinspires.ftc.robotcontroller.external.samples.ConceptNullOp}.
*
* 2) The other, retired, method is to modify this {@link #register(OpModeManager)}
* method to include explicit calls to OpModeManager.register().

README.md (1442 lines changed)

File diff suppressed because it is too large

View File

@ -23,7 +23,18 @@ android {
}
}
repositories {
maven {
url = 'https://maven.brott.dev/'
}
}
dependencies {
implementation project(':FtcRobotController')
annotationProcessor files('lib/OpModeAnnotationProcessor.jar')
implementation "com.acmerobotics.roadrunner:ftc:0.1.13"
implementation "com.acmerobotics.roadrunner:core:1.0.0"
implementation "com.acmerobotics.roadrunner:actions:1.0.0"
implementation "com.acmerobotics.dashboard:dashboard:0.4.14"
}

View File

@ -0,0 +1,22 @@
package org.firstinspires.ftc.teamcode;
import com.acmerobotics.dashboard.canvas.Canvas;
import com.acmerobotics.roadrunner.Pose2d;
import com.acmerobotics.roadrunner.Vector2d;
public final class Drawing {
private Drawing() {}
public static void drawRobot(Canvas c, Pose2d t) {
final double ROBOT_RADIUS = 9;
c.setStrokeWidth(1);
c.strokeCircle(t.position.x, t.position.y, ROBOT_RADIUS);
Vector2d halfv = t.heading.vec().times(0.5 * ROBOT_RADIUS);
Vector2d p1 = t.position.plus(halfv);
Vector2d p2 = p1.plus(halfv);
c.strokeLine(p1.x, p1.y, p2.x, p2.y);
}
}

View File

@ -0,0 +1,8 @@
package org.firstinspires.ftc.teamcode;
import com.acmerobotics.roadrunner.Time;
import com.acmerobotics.roadrunner.Twist2dDual;
public interface Localizer {
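// update() returns the incremental motion since the previous call: value() is the pose
// delta (a twist) and velocity() is the current robot velocity, as consumed by
// MecanumDrive.updatePoseEstimate() below.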
Twist2dDual<Time> update();
}

View File

@ -0,0 +1,495 @@
package org.firstinspires.ftc.teamcode;
import static org.firstinspires.ftc.teamcode.MecanumDriveParameters.*;
import androidx.annotation.NonNull;
import com.acmerobotics.dashboard.canvas.Canvas;
import com.acmerobotics.dashboard.config.Config;
import com.acmerobotics.dashboard.telemetry.TelemetryPacket;
import com.acmerobotics.roadrunner.*;
import com.acmerobotics.roadrunner.AngularVelConstraint;
import com.acmerobotics.roadrunner.DualNum;
import com.acmerobotics.roadrunner.HolonomicController;
import com.acmerobotics.roadrunner.MecanumKinematics;
import com.acmerobotics.roadrunner.MinVelConstraint;
import com.acmerobotics.roadrunner.MotorFeedforward;
import com.acmerobotics.roadrunner.Pose2d;
import com.acmerobotics.roadrunner.Pose2dDual;
import com.acmerobotics.roadrunner.ProfileAccelConstraint;
import com.acmerobotics.roadrunner.Time;
import com.acmerobotics.roadrunner.TimeTrajectory;
import com.acmerobotics.roadrunner.TimeTurn;
import com.acmerobotics.roadrunner.TrajectoryActionBuilder;
import com.acmerobotics.roadrunner.TurnConstraints;
import com.acmerobotics.roadrunner.Twist2dDual;
import com.acmerobotics.roadrunner.VelConstraint;
import com.acmerobotics.roadrunner.ftc.DownsampledWriter;
import com.acmerobotics.roadrunner.ftc.Encoder;
import com.acmerobotics.roadrunner.ftc.FlightRecorder;
import com.acmerobotics.roadrunner.ftc.LazyImu;
import com.acmerobotics.roadrunner.ftc.LynxFirmware;
import com.acmerobotics.roadrunner.ftc.OverflowEncoder;
import com.acmerobotics.roadrunner.ftc.PositionVelocityPair;
import com.acmerobotics.roadrunner.ftc.RawEncoder;
import com.qualcomm.hardware.lynx.LynxModule;
import com.qualcomm.hardware.rev.RevHubOrientationOnRobot;
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.hardware.DcMotorEx;
import com.qualcomm.robotcore.hardware.DcMotorSimple;
import com.qualcomm.robotcore.hardware.HardwareMap;
import com.qualcomm.robotcore.hardware.IMU;
import com.qualcomm.robotcore.hardware.VoltageSensor;
import org.firstinspires.ftc.robotcore.external.navigation.AngleUnit;
import org.firstinspires.ftc.robotcore.external.navigation.YawPitchRollAngles;
import org.firstinspires.ftc.teamcode.messages.DriveCommandMessage;
import org.firstinspires.ftc.teamcode.messages.MecanumCommandMessage;
import org.firstinspires.ftc.teamcode.messages.MecanumLocalizerInputsMessage;
import org.firstinspires.ftc.teamcode.messages.PoseMessage;
import java.lang.Math;
import java.util.Arrays;
import java.util.LinkedList;
import java.util.List;
@Config
public final class MecanumDrive {
public static class Params {
/*
IMU Orientation
See https://ftc-docs.firstinspires.org/en/latest/programming_resources/imu/imu.html?highlight=imu#physical-hub-mounting
*/
public RevHubOrientationOnRobot.LogoFacingDirection logoFacingDirection = RevHubOrientationOnRobot.LogoFacingDirection.UP;
public RevHubOrientationOnRobot.UsbFacingDirection usbFacingDirection = RevHubOrientationOnRobot.UsbFacingDirection.FORWARD;
// drive model parameters
public double inPerTick = INCHES_PER_TICK_VALUE;
public double lateralInPerTick = LATERAL_INCHES_PER_TICK_VALUE;
public double trackWidthTicks = TRACK_WIDTH_IN_TICKS_VALUE;
// feedforward parameters (in tick units)
public double kS = KS_VALUE;
public double kV = KV_VALUE;
public double kA = KA_VALUE;
// path profile parameters (in inches)
public double maxWheelVel = 50;
public double minProfileAccel = -30;
public double maxProfileAccel = 50;
// turn profile parameters (in radians)
public double maxAngVel = Math.PI; // shared with path
public double maxAngAccel = Math.PI;
// path controller gains
public double axialGain = 0.0;
public double lateralGain = 0.0;
public double headingGain = 0.0; // shared with turn
public double axialVelGain = 0.0;
public double lateralVelGain = 0.0;
public double headingVelGain = 0.0; // shared with turn
}
public static Params PARAMS = new Params();
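// Note: the first kinematics argument is the track width in inches (inPerTick * trackWidthTicks);
// the second is a unitless lateral multiplier (inPerTick / lateralInPerTick) that accounts for
// reduced lateral travel on mecanum wheels.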
public final MecanumKinematics kinematics = new MecanumKinematics(
PARAMS.inPerTick * PARAMS.trackWidthTicks,
PARAMS.inPerTick / PARAMS.lateralInPerTick);
public final TurnConstraints defaultTurnConstraints = new TurnConstraints(
PARAMS.maxAngVel, -PARAMS.maxAngAccel, PARAMS.maxAngAccel);
public final VelConstraint defaultVelConstraint =
new MinVelConstraint(Arrays.asList(
kinematics.new WheelVelConstraint(PARAMS.maxWheelVel),
new AngularVelConstraint(PARAMS.maxAngVel)
));
public final AccelConstraint defaultAccelConstraint =
new ProfileAccelConstraint(PARAMS.minProfileAccel, PARAMS.maxProfileAccel);
public final DcMotorEx leftFront, leftBack, rightBack, rightFront;
public final VoltageSensor voltageSensor;
public final LazyImu lazyImu;
public final Localizer localizer;
public Pose2d pose;
private final LinkedList<Pose2d> poseHistory = new LinkedList<>();
private final DownsampledWriter estimatedPoseWriter = new DownsampledWriter("ESTIMATED_POSE", 50_000_000);
private final DownsampledWriter targetPoseWriter = new DownsampledWriter("TARGET_POSE", 50_000_000);
private final DownsampledWriter driveCommandWriter = new DownsampledWriter("DRIVE_COMMAND", 50_000_000);
private final DownsampledWriter mecanumCommandWriter = new DownsampledWriter("MECANUM_COMMAND", 50_000_000);
public class DriveLocalizer implements Localizer {
public final Encoder leftFront, leftBack, rightBack, rightFront;
public final IMU imu;
private int lastLeftFrontPos, lastLeftBackPos, lastRightBackPos, lastRightFrontPos;
private Rotation2d lastHeading;
private boolean initialized;
public DriveLocalizer() {
leftFront = new OverflowEncoder(new RawEncoder(MecanumDrive.this.leftFront));
leftBack = new OverflowEncoder(new RawEncoder(MecanumDrive.this.leftBack));
rightBack = new OverflowEncoder(new RawEncoder(MecanumDrive.this.rightBack));
rightFront = new OverflowEncoder(new RawEncoder(MecanumDrive.this.rightFront));
imu = lazyImu.get();
}
@Override
public Twist2dDual<Time> update() {
PositionVelocityPair leftFrontPosVel = leftFront.getPositionAndVelocity();
PositionVelocityPair leftBackPosVel = leftBack.getPositionAndVelocity();
PositionVelocityPair rightBackPosVel = rightBack.getPositionAndVelocity();
PositionVelocityPair rightFrontPosVel = rightFront.getPositionAndVelocity();
YawPitchRollAngles angles = imu.getRobotYawPitchRollAngles();
FlightRecorder.write("MECANUM_LOCALIZER_INPUTS", new MecanumLocalizerInputsMessage(
leftFrontPosVel, leftBackPosVel, rightBackPosVel, rightFrontPosVel, angles));
Rotation2d heading = Rotation2d.exp(angles.getYaw(AngleUnit.RADIANS));
if (!initialized) {
initialized = true;
lastLeftFrontPos = leftFrontPosVel.position;
lastLeftBackPos = leftBackPosVel.position;
lastRightBackPos = rightBackPosVel.position;
lastRightFrontPos = rightFrontPosVel.position;
lastHeading = heading;
return new Twist2dDual<>(
Vector2dDual.constant(new Vector2d(0.0, 0.0), 2),
DualNum.constant(0.0, 2)
);
}
double headingDelta = heading.minus(lastHeading);
Twist2dDual<Time> twist = kinematics.forward(new MecanumKinematics.WheelIncrements<>(
new DualNum<Time>(new double[]{
(leftFrontPosVel.position - lastLeftFrontPos),
leftFrontPosVel.velocity,
}).times(PARAMS.inPerTick),
new DualNum<Time>(new double[]{
(leftBackPosVel.position - lastLeftBackPos),
leftBackPosVel.velocity,
}).times(PARAMS.inPerTick),
new DualNum<Time>(new double[]{
(rightBackPosVel.position - lastRightBackPos),
rightBackPosVel.velocity,
}).times(PARAMS.inPerTick),
new DualNum<Time>(new double[]{
(rightFrontPosVel.position - lastRightFrontPos),
rightFrontPosVel.velocity,
}).times(PARAMS.inPerTick)
));
lastLeftFrontPos = leftFrontPosVel.position;
lastLeftBackPos = leftBackPosVel.position;
lastRightBackPos = rightBackPosVel.position;
lastRightFrontPos = rightFrontPosVel.position;
lastHeading = heading;
return new Twist2dDual<>(
twist.line,
DualNum.cons(headingDelta, twist.angle.drop(1))
);
}
}
public MecanumDrive(HardwareMap hardwareMap, Pose2d pose) {
this.pose = pose;
LynxFirmware.throwIfModulesAreOutdated(hardwareMap);
for (LynxModule module : hardwareMap.getAll(LynxModule.class)) {
module.setBulkCachingMode(LynxModule.BulkCachingMode.AUTO);
}
/*
Configure motor definitions
*/
leftFront = hardwareMap.get(DcMotorEx.class, FRONT_LEFT_MOTOR);
leftBack = hardwareMap.get(DcMotorEx.class, BACK_LEFT_MOTOR);
rightFront = hardwareMap.get(DcMotorEx.class, FRONT_RIGHT_MOTOR);
rightBack = hardwareMap.get(DcMotorEx.class, BACK_RIGHT_MOTOR);
leftFront.setZeroPowerBehavior(DcMotor.ZeroPowerBehavior.BRAKE);
leftBack.setZeroPowerBehavior(DcMotor.ZeroPowerBehavior.BRAKE);
rightBack.setZeroPowerBehavior(DcMotor.ZeroPowerBehavior.BRAKE);
rightFront.setZeroPowerBehavior(DcMotor.ZeroPowerBehavior.BRAKE);
/*
Configure motor directions
*/
//leftFront.setDirection(DcMotorSimple.Direction.REVERSE);
/*
IMU Orientation Application
See https://ftc-docs.firstinspires.org/en/latest/hardware_and_software_configuration/configuring/index.html
*/
lazyImu = new LazyImu(hardwareMap, "imu", new RevHubOrientationOnRobot(
PARAMS.logoFacingDirection, PARAMS.usbFacingDirection));
voltageSensor = hardwareMap.voltageSensor.iterator().next();
localizer = new DriveLocalizer();
FlightRecorder.write("MECANUM_PARAMS", PARAMS);
}
public void setDrivePowers(PoseVelocity2d powers) {
MecanumKinematics.WheelVelocities<Time> wheelVels = new MecanumKinematics(1).inverse(
PoseVelocity2dDual.constant(powers, 1));
double maxPowerMag = 1;
for (DualNum<Time> power : wheelVels.all()) {
maxPowerMag = Math.max(maxPowerMag, power.value());
}
leftFront.setPower(wheelVels.leftFront.get(0) / maxPowerMag);
leftBack.setPower(wheelVels.leftBack.get(0) / maxPowerMag);
rightBack.setPower(wheelVels.rightBack.get(0) / maxPowerMag);
rightFront.setPower(wheelVels.rightFront.get(0) / maxPowerMag);
}
public final class FollowTrajectoryAction implements Action {
public final TimeTrajectory timeTrajectory;
private double beginTs = -1;
private final double[] xPoints, yPoints;
public FollowTrajectoryAction(TimeTrajectory t) {
timeTrajectory = t;
List<Double> disps = com.acmerobotics.roadrunner.Math.range(
0, t.path.length(),
Math.max(2, (int) Math.ceil(t.path.length() / 2)));
xPoints = new double[disps.size()];
yPoints = new double[disps.size()];
for (int i = 0; i < disps.size(); i++) {
Pose2d p = t.path.get(disps.get(i), 1).value();
xPoints[i] = p.position.x;
yPoints[i] = p.position.y;
}
}
@Override
public boolean run(@NonNull TelemetryPacket p) {
double t;
if (beginTs < 0) {
beginTs = Actions.now();
t = 0;
} else {
t = Actions.now() - beginTs;
}
if (t >= timeTrajectory.duration) {
leftFront.setPower(0);
leftBack.setPower(0);
rightBack.setPower(0);
rightFront.setPower(0);
return false;
}
Pose2dDual<Time> txWorldTarget = timeTrajectory.get(t);
targetPoseWriter.write(new PoseMessage(txWorldTarget.value()));
PoseVelocity2d robotVelRobot = updatePoseEstimate();
PoseVelocity2dDual<Time> command = new HolonomicController(
PARAMS.axialGain, PARAMS.lateralGain, PARAMS.headingGain,
PARAMS.axialVelGain, PARAMS.lateralVelGain, PARAMS.headingVelGain
)
.compute(txWorldTarget, pose, robotVelRobot);
driveCommandWriter.write(new DriveCommandMessage(command));
MecanumKinematics.WheelVelocities<Time> wheelVels = kinematics.inverse(command);
double voltage = voltageSensor.getVoltage();
final MotorFeedforward feedforward = new MotorFeedforward(PARAMS.kS,
PARAMS.kV / PARAMS.inPerTick, PARAMS.kA / PARAMS.inPerTick);
double leftFrontPower = feedforward.compute(wheelVels.leftFront) / voltage;
double leftBackPower = feedforward.compute(wheelVels.leftBack) / voltage;
double rightBackPower = feedforward.compute(wheelVels.rightBack) / voltage;
double rightFrontPower = feedforward.compute(wheelVels.rightFront) / voltage;
mecanumCommandWriter.write(new MecanumCommandMessage(
voltage, leftFrontPower, leftBackPower, rightBackPower, rightFrontPower
));
leftFront.setPower(leftFrontPower);
leftBack.setPower(leftBackPower);
rightBack.setPower(rightBackPower);
rightFront.setPower(rightFrontPower);
p.put("x", pose.position.x);
p.put("y", pose.position.y);
p.put("heading (deg)", Math.toDegrees(pose.heading.toDouble()));
Pose2d error = txWorldTarget.value().minusExp(pose);
p.put("xError", error.position.x);
p.put("yError", error.position.y);
p.put("headingError (deg)", Math.toDegrees(error.heading.toDouble()));
// only draw when active; only one drive action should be active at a time
Canvas c = p.fieldOverlay();
drawPoseHistory(c);
c.setStroke("#4CAF50");
Drawing.drawRobot(c, txWorldTarget.value());
c.setStroke("#3F51B5");
Drawing.drawRobot(c, pose);
c.setStroke("#4CAF50FF");
c.setStrokeWidth(1);
c.strokePolyline(xPoints, yPoints);
return true;
}
@Override
public void preview(Canvas c) {
c.setStroke("#4CAF507A");
c.setStrokeWidth(1);
c.strokePolyline(xPoints, yPoints);
}
}
public final class TurnAction implements Action {
private final TimeTurn turn;
private double beginTs = -1;
public TurnAction(TimeTurn turn) {
this.turn = turn;
}
@Override
public boolean run(@NonNull TelemetryPacket p) {
double t;
if (beginTs < 0) {
beginTs = Actions.now();
t = 0;
} else {
t = Actions.now() - beginTs;
}
if (t >= turn.duration) {
leftFront.setPower(0);
leftBack.setPower(0);
rightBack.setPower(0);
rightFront.setPower(0);
return false;
}
Pose2dDual<Time> txWorldTarget = turn.get(t);
targetPoseWriter.write(new PoseMessage(txWorldTarget.value()));
PoseVelocity2d robotVelRobot = updatePoseEstimate();
PoseVelocity2dDual<Time> command = new HolonomicController(
PARAMS.axialGain, PARAMS.lateralGain, PARAMS.headingGain,
PARAMS.axialVelGain, PARAMS.lateralVelGain, PARAMS.headingVelGain
)
.compute(txWorldTarget, pose, robotVelRobot);
driveCommandWriter.write(new DriveCommandMessage(command));
MecanumKinematics.WheelVelocities<Time> wheelVels = kinematics.inverse(command);
double voltage = voltageSensor.getVoltage();
final MotorFeedforward feedforward = new MotorFeedforward(PARAMS.kS,
PARAMS.kV / PARAMS.inPerTick, PARAMS.kA / PARAMS.inPerTick);
double leftFrontPower = feedforward.compute(wheelVels.leftFront) / voltage;
double leftBackPower = feedforward.compute(wheelVels.leftBack) / voltage;
double rightBackPower = feedforward.compute(wheelVels.rightBack) / voltage;
double rightFrontPower = feedforward.compute(wheelVels.rightFront) / voltage;
mecanumCommandWriter.write(new MecanumCommandMessage(
voltage, leftFrontPower, leftBackPower, rightBackPower, rightFrontPower
));
leftFront.setPower(feedforward.compute(wheelVels.leftFront) / voltage);
leftBack.setPower(feedforward.compute(wheelVels.leftBack) / voltage);
rightBack.setPower(feedforward.compute(wheelVels.rightBack) / voltage);
rightFront.setPower(feedforward.compute(wheelVels.rightFront) / voltage);
Canvas c = p.fieldOverlay();
drawPoseHistory(c);
c.setStroke("#4CAF50");
Drawing.drawRobot(c, txWorldTarget.value());
c.setStroke("#3F51B5");
Drawing.drawRobot(c, pose);
c.setStroke("#7C4DFFFF");
c.fillCircle(turn.beginPose.position.x, turn.beginPose.position.y, 2);
return true;
}
@Override
public void preview(Canvas c) {
c.setStroke("#7C4DFF7A");
c.fillCircle(turn.beginPose.position.x, turn.beginPose.position.y, 2);
}
}
public PoseVelocity2d updatePoseEstimate() {
Twist2dDual<Time> twist = localizer.update();
pose = pose.plus(twist.value());
poseHistory.add(pose);
while (poseHistory.size() > 100) {
poseHistory.removeFirst();
}
estimatedPoseWriter.write(new PoseMessage(pose));
return twist.velocity().value();
}
private void drawPoseHistory(Canvas c) {
double[] xPoints = new double[poseHistory.size()];
double[] yPoints = new double[poseHistory.size()];
int i = 0;
for (Pose2d t : poseHistory) {
xPoints[i] = t.position.x;
yPoints[i] = t.position.y;
i++;
}
c.setStrokeWidth(1);
c.setStroke("#3F51B5");
c.strokePolyline(xPoints, yPoints);
}
public TrajectoryActionBuilder actionBuilder(Pose2d beginPose) {
return new TrajectoryActionBuilder(
TurnAction::new,
FollowTrajectoryAction::new,
new TrajectoryBuilderParams(
1e-6,
new ProfileParams(
0.25, 0.1, 1e-2
)
),
beginPose, 0.0,
defaultTurnConstraints,
defaultVelConstraint, defaultAccelConstraint
);
}
}

View File

@ -0,0 +1,22 @@
package org.firstinspires.ftc.teamcode;
import com.qualcomm.robotcore.hardware.DcMotorEx;
public class MecanumDriveParameters {
// Roadrunner tuning parameters
public static final double INCHES_PER_TICK_VALUE = 125.5/42126.8333333;
public static final double LATERAL_INCHES_PER_TICK_VALUE = 0.0026542073080477006;
public static final double TRACK_WIDTH_IN_TICKS_VALUE = 4012.7667474312616;
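// These values presumably come from the standard Road Runner tuning OpModes; for example,
// inPerTick is a measured distance divided by the encoder ticks counted over that distance
// (here 125.5 inches over roughly 42126.83 ticks).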
// FeedForward parameters (in tick units)
public static final double KS_VALUE = 0.7579547290234911;
public static final double KV_VALUE = 0.0005724562271687877;
public static final double KA_VALUE = 0;
// Robot motor configurations
public static final String FRONT_LEFT_MOTOR = "Drive front lt";
public static final String BACK_LEFT_MOTOR = "Drive back lt";
public static final String FRONT_RIGHT_MOTOR = "Drive front rt";
public static final String BACK_RIGHT_MOTOR = "Drive back rt";
}

View File

@ -0,0 +1,144 @@
/* Copyright (c) 2022 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.teamcode;
import com.qualcomm.hardware.rev.RevHubOrientationOnRobot;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.IMU;
import org.firstinspires.ftc.robotcore.external.navigation.AngleUnit;
import org.firstinspires.ftc.robotcore.external.navigation.AngularVelocity;
import org.firstinspires.ftc.robotcore.external.navigation.YawPitchRollAngles;
/*
* This OpMode shows how to use the new universal IMU interface. This
* interface may be used with the BNO055 IMU or the BHI260 IMU. It assumes that an IMU is configured
* on the robot with the name "imu".
*
* The sample will display the current Yaw, Pitch and Roll of the robot.<br>
* With the correct orientation parameters selected, pitch/roll/yaw should act as follows:
* Pitch value should INCREASE as the robot is tipped UP at the front. (Rotation about X) <br>
* Roll value should INCREASE as the robot is tipped UP at the left side. (Rotation about Y) <br>
* Yaw value should INCREASE as the robot is rotated Counter Clockwise. (Rotation about Z) <br>
*
* The yaw can be reset (to zero) by pressing the Y button on the gamepad (Triangle on a PS4 controller)
*
* This specific sample assumes that the Hub is mounted on one of the three orthogonal planes
* (X/Y, X/Z or Y/Z) and that the Hub has only been rotated in a range of 90 degree increments.
*
* Note: if your Hub is mounted on a surface angled at some non-90 Degree multiple (like 30) look at
* the alternative SensorImuNonOrthogonal sample in this folder.
*
* This "Orthogonal" requirement means that:
*
* 1) The Logo printed on the top of the Hub can ONLY be pointing in one of six directions:
* FORWARD, BACKWARD, UP, DOWN, LEFT and RIGHT.
*
* 2) The USB ports can only be pointing in one of the same six directions:
* FORWARD, BACKWARD, UP, DOWN, LEFT and RIGHT.
*
* So, to fully define how your Hub is mounted to the robot, you must simply specify:
* logoFacingDirection
* usbFacingDirection
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*
* Finally, choose the two correct parameters to define how your Hub is mounted and edit this OpMode
* to use those parameters.
*/
@TeleOp(name = "Sensor: IMU Orthogonal", group = "Sensor")
@Disabled // Comment this out to add to the OpMode list
public class SensorIMUOrthogonal extends LinearOpMode
{
// The IMU sensor object
IMU imu;
//----------------------------------------------------------------------------------------------
// Main logic
//----------------------------------------------------------------------------------------------
@Override public void runOpMode() throws InterruptedException {
// Retrieve and initialize the IMU.
// This sample expects the IMU to be in a REV Hub and named "imu".
imu = hardwareMap.get(IMU.class, "imu");
/* Define how the hub is mounted on the robot to get the correct Yaw, Pitch and Roll values.
*
* Two input parameters are required to fully specify the Orientation.
* The first parameter specifies the direction the printed logo on the Hub is pointing.
* The second parameter specifies the direction the USB connector on the Hub is pointing.
* All directions are relative to the robot, and left/right is as-viewed from behind the robot.
*/
/* The next two lines define Hub orientation.
* The Default Orientation (shown) is when a hub is mounted horizontally with the printed logo pointing UP and the USB port pointing FORWARD.
*
* To Do: EDIT these two lines to match YOUR mounting configuration.
*/
RevHubOrientationOnRobot.LogoFacingDirection logoDirection = RevHubOrientationOnRobot.LogoFacingDirection.UP;
RevHubOrientationOnRobot.UsbFacingDirection usbDirection = RevHubOrientationOnRobot.UsbFacingDirection.FORWARD;
RevHubOrientationOnRobot orientationOnRobot = new RevHubOrientationOnRobot(logoDirection, usbDirection);
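// For example (a hypothetical mounting, adjust to match YOUR robot): a Hub standing on edge with
// the logo facing LEFT and the USB ports facing UP would instead use:
//   logoDirection = RevHubOrientationOnRobot.LogoFacingDirection.LEFT;
//   usbDirection  = RevHubOrientationOnRobot.UsbFacingDirection.UP;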
// Now initialize the IMU with this mounting orientation
// Note: if you choose two conflicting directions, this initialization will cause a code exception.
imu.initialize(new IMU.Parameters(orientationOnRobot));
// Loop and update the dashboard
while (!isStopRequested()) {
telemetry.addData("Hub orientation", "Logo=%s USB=%s\n ", logoDirection, usbDirection);
// Check to see if heading reset is requested
if (gamepad1.y) {
telemetry.addData("Yaw", "Resetting\n");
imu.resetYaw();
} else {
telemetry.addData("Yaw", "Press Y (triangle) on Gamepad to reset\n");
}
// Retrieve Rotational Angles and Velocities
YawPitchRollAngles orientation = imu.getRobotYawPitchRollAngles();
AngularVelocity angularVelocity = imu.getRobotAngularVelocity(AngleUnit.DEGREES);
telemetry.addData("Yaw (Z)", "%.2f Deg. (Heading)", orientation.getYaw(AngleUnit.DEGREES));
telemetry.addData("Pitch (X)", "%.2f Deg.", orientation.getPitch(AngleUnit.DEGREES));
telemetry.addData("Roll (Y)", "%.2f Deg.\n", orientation.getRoll(AngleUnit.DEGREES));
telemetry.addData("Yaw (Z) velocity", "%.2f Deg/Sec", angularVelocity.zRotationRate);
telemetry.addData("Pitch (X) velocity", "%.2f Deg/Sec", angularVelocity.xRotationRate);
telemetry.addData("Roll (Y) velocity", "%.2f Deg/Sec", angularVelocity.yRotationRate);
telemetry.update();
}
}
}

View File

@ -0,0 +1,498 @@
package org.firstinspires.ftc.teamcode;
import androidx.annotation.NonNull;
import com.acmerobotics.dashboard.canvas.Canvas;
import com.acmerobotics.dashboard.config.Config;
import com.acmerobotics.dashboard.telemetry.TelemetryPacket;
import com.acmerobotics.roadrunner.AccelConstraint;
import com.acmerobotics.roadrunner.Action;
import com.acmerobotics.roadrunner.Actions;
import com.acmerobotics.roadrunner.AngularVelConstraint;
import com.acmerobotics.roadrunner.Arclength;
import com.acmerobotics.roadrunner.DualNum;
import com.acmerobotics.roadrunner.MinVelConstraint;
import com.acmerobotics.roadrunner.MotorFeedforward;
import com.acmerobotics.roadrunner.Pose2d;
import com.acmerobotics.roadrunner.Pose2dDual;
import com.acmerobotics.roadrunner.PoseVelocity2d;
import com.acmerobotics.roadrunner.PoseVelocity2dDual;
import com.acmerobotics.roadrunner.ProfileAccelConstraint;
import com.acmerobotics.roadrunner.ProfileParams;
import com.acmerobotics.roadrunner.RamseteController;
import com.acmerobotics.roadrunner.TankKinematics;
import com.acmerobotics.roadrunner.Time;
import com.acmerobotics.roadrunner.TimeTrajectory;
import com.acmerobotics.roadrunner.TimeTurn;
import com.acmerobotics.roadrunner.TrajectoryActionBuilder;
import com.acmerobotics.roadrunner.TrajectoryBuilderParams;
import com.acmerobotics.roadrunner.TurnConstraints;
import com.acmerobotics.roadrunner.Twist2dDual;
import com.acmerobotics.roadrunner.Vector2d;
import com.acmerobotics.roadrunner.Vector2dDual;
import com.acmerobotics.roadrunner.VelConstraint;
import com.acmerobotics.roadrunner.ftc.DownsampledWriter;
import com.acmerobotics.roadrunner.ftc.Encoder;
import com.acmerobotics.roadrunner.ftc.FlightRecorder;
import com.acmerobotics.roadrunner.ftc.LazyImu;
import com.acmerobotics.roadrunner.ftc.LynxFirmware;
import com.acmerobotics.roadrunner.ftc.OverflowEncoder;
import com.acmerobotics.roadrunner.ftc.PositionVelocityPair;
import com.acmerobotics.roadrunner.ftc.RawEncoder;
import com.qualcomm.hardware.lynx.LynxModule;
import com.qualcomm.hardware.rev.RevHubOrientationOnRobot;
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.hardware.DcMotorEx;
import com.qualcomm.robotcore.hardware.DcMotorSimple;
import com.qualcomm.robotcore.hardware.HardwareMap;
import com.qualcomm.robotcore.hardware.VoltageSensor;
import org.firstinspires.ftc.teamcode.messages.DriveCommandMessage;
import org.firstinspires.ftc.teamcode.messages.PoseMessage;
import org.firstinspires.ftc.teamcode.messages.TankCommandMessage;
import org.firstinspires.ftc.teamcode.messages.TankLocalizerInputsMessage;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.LinkedList;
import java.util.List;
@Config
public final class TankDrive {
public static class Params {
// IMU orientation
// TODO: fill in these values based on how your hub is mounted;
// see https://ftc-docs.firstinspires.org/en/latest/programming_resources/imu/imu.html?highlight=imu#physical-hub-mounting
public RevHubOrientationOnRobot.LogoFacingDirection logoFacingDirection =
RevHubOrientationOnRobot.LogoFacingDirection.UP;
public RevHubOrientationOnRobot.UsbFacingDirection usbFacingDirection =
RevHubOrientationOnRobot.UsbFacingDirection.FORWARD;
// drive model parameters
public double inPerTick = 0;
public double trackWidthTicks = 0;
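// Note: these start at 0 and must be measured on your robot. As a rough guide, inPerTick is the
// distance driven (in inches) divided by the encoder ticks recorded by ForwardPushTest, and
// trackWidthTicks comes out of the AngularRampLogger regression; follow the Road Runner tuning
// guide for the exact procedure.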
// feedforward parameters (in tick units)
public double kS = 0;
public double kV = 0;
public double kA = 0;
// path profile parameters (in inches)
public double maxWheelVel = 50;
public double minProfileAccel = -30;
public double maxProfileAccel = 50;
// turn profile parameters (in radians)
public double maxAngVel = Math.PI; // shared with path
public double maxAngAccel = Math.PI;
// path controller gains
public double ramseteZeta = 0.7; // in the range (0, 1)
public double ramseteBBar = 2.0; // positive
// turn controller gains
public double turnGain = 0.0;
public double turnVelGain = 0.0;
}
public static Params PARAMS = new Params();
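// Because this class is annotated with @Config, PARAMS is exposed in the FTC Dashboard config
// panel, so the values above can be adjusted live while tuning.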
public final TankKinematics kinematics = new TankKinematics(PARAMS.inPerTick * PARAMS.trackWidthTicks);
public final TurnConstraints defaultTurnConstraints = new TurnConstraints(
PARAMS.maxAngVel, -PARAMS.maxAngVel, PARAMS.maxAngAccel);
public final VelConstraint defaultVelConstraint =
new MinVelConstraint(Arrays.asList(
kinematics.new WheelVelConstraint(PARAMS.maxWheelVel),
new AngularVelConstraint(PARAMS.maxAngVel)
));
public final AccelConstraint defaultAccelConstraint =
new ProfileAccelConstraint(PARAMS.minProfileAccel, PARAMS.maxProfileAccel);
public final List<DcMotorEx> leftMotors, rightMotors;
public final LazyImu lazyImu;
public final VoltageSensor voltageSensor;
public final Localizer localizer;
public Pose2d pose;
private final LinkedList<Pose2d> poseHistory = new LinkedList<>();
private final DownsampledWriter estimatedPoseWriter = new DownsampledWriter("ESTIMATED_POSE", 50_000_000);
private final DownsampledWriter targetPoseWriter = new DownsampledWriter("TARGET_POSE", 50_000_000);
private final DownsampledWriter driveCommandWriter = new DownsampledWriter("DRIVE_COMMAND", 50_000_000);
private final DownsampledWriter tankCommandWriter = new DownsampledWriter("TANK_COMMAND", 50_000_000);
public class DriveLocalizer implements Localizer {
public final List<Encoder> leftEncs, rightEncs;
private double lastLeftPos, lastRightPos;
private boolean initialized;
public DriveLocalizer() {
{
List<Encoder> leftEncs = new ArrayList<>();
for (DcMotorEx m : leftMotors) {
Encoder e = new OverflowEncoder(new RawEncoder(m));
leftEncs.add(e);
}
this.leftEncs = Collections.unmodifiableList(leftEncs);
}
{
List<Encoder> rightEncs = new ArrayList<>();
for (DcMotorEx m : rightMotors) {
Encoder e = new OverflowEncoder(new RawEncoder(m));
rightEncs.add(e);
}
this.rightEncs = Collections.unmodifiableList(rightEncs);
}
// TODO: reverse encoder directions if needed
// leftEncs.get(0).setDirection(DcMotorSimple.Direction.REVERSE);
}
@Override
public Twist2dDual<Time> update() {
List<PositionVelocityPair> leftReadings = new ArrayList<>(), rightReadings = new ArrayList<>();
double meanLeftPos = 0.0, meanLeftVel = 0.0;
for (Encoder e : leftEncs) {
PositionVelocityPair p = e.getPositionAndVelocity();
meanLeftPos += p.position;
meanLeftVel += p.velocity;
leftReadings.add(p);
}
meanLeftPos /= leftEncs.size();
meanLeftVel /= leftEncs.size();
double meanRightPos = 0.0, meanRightVel = 0.0;
for (Encoder e : rightEncs) {
PositionVelocityPair p = e.getPositionAndVelocity();
meanRightPos += p.position;
meanRightVel += p.velocity;
rightReadings.add(p);
}
meanRightPos /= rightEncs.size();
meanRightVel /= rightEncs.size();
FlightRecorder.write("TANK_LOCALIZER_INPUTS",
new TankLocalizerInputsMessage(leftReadings, rightReadings));
if (!initialized) {
initialized = true;
lastLeftPos = meanLeftPos;
lastRightPos = meanRightPos;
return new Twist2dDual<>(
Vector2dDual.constant(new Vector2d(0.0, 0.0), 2),
DualNum.constant(0.0, 2)
);
}
TankKinematics.WheelIncrements<Time> twist = new TankKinematics.WheelIncrements<>(
new DualNum<Time>(new double[] {
meanLeftPos - lastLeftPos,
meanLeftVel
}).times(PARAMS.inPerTick),
new DualNum<Time>(new double[] {
meanRightPos - lastRightPos,
meanRightVel,
}).times(PARAMS.inPerTick)
);
lastLeftPos = meanLeftPos;
lastRightPos = meanRightPos;
return kinematics.forward(twist);
}
}
public TankDrive(HardwareMap hardwareMap, Pose2d pose) {
this.pose = pose;
LynxFirmware.throwIfModulesAreOutdated(hardwareMap);
for (LynxModule module : hardwareMap.getAll(LynxModule.class)) {
module.setBulkCachingMode(LynxModule.BulkCachingMode.AUTO);
}
// TODO: make sure your config has motors with these names (or change them)
// add additional motors on each side if you have them
// see https://ftc-docs.firstinspires.org/en/latest/hardware_and_software_configuration/configuring/index.html
leftMotors = Arrays.asList(hardwareMap.get(DcMotorEx.class, "left"));
rightMotors = Arrays.asList(hardwareMap.get(DcMotorEx.class, "right"));
for (DcMotorEx m : leftMotors) {
m.setZeroPowerBehavior(DcMotor.ZeroPowerBehavior.BRAKE);
}
for (DcMotorEx m : rightMotors) {
m.setZeroPowerBehavior(DcMotor.ZeroPowerBehavior.BRAKE);
}
// TODO: reverse motor directions if needed
// leftMotors.get(0).setDirection(DcMotorSimple.Direction.REVERSE);
// TODO: make sure your config has an IMU with this name (can be BNO or BHI)
// see https://ftc-docs.firstinspires.org/en/latest/hardware_and_software_configuration/configuring/index.html
lazyImu = new LazyImu(hardwareMap, "imu", new RevHubOrientationOnRobot(
PARAMS.logoFacingDirection, PARAMS.usbFacingDirection));
voltageSensor = hardwareMap.voltageSensor.iterator().next();
localizer = new TankDrive.DriveLocalizer();
FlightRecorder.write("TANK_PARAMS", PARAMS);
}
public void setDrivePowers(PoseVelocity2d powers) {
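// Converts a unitless power command into left/right wheel powers. Note the fixed nominal track
// width of 2 here instead of the tuned kinematics; the inputs are treated as raw powers, and the
// result is scaled down uniformly if the largest commanded power exceeds 1.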
TankKinematics.WheelVelocities<Time> wheelVels = new TankKinematics(2).inverse(
PoseVelocity2dDual.constant(powers, 1));
double maxPowerMag = 1;
for (DualNum<Time> power : wheelVels.all()) {
maxPowerMag = Math.max(maxPowerMag, power.value());
}
for (DcMotorEx m : leftMotors) {
m.setPower(wheelVels.left.get(0) / maxPowerMag);
}
for (DcMotorEx m : rightMotors) {
m.setPower(wheelVels.right.get(0) / maxPowerMag);
}
}
public final class FollowTrajectoryAction implements Action {
public final TimeTrajectory timeTrajectory;
private double beginTs = -1;
private final double[] xPoints, yPoints;
public FollowTrajectoryAction(TimeTrajectory t) {
timeTrajectory = t;
List<Double> disps = com.acmerobotics.roadrunner.Math.range(
0, t.path.length(),
Math.max(2, (int) Math.ceil(t.path.length() / 2)));
xPoints = new double[disps.size()];
yPoints = new double[disps.size()];
for (int i = 0; i < disps.size(); i++) {
Pose2d p = t.path.get(disps.get(i), 1).value();
xPoints[i] = p.position.x;
yPoints[i] = p.position.y;
}
}
@Override
public boolean run(@NonNull TelemetryPacket p) {
double t;
if (beginTs < 0) {
beginTs = Actions.now();
t = 0;
} else {
t = Actions.now() - beginTs;
}
if (t >= timeTrajectory.duration) {
for (DcMotorEx m : leftMotors) {
m.setPower(0);
}
for (DcMotorEx m : rightMotors) {
m.setPower(0);
}
return false;
}
DualNum<Time> x = timeTrajectory.profile.get(t);
Pose2dDual<Arclength> txWorldTarget = timeTrajectory.path.get(x.value(), 3);
targetPoseWriter.write(new PoseMessage(txWorldTarget.value()));
updatePoseEstimate();
PoseVelocity2dDual<Time> command = new RamseteController(kinematics.trackWidth, PARAMS.ramseteZeta, PARAMS.ramseteBBar)
.compute(x, txWorldTarget, pose);
driveCommandWriter.write(new DriveCommandMessage(command));
TankKinematics.WheelVelocities<Time> wheelVels = kinematics.inverse(command);
double voltage = voltageSensor.getVoltage();
final MotorFeedforward feedforward = new MotorFeedforward(PARAMS.kS,
PARAMS.kV / PARAMS.inPerTick, PARAMS.kA / PARAMS.inPerTick);
double leftPower = feedforward.compute(wheelVels.left) / voltage;
double rightPower = feedforward.compute(wheelVels.right) / voltage;
tankCommandWriter.write(new TankCommandMessage(voltage, leftPower, rightPower));
for (DcMotorEx m : leftMotors) {
m.setPower(leftPower);
}
for (DcMotorEx m : rightMotors) {
m.setPower(rightPower);
}
p.put("x", pose.position.x);
p.put("y", pose.position.y);
p.put("heading (deg)", Math.toDegrees(pose.heading.toDouble()));
Pose2d error = txWorldTarget.value().minusExp(pose);
p.put("xError", error.position.x);
p.put("yError", error.position.y);
p.put("headingError (deg)", Math.toDegrees(error.heading.toDouble()));
// only draw when active; only one drive action should be active at a time
Canvas c = p.fieldOverlay();
drawPoseHistory(c);
c.setStroke("#4CAF50");
Drawing.drawRobot(c, txWorldTarget.value());
c.setStroke("#3F51B5");
Drawing.drawRobot(c, pose);
c.setStroke("#4CAF50FF");
c.setStrokeWidth(1);
c.strokePolyline(xPoints, yPoints);
return true;
}
@Override
public void preview(Canvas c) {
c.setStroke("#4CAF507A");
c.setStrokeWidth(1);
c.strokePolyline(xPoints, yPoints);
}
}
public final class TurnAction implements Action {
private final TimeTurn turn;
private double beginTs = -1;
public TurnAction(TimeTurn turn) {
this.turn = turn;
}
@Override
public boolean run(@NonNull TelemetryPacket p) {
double t;
if (beginTs < 0) {
beginTs = Actions.now();
t = 0;
} else {
t = Actions.now() - beginTs;
}
if (t >= turn.duration) {
for (DcMotorEx m : leftMotors) {
m.setPower(0);
}
for (DcMotorEx m : rightMotors) {
m.setPower(0);
}
return false;
}
Pose2dDual<Time> txWorldTarget = turn.get(t);
targetPoseWriter.write(new PoseMessage(txWorldTarget.value()));
PoseVelocity2d robotVelRobot = updatePoseEstimate();
PoseVelocity2dDual<Time> command = new PoseVelocity2dDual<>(
Vector2dDual.constant(new Vector2d(0, 0), 3),
txWorldTarget.heading.velocity().plus(
PARAMS.turnGain * pose.heading.minus(txWorldTarget.heading.value()) +
PARAMS.turnVelGain * (robotVelRobot.angVel - txWorldTarget.heading.velocity().value())
)
);
driveCommandWriter.write(new DriveCommandMessage(command));
TankKinematics.WheelVelocities<Time> wheelVels = kinematics.inverse(command);
double voltage = voltageSensor.getVoltage();
final MotorFeedforward feedforward = new MotorFeedforward(PARAMS.kS,
PARAMS.kV / PARAMS.inPerTick, PARAMS.kA / PARAMS.inPerTick);
double leftPower = feedforward.compute(wheelVels.left) / voltage;
double rightPower = feedforward.compute(wheelVels.right) / voltage;
tankCommandWriter.write(new TankCommandMessage(voltage, leftPower, rightPower));
for (DcMotorEx m : leftMotors) {
m.setPower(leftPower);
}
for (DcMotorEx m : rightMotors) {
m.setPower(rightPower);
}
Canvas c = p.fieldOverlay();
drawPoseHistory(c);
c.setStroke("#4CAF50");
Drawing.drawRobot(c, txWorldTarget.value());
c.setStroke("#3F51B5");
Drawing.drawRobot(c, pose);
c.setStroke("#7C4DFFFF");
c.fillCircle(turn.beginPose.position.x, turn.beginPose.position.y, 2);
return true;
}
@Override
public void preview(Canvas c) {
c.setStroke("#7C4DFF7A");
c.fillCircle(turn.beginPose.position.x, turn.beginPose.position.y, 2);
}
}
public PoseVelocity2d updatePoseEstimate() {
Twist2dDual<Time> twist = localizer.update();
pose = pose.plus(twist.value());
poseHistory.add(pose);
while (poseHistory.size() > 100) {
poseHistory.removeFirst();
}
estimatedPoseWriter.write(new PoseMessage(pose));
return twist.velocity().value();
}
private void drawPoseHistory(Canvas c) {
double[] xPoints = new double[poseHistory.size()];
double[] yPoints = new double[poseHistory.size()];
int i = 0;
for (Pose2d t : poseHistory) {
xPoints[i] = t.position.x;
yPoints[i] = t.position.y;
i++;
}
c.setStrokeWidth(1);
c.setStroke("#3F51B5");
c.strokePolyline(xPoints, yPoints);
}
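// A typical use of actionBuilder() (see SplineTest in the tuning package) looks like:
//   Actions.runBlocking(
//           drive.actionBuilder(beginPose)
//                   .splineTo(new Vector2d(30, 30), Math.PI / 2)
//                   .build());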
public TrajectoryActionBuilder actionBuilder(Pose2d beginPose) {
return new TrajectoryActionBuilder(
TurnAction::new,
FollowTrajectoryAction::new,
new TrajectoryBuilderParams(
1e-6,
new ProfileParams(
0.25, 0.1, 1e-2
)
),
beginPose, 0.0,
defaultTurnConstraints,
defaultVelConstraint, defaultAccelConstraint
);
}
}

View File

@ -0,0 +1,100 @@
package org.firstinspires.ftc.teamcode;
import com.acmerobotics.dashboard.config.Config;
import com.acmerobotics.roadrunner.DualNum;
import com.acmerobotics.roadrunner.Time;
import com.acmerobotics.roadrunner.Twist2dDual;
import com.acmerobotics.roadrunner.Vector2d;
import com.acmerobotics.roadrunner.Vector2dDual;
import com.acmerobotics.roadrunner.ftc.Encoder;
import com.acmerobotics.roadrunner.ftc.FlightRecorder;
import com.acmerobotics.roadrunner.ftc.OverflowEncoder;
import com.acmerobotics.roadrunner.ftc.PositionVelocityPair;
import com.acmerobotics.roadrunner.ftc.RawEncoder;
import com.qualcomm.robotcore.hardware.DcMotorEx;
import com.qualcomm.robotcore.hardware.DcMotorSimple;
import com.qualcomm.robotcore.hardware.HardwareMap;
import org.firstinspires.ftc.teamcode.messages.ThreeDeadWheelInputsMessage;
@Config
public final class ThreeDeadWheelLocalizer implements Localizer {
public static class Params {
public double par0YTicks = 0.0; // y position of the first parallel encoder (in tick units)
public double par1YTicks = 1.0; // y position of the second parallel encoder (in tick units)
public double perpXTicks = 0.0; // x position of the perpendicular encoder (in tick units)
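// note: par0YTicks and par1YTicks must not be equal; their difference (the lateral spacing of the
// two parallel wheels, in ticks) appears in the denominators of the update math below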
}
public static Params PARAMS = new Params();
public final Encoder par0, par1, perp;
public final double inPerTick;
private int lastPar0Pos, lastPar1Pos, lastPerpPos;
private boolean initialized;
public ThreeDeadWheelLocalizer(HardwareMap hardwareMap, double inPerTick) {
// TODO: make sure your config has **motors** with these names (or change them)
// the encoders should be plugged into the slot matching the named motor
// see https://ftc-docs.firstinspires.org/en/latest/hardware_and_software_configuration/configuring/index.html
par0 = new OverflowEncoder(new RawEncoder(hardwareMap.get(DcMotorEx.class, "encoder left")));
par1 = new OverflowEncoder(new RawEncoder(hardwareMap.get(DcMotorEx.class, "encoder right")));
perp = new OverflowEncoder(new RawEncoder(hardwareMap.get(DcMotorEx.class, "encoder back")));
// TODO: reverse encoder directions if needed
// par0.setDirection(DcMotorSimple.Direction.REVERSE);
this.inPerTick = inPerTick;
FlightRecorder.write("THREE_DEAD_WHEEL_PARAMS", PARAMS);
}
public Twist2dDual<Time> update() {
PositionVelocityPair par0PosVel = par0.getPositionAndVelocity();
PositionVelocityPair par1PosVel = par1.getPositionAndVelocity();
PositionVelocityPair perpPosVel = perp.getPositionAndVelocity();
FlightRecorder.write("THREE_DEAD_WHEEL_INPUTS", new ThreeDeadWheelInputsMessage(par0PosVel, par1PosVel, perpPosVel));
if (!initialized) {
initialized = true;
lastPar0Pos = par0PosVel.position;
lastPar1Pos = par1PosVel.position;
lastPerpPos = perpPosVel.position;
return new Twist2dDual<>(
Vector2dDual.constant(new Vector2d(0.0, 0.0), 2),
DualNum.constant(0.0, 2)
);
}
int par0PosDelta = par0PosVel.position - lastPar0Pos;
int par1PosDelta = par1PosVel.position - lastPar1Pos;
int perpPosDelta = perpPosVel.position - lastPerpPos;
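// The forward component is a weighted combination of the two parallel wheels chosen so rotation
// cancels out, the heading change is their difference divided by (par0YTicks - par1YTicks), and
// the perpendicular wheel has its rotation-induced travel (perpXTicks * heading delta) removed,
// leaving the true lateral motion.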
Twist2dDual<Time> twist = new Twist2dDual<>(
new Vector2dDual<>(
new DualNum<Time>(new double[] {
(PARAMS.par0YTicks * par1PosDelta - PARAMS.par1YTicks * par0PosDelta) / (PARAMS.par0YTicks - PARAMS.par1YTicks),
(PARAMS.par0YTicks * par1PosVel.velocity - PARAMS.par1YTicks * par0PosVel.velocity) / (PARAMS.par0YTicks - PARAMS.par1YTicks),
}).times(inPerTick),
new DualNum<Time>(new double[] {
(PARAMS.perpXTicks / (PARAMS.par0YTicks - PARAMS.par1YTicks) * (par1PosDelta - par0PosDelta) + perpPosDelta),
(PARAMS.perpXTicks / (PARAMS.par0YTicks - PARAMS.par1YTicks) * (par1PosVel.velocity - par0PosVel.velocity) + perpPosVel.velocity),
}).times(inPerTick)
),
new DualNum<>(new double[] {
(par0PosDelta - par1PosDelta) / (PARAMS.par0YTicks - PARAMS.par1YTicks),
(par0PosVel.velocity - par1PosVel.velocity) / (PARAMS.par0YTicks - PARAMS.par1YTicks),
})
);
lastPar0Pos = par0PosVel.position;
lastPar1Pos = par1PosVel.position;
lastPerpPos = perpPosVel.position;
return twist;
}
}

View File

@ -0,0 +1,121 @@
package org.firstinspires.ftc.teamcode;
import com.acmerobotics.dashboard.config.Config;
import com.acmerobotics.roadrunner.DualNum;
import com.acmerobotics.roadrunner.Rotation2d;
import com.acmerobotics.roadrunner.Time;
import com.acmerobotics.roadrunner.Twist2dDual;
import com.acmerobotics.roadrunner.Vector2d;
import com.acmerobotics.roadrunner.Vector2dDual;
import com.acmerobotics.roadrunner.ftc.Encoder;
import com.acmerobotics.roadrunner.ftc.FlightRecorder;
import com.acmerobotics.roadrunner.ftc.OverflowEncoder;
import com.acmerobotics.roadrunner.ftc.PositionVelocityPair;
import com.acmerobotics.roadrunner.ftc.RawEncoder;
import com.qualcomm.robotcore.hardware.DcMotorEx;
import com.qualcomm.robotcore.hardware.DcMotorSimple;
import com.qualcomm.robotcore.hardware.HardwareMap;
import com.qualcomm.robotcore.hardware.IMU;
import org.firstinspires.ftc.robotcore.external.navigation.AngleUnit;
import org.firstinspires.ftc.robotcore.external.navigation.AngularVelocity;
import org.firstinspires.ftc.robotcore.external.navigation.YawPitchRollAngles;
import org.firstinspires.ftc.teamcode.messages.TwoDeadWheelInputsMessage;
@Config
public final class TwoDeadWheelLocalizer implements Localizer {
public static class Params {
public double parYTicks = 0.0; // y position of the parallel encoder (in tick units)
public double perpXTicks = 0.0; // x position of the perpendicular encoder (in tick units)
}
public static Params PARAMS = new Params();
public final Encoder par, perp;
public final IMU imu;
private int lastParPos, lastPerpPos;
private Rotation2d lastHeading;
private final double inPerTick;
private double lastRawHeadingVel, headingVelOffset;
private boolean initialized;
public TwoDeadWheelLocalizer(HardwareMap hardwareMap, IMU imu, double inPerTick) {
// TODO: make sure your config has **motors** with these names (or change them)
// the encoders should be plugged into the slot matching the named motor
// see https://ftc-docs.firstinspires.org/en/latest/hardware_and_software_configuration/configuring/index.html
par = new OverflowEncoder(new RawEncoder(hardwareMap.get(DcMotorEx.class, "par")));
perp = new OverflowEncoder(new RawEncoder(hardwareMap.get(DcMotorEx.class, "perp")));
// TODO: reverse encoder directions if needed
// par.setDirection(DcMotorSimple.Direction.REVERSE);
this.imu = imu;
this.inPerTick = inPerTick;
FlightRecorder.write("TWO_DEAD_WHEEL_PARAMS", PARAMS);
}
public Twist2dDual<Time> update() {
PositionVelocityPair parPosVel = par.getPositionAndVelocity();
PositionVelocityPair perpPosVel = perp.getPositionAndVelocity();
YawPitchRollAngles angles = imu.getRobotYawPitchRollAngles();
AngularVelocity angularVelocity = imu.getRobotAngularVelocity(AngleUnit.RADIANS);
FlightRecorder.write("TWO_DEAD_WHEEL_INPUTS", new TwoDeadWheelInputsMessage(parPosVel, perpPosVel, angles, angularVelocity));
Rotation2d heading = Rotation2d.exp(angles.getYaw(AngleUnit.RADIANS));
// see https://github.com/FIRST-Tech-Challenge/FtcRobotController/issues/617
double rawHeadingVel = angularVelocity.zRotationRate;
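// The raw z rotation rate can wrap around (see the issue linked above); if a jump larger than PI
// is seen between reads, accumulate a 2*PI offset so headingVel stays continuous.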
if (Math.abs(rawHeadingVel - lastRawHeadingVel) > Math.PI) {
headingVelOffset -= Math.signum(rawHeadingVel) * 2 * Math.PI;
}
lastRawHeadingVel = rawHeadingVel;
double headingVel = headingVelOffset + rawHeadingVel;
if (!initialized) {
initialized = true;
lastParPos = parPosVel.position;
lastPerpPos = perpPosVel.position;
lastHeading = heading;
return new Twist2dDual<>(
Vector2dDual.constant(new Vector2d(0.0, 0.0), 2),
DualNum.constant(0.0, 2)
);
}
int parPosDelta = parPosVel.position - lastParPos;
int perpPosDelta = perpPosVel.position - lastPerpPos;
double headingDelta = heading.minus(lastHeading);
Twist2dDual<Time> twist = new Twist2dDual<>(
new Vector2dDual<>(
new DualNum<Time>(new double[] {
parPosDelta - PARAMS.parYTicks * headingDelta,
parPosVel.velocity - PARAMS.parYTicks * headingVel,
}).times(inPerTick),
new DualNum<Time>(new double[] {
perpPosDelta - PARAMS.perpXTicks * headingDelta,
perpPosVel.velocity - PARAMS.perpXTicks * headingVel,
}).times(inPerTick)
),
new DualNum<>(new double[] {
headingDelta,
headingVel,
})
);
lastParPos = parPosVel.position;
lastPerpPos = perpPosVel.position;
lastHeading = heading;
return twist;
}
}

View File

@ -0,0 +1,24 @@
package org.firstinspires.ftc.teamcode.messages;
import com.acmerobotics.roadrunner.PoseVelocity2dDual;
import com.acmerobotics.roadrunner.Time;
public final class DriveCommandMessage {
public long timestamp;
public double forwardVelocity;
public double forwardAcceleration;
public double lateralVelocity;
public double lateralAcceleration;
public double angularVelocity;
public double angularAcceleration;
public DriveCommandMessage(PoseVelocity2dDual<Time> poseVelocity) {
this.timestamp = System.nanoTime();
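// index 0 of each DualNum is the value (velocity); index 1 is its time derivative (acceleration)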
this.forwardVelocity = poseVelocity.linearVel.x.get(0);
this.forwardAcceleration = poseVelocity.linearVel.x.get(1);
this.lateralVelocity = poseVelocity.linearVel.y.get(0);
this.lateralAcceleration = poseVelocity.linearVel.y.get(1);
this.angularVelocity = poseVelocity.angVel.get(0);
this.angularAcceleration = poseVelocity.angVel.get(1);
}
}

View File

@ -0,0 +1,19 @@
package org.firstinspires.ftc.teamcode.messages;
public final class MecanumCommandMessage {
public long timestamp;
public double voltage;
public double leftFrontPower;
public double leftBackPower;
public double rightBackPower;
public double rightFrontPower;
public MecanumCommandMessage(double voltage, double leftFrontPower, double leftBackPower, double rightBackPower, double rightFrontPower) {
this.timestamp = System.nanoTime();
this.voltage = voltage;
this.leftFrontPower = leftFrontPower;
this.leftBackPower = leftBackPower;
this.rightBackPower = rightBackPower;
this.rightFrontPower = rightFrontPower;
}
}

View File

@ -0,0 +1,30 @@
package org.firstinspires.ftc.teamcode.messages;
import com.acmerobotics.roadrunner.ftc.PositionVelocityPair;
import org.firstinspires.ftc.robotcore.external.navigation.AngleUnit;
import org.firstinspires.ftc.robotcore.external.navigation.YawPitchRollAngles;
public final class MecanumLocalizerInputsMessage {
public long timestamp;
public PositionVelocityPair leftFront;
public PositionVelocityPair leftBack;
public PositionVelocityPair rightBack;
public PositionVelocityPair rightFront;
public double yaw;
public double pitch;
public double roll;
public MecanumLocalizerInputsMessage(PositionVelocityPair leftFront, PositionVelocityPair leftBack, PositionVelocityPair rightBack, PositionVelocityPair rightFront, YawPitchRollAngles angles) {
this.timestamp = System.nanoTime();
this.leftFront = leftFront;
this.leftBack = leftBack;
this.rightBack = rightBack;
this.rightFront = rightFront;
{
this.yaw = angles.getYaw(AngleUnit.RADIANS);
this.pitch = angles.getPitch(AngleUnit.RADIANS);
this.roll = angles.getRoll(AngleUnit.RADIANS);
}
}
}

View File

@ -0,0 +1,18 @@
package org.firstinspires.ftc.teamcode.messages;
import com.acmerobotics.roadrunner.Pose2d;
public final class PoseMessage {
public long timestamp;
public double x;
public double y;
public double heading;
public PoseMessage(Pose2d pose) {
this.timestamp = System.nanoTime();
this.x = pose.position.x;
this.y = pose.position.y;
this.heading = pose.heading.toDouble();
}
}

View File

@ -0,0 +1,15 @@
package org.firstinspires.ftc.teamcode.messages;
public final class TankCommandMessage {
public long timestamp;
public double voltage;
public double leftPower;
public double rightPower;
public TankCommandMessage(double voltage, double leftPower, double rightPower) {
this.timestamp = System.nanoTime();
this.voltage = voltage;
this.leftPower = leftPower;
this.rightPower = rightPower;
}
}

View File

@ -0,0 +1,17 @@
package org.firstinspires.ftc.teamcode.messages;
import com.acmerobotics.roadrunner.ftc.PositionVelocityPair;
import java.util.List;
public final class TankLocalizerInputsMessage {
public long timestamp;
public PositionVelocityPair[] left;
public PositionVelocityPair[] right;
public TankLocalizerInputsMessage(List<PositionVelocityPair> left, List<PositionVelocityPair> right) {
this.timestamp = System.nanoTime();
this.left = left.toArray(new PositionVelocityPair[0]);
this.right = right.toArray(new PositionVelocityPair[0]);
}
}

View File

@ -0,0 +1,17 @@
package org.firstinspires.ftc.teamcode.messages;
import com.acmerobotics.roadrunner.ftc.PositionVelocityPair;
public final class ThreeDeadWheelInputsMessage {
public long timestamp;
public PositionVelocityPair par0;
public PositionVelocityPair par1;
public PositionVelocityPair perp;
public ThreeDeadWheelInputsMessage(PositionVelocityPair par0, PositionVelocityPair par1, PositionVelocityPair perp) {
this.timestamp = System.nanoTime();
this.par0 = par0;
this.par1 = par1;
this.perp = perp;
}
}

View File

@ -0,0 +1,35 @@
package org.firstinspires.ftc.teamcode.messages;
import com.acmerobotics.roadrunner.ftc.PositionVelocityPair;
import org.firstinspires.ftc.robotcore.external.navigation.AngleUnit;
import org.firstinspires.ftc.robotcore.external.navigation.AngularVelocity;
import org.firstinspires.ftc.robotcore.external.navigation.YawPitchRollAngles;
public final class TwoDeadWheelInputsMessage {
public long timestamp;
public PositionVelocityPair par;
public PositionVelocityPair perp;
public double yaw;
public double pitch;
public double roll;
public double xRotationRate;
public double yRotationRate;
public double zRotationRate;
public TwoDeadWheelInputsMessage(PositionVelocityPair par, PositionVelocityPair perp, YawPitchRollAngles angles, AngularVelocity angularVelocity) {
this.timestamp = System.nanoTime();
this.par = par;
this.perp = perp;
{
this.yaw = angles.getYaw(AngleUnit.RADIANS);
this.pitch = angles.getPitch(AngleUnit.RADIANS);
this.roll = angles.getRoll(AngleUnit.RADIANS);
}
{
this.xRotationRate = angularVelocity.xRotationRate;
this.yRotationRate = angularVelocity.yRotationRate;
this.zRotationRate = angularVelocity.zRotationRate;
}
}
}

View File

@ -1,131 +0,0 @@
## TeamCode Module
Welcome!
This module, TeamCode, is the place where you will write/paste the code for your team's
robot controller App. This module is currently empty (a clean slate) but the
process for adding OpModes is straightforward.
## Creating your own OpModes
The easiest way to create your own OpMode is to copy a Sample OpMode and make it your own.
Sample opmodes exist in the FtcRobotController module.
To locate these samples, find the FtcRobotController module in the "Project/Android" tab.
Expand the following tree elements:
FtcRobotController/java/org.firstinspires.ftc.robotcontroller/external/samples
### Naming of Samples
To gain a better understanding of how the samples are organized, and how to interpret the
naming system, it will help to understand the conventions that were used during their creation.
These conventions are described (in detail) in the sample_conventions.md file in this folder.
To summarize: A range of different sample classes reside in the java/external/samples folder.
The class names will follow a naming convention which indicates the purpose of each class.
The prefix of the name will be one of the following:
Basic: This is a minimally functional OpMode used to illustrate the skeleton/structure
of a particular style of OpMode. These are bare bones examples.
Sensor: This is a Sample OpMode that shows how to use a specific sensor.
It is not intended to drive a functioning robot; it simply shows the minimal code
required to read and display the sensor values.
Robot: This is a Sample OpMode that assumes a simple two-motor (differential) drive base.
It may be used to provide a common baseline driving OpMode, or
to demonstrate how a particular sensor or concept can be used to navigate.
Concept: This is a sample OpMode that illustrates performing a specific function or concept.
These may be complex, but their operation should be explained clearly in the comments,
or the comments should reference an external doc, guide or tutorial.
Each OpMode should try to demonstrate only a single concept so it is easy to
locate based on its name. These OpModes may not produce a drivable robot.
After the prefix, other conventions will apply:
* Sensor class names are constructed as: Sensor - Company - Type
* Robot class names are constructed as: Robot - Mode - Action - OpModetype
* Concept class names are constructed as: Concept - Topic - OpModetype
Once you are familiar with the range of samples available, you can choose one to be the
basis for your own robot. In all cases, the desired sample(s) needs to be copied into
your TeamCode module to be used.
This is done inside Android Studio directly, using the following steps:
1) Locate the desired sample class in the Project/Android tree.
2) Right click on the sample class and select "Copy"
3) Expand the TeamCode/java folder
4) Right click on the org.firstinspires.ftc.teamcode folder and select "Paste"
5) You will be prompted for a class name for the copy.
Choose something meaningful based on the purpose of this class.
Start with a capital letter, and remember that there may be more similar classes later.
Once your copy has been created, you should prepare it for use on your robot.
This is done by adjusting the OpMode's name, and enabling it to be displayed on the
Driver Station's OpMode list.
Each OpMode sample class begins with several lines of code like the ones shown below:
```
@TeleOp(name="Template: Linear OpMode", group="Linear Opmode")
@Disabled
```
The name that will appear on the driver station's "opmode list" is defined by the code:
``name="Template: Linear OpMode"``
You can change what appears between the quotes to better describe your opmode.
The "group=" portion of the code can be used to help organize your list of OpModes.
As shown, the current OpMode will NOT appear on the driver station's OpMode list because of the
``@Disabled`` annotation which has been included.
This line can simply be deleted, or commented out, to make the OpMode visible.
## ADVANCED Multi-Team App management: Cloning the TeamCode Module
In some situations, you have multiple teams in your club and you want them to all share
a common code organization, with each being able to *see* the others' code but each having
their own team module with their own code that they maintain themselves.
In this situation, you might wish to clone the TeamCode module, once for each of these teams.
Each of the clones would then appear alongside each other in the Android Studio module list,
together with the FtcRobotController module (and the original TeamCode module).
Specific team phones can then be programmed by selecting the desired module from the pulldown list
prior to clicking the green Run arrow.
Warning: This is not for the inexperienced software developer.
You will need to be comfortable with file manipulations and managing Android Studio modules.
These changes are performed OUTSIDE of Android Studio, so close Android Studio before you do this.
Also, make a full project backup before you start this :)
To clone TeamCode, do the following:
Note: Some names start with "Team" and others start with "team". This is intentional.
1) Using your operating system file management tools, copy the whole "TeamCode"
folder to a sibling folder with a corresponding new name, eg: "Team0417".
2) In the new Team0417 folder, delete the TeamCode.iml file.
3) In the new Team0417 folder, rename the "src/main/java/org/firstinspires/ftc/teamcode" folder
to a matching name with a lowercase 'team' eg: "team0417".
4) In the new Team0417/src/main folder, edit the "AndroidManifest.xml" file, change the line that contains
package="org.firstinspires.ftc.teamcode"
to be
package="org.firstinspires.ftc.team0417"
5) Add: include ':Team0417' to the "/settings.gradle" file.
6) Open up Android Studio and clean out any old files by using the menu item "Build/Clean Project".

View File

@ -0,0 +1,76 @@
package org.firstinspires.ftc.teamcode.tuning;
import com.acmerobotics.dashboard.FtcDashboard;
import com.acmerobotics.dashboard.telemetry.MultipleTelemetry;
import com.acmerobotics.dashboard.telemetry.TelemetryPacket;
import com.acmerobotics.roadrunner.Pose2d;
import com.acmerobotics.roadrunner.PoseVelocity2d;
import com.acmerobotics.roadrunner.Vector2d;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import org.firstinspires.ftc.teamcode.Drawing;
import org.firstinspires.ftc.teamcode.MecanumDrive;
import org.firstinspires.ftc.teamcode.TankDrive;
public class LocalizationTest extends LinearOpMode {
@Override
public void runOpMode() throws InterruptedException {
telemetry = new MultipleTelemetry(telemetry, FtcDashboard.getInstance().getTelemetry());
if (TuningOpModes.DRIVE_CLASS.equals(MecanumDrive.class)) {
MecanumDrive drive = new MecanumDrive(hardwareMap, new Pose2d(0, 0, 0));
waitForStart();
while (opModeIsActive()) {
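// Road Runner uses +x forward and +y left with counterclockwise-positive heading; the gamepad
// sticks report up as negative y and right as positive x, hence the negations below.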
drive.setDrivePowers(new PoseVelocity2d(
new Vector2d(
-gamepad1.left_stick_y,
-gamepad1.left_stick_x
),
-gamepad1.right_stick_x
));
drive.updatePoseEstimate();
telemetry.addData("x", drive.pose.position.x);
telemetry.addData("y", drive.pose.position.y);
telemetry.addData("heading (deg)", Math.toDegrees(drive.pose.heading.toDouble()));
telemetry.update();
TelemetryPacket packet = new TelemetryPacket();
packet.fieldOverlay().setStroke("#3F51B5");
Drawing.drawRobot(packet.fieldOverlay(), drive.pose);
FtcDashboard.getInstance().sendTelemetryPacket(packet);
}
} else if (TuningOpModes.DRIVE_CLASS.equals(TankDrive.class)) {
TankDrive drive = new TankDrive(hardwareMap, new Pose2d(0, 0, 0));
waitForStart();
while (opModeIsActive()) {
drive.setDrivePowers(new PoseVelocity2d(
new Vector2d(
-gamepad1.left_stick_y,
0.0
),
-gamepad1.right_stick_x
));
drive.updatePoseEstimate();
telemetry.addData("x", drive.pose.position.x);
telemetry.addData("y", drive.pose.position.y);
telemetry.addData("heading (deg)", Math.toDegrees(drive.pose.heading.toDouble()));
telemetry.update();
TelemetryPacket packet = new TelemetryPacket();
packet.fieldOverlay().setStroke("#3F51B5");
Drawing.drawRobot(packet.fieldOverlay(), drive.pose);
FtcDashboard.getInstance().sendTelemetryPacket(packet);
}
} else {
throw new RuntimeException();
}
}
}

View File

@ -0,0 +1,63 @@
package org.firstinspires.ftc.teamcode.tuning;
import com.acmerobotics.roadrunner.Pose2d;
import com.acmerobotics.roadrunner.ftc.Actions;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import org.firstinspires.ftc.teamcode.MecanumDrive;
import org.firstinspires.ftc.teamcode.TankDrive;
import org.firstinspires.ftc.teamcode.ThreeDeadWheelLocalizer;
import org.firstinspires.ftc.teamcode.TwoDeadWheelLocalizer;
public final class ManualFeedbackTuner extends LinearOpMode {
public static double DISTANCE = 64;
@Override
public void runOpMode() throws InterruptedException {
if (TuningOpModes.DRIVE_CLASS.equals(MecanumDrive.class)) {
MecanumDrive drive = new MecanumDrive(hardwareMap, new Pose2d(0, 0, 0));
if (drive.localizer instanceof TwoDeadWheelLocalizer) {
if (TwoDeadWheelLocalizer.PARAMS.perpXTicks == 0 && TwoDeadWheelLocalizer.PARAMS.parYTicks == 0) {
throw new RuntimeException("Odometry wheel locations not set! Run AngularRampLogger to tune them.");
}
} else if (drive.localizer instanceof ThreeDeadWheelLocalizer) {
if (ThreeDeadWheelLocalizer.PARAMS.perpXTicks == 0 && ThreeDeadWheelLocalizer.PARAMS.par0YTicks == 0 && ThreeDeadWheelLocalizer.PARAMS.par1YTicks == 1) {
throw new RuntimeException("Odometry wheel locations not set! Run AngularRampLogger to tune them.");
}
}
waitForStart();
while (opModeIsActive()) {
Actions.runBlocking(
drive.actionBuilder(new Pose2d(0, 0, 0))
.lineToX(DISTANCE)
.lineToX(0)
.build());
}
} else if (TuningOpModes.DRIVE_CLASS.equals(TankDrive.class)) {
TankDrive drive = new TankDrive(hardwareMap, new Pose2d(0, 0, 0));
if (drive.localizer instanceof TwoDeadWheelLocalizer) {
if (TwoDeadWheelLocalizer.PARAMS.perpXTicks == 0 && TwoDeadWheelLocalizer.PARAMS.parYTicks == 0) {
throw new RuntimeException("Odometry wheel locations not set! Run AngularRampLogger to tune them.");
}
} else if (drive.localizer instanceof ThreeDeadWheelLocalizer) {
if (ThreeDeadWheelLocalizer.PARAMS.perpXTicks == 0 && ThreeDeadWheelLocalizer.PARAMS.par0YTicks == 0 && ThreeDeadWheelLocalizer.PARAMS.par1YTicks == 1) {
throw new RuntimeException("Odometry wheel locations not set! Run AngularRampLogger to tune them.");
}
}
waitForStart();
while (opModeIsActive()) {
Actions.runBlocking(
drive.actionBuilder(new Pose2d(0, 0, 0))
.lineToX(DISTANCE)
.lineToX(0)
.build());
}
} else {
throw new RuntimeException();
}
}
}

View File

@ -0,0 +1,39 @@
package org.firstinspires.ftc.teamcode.tuning;
import com.acmerobotics.roadrunner.Pose2d;
import com.acmerobotics.roadrunner.Vector2d;
import com.acmerobotics.roadrunner.ftc.Actions;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import org.firstinspires.ftc.teamcode.MecanumDrive;
import org.firstinspires.ftc.teamcode.TankDrive;
public final class SplineTest extends LinearOpMode {
@Override
public void runOpMode() throws InterruptedException {
Pose2d beginPose = new Pose2d(0, 0, 0);
if (TuningOpModes.DRIVE_CLASS.equals(MecanumDrive.class)) {
MecanumDrive drive = new MecanumDrive(hardwareMap, beginPose);
waitForStart();
Actions.runBlocking(
drive.actionBuilder(beginPose)
.splineTo(new Vector2d(30, 30), Math.PI / 2)
.splineTo(new Vector2d(0, 60), Math.PI)
.build());
} else if (TuningOpModes.DRIVE_CLASS.equals(TankDrive.class)) {
TankDrive drive = new TankDrive(hardwareMap, beginPose);
waitForStart();
Actions.runBlocking(
drive.actionBuilder(beginPose)
.splineTo(new Vector2d(30, 30), Math.PI / 2)
.splineTo(new Vector2d(0, 60), Math.PI)
.build());
} else {
throw new RuntimeException();
}
}
}

View File

@ -0,0 +1,180 @@
package org.firstinspires.ftc.teamcode.tuning;
import com.acmerobotics.dashboard.FtcDashboard;
import com.acmerobotics.dashboard.config.reflection.ReflectionConfig;
import com.acmerobotics.roadrunner.MotorFeedforward;
import com.acmerobotics.roadrunner.Pose2d;
import com.acmerobotics.roadrunner.ftc.AngularRampLogger;
import com.acmerobotics.roadrunner.ftc.DeadWheelDirectionDebugger;
import com.acmerobotics.roadrunner.ftc.DriveType;
import com.acmerobotics.roadrunner.ftc.DriveView;
import com.acmerobotics.roadrunner.ftc.DriveViewFactory;
import com.acmerobotics.roadrunner.ftc.Encoder;
import com.acmerobotics.roadrunner.ftc.ForwardPushTest;
import com.acmerobotics.roadrunner.ftc.ForwardRampLogger;
import com.acmerobotics.roadrunner.ftc.LateralPushTest;
import com.acmerobotics.roadrunner.ftc.LateralRampLogger;
import com.acmerobotics.roadrunner.ftc.ManualFeedforwardTuner;
import com.acmerobotics.roadrunner.ftc.MecanumMotorDirectionDebugger;
import com.qualcomm.hardware.lynx.LynxModule;
import com.qualcomm.robotcore.eventloop.opmode.OpMode;
import com.qualcomm.robotcore.eventloop.opmode.OpModeManager;
import com.qualcomm.robotcore.eventloop.opmode.OpModeRegistrar;
import org.firstinspires.ftc.robotcore.internal.opmode.OpModeMeta;
import org.firstinspires.ftc.teamcode.MecanumDrive;
import org.firstinspires.ftc.teamcode.TankDrive;
import org.firstinspires.ftc.teamcode.ThreeDeadWheelLocalizer;
import org.firstinspires.ftc.teamcode.TwoDeadWheelLocalizer;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
public final class TuningOpModes {
// TODO: change this to TankDrive.class if you're using tank
public static final Class<?> DRIVE_CLASS = MecanumDrive.class;
public static final String GROUP = "quickstart";
public static final boolean DISABLED = false;
private TuningOpModes() {}
private static OpModeMeta metaForClass(Class<? extends OpMode> cls) {
return new OpModeMeta.Builder()
.setName(cls.getSimpleName())
.setGroup(GROUP)
.setFlavor(OpModeMeta.Flavor.TELEOP)
.build();
}
@OpModeRegistrar
public static void register(OpModeManager manager) {
if (DISABLED) return;
DriveViewFactory dvf;
if (DRIVE_CLASS.equals(MecanumDrive.class)) {
dvf = hardwareMap -> {
MecanumDrive md = new MecanumDrive(hardwareMap, new Pose2d(0, 0, 0));
List<Encoder> leftEncs = new ArrayList<>(), rightEncs = new ArrayList<>();
List<Encoder> parEncs = new ArrayList<>(), perpEncs = new ArrayList<>();
if (md.localizer instanceof MecanumDrive.DriveLocalizer) {
MecanumDrive.DriveLocalizer dl = (MecanumDrive.DriveLocalizer) md.localizer;
leftEncs.add(dl.leftFront);
leftEncs.add(dl.leftBack);
rightEncs.add(dl.rightFront);
rightEncs.add(dl.rightBack);
} else if (md.localizer instanceof ThreeDeadWheelLocalizer) {
ThreeDeadWheelLocalizer dl = (ThreeDeadWheelLocalizer) md.localizer;
parEncs.add(dl.par0);
parEncs.add(dl.par1);
perpEncs.add(dl.perp);
} else if (md.localizer instanceof TwoDeadWheelLocalizer) {
TwoDeadWheelLocalizer dl = (TwoDeadWheelLocalizer) md.localizer;
parEncs.add(dl.par);
perpEncs.add(dl.perp);
} else {
throw new RuntimeException("unknown localizer: " + md.localizer.getClass().getName());
}
return new DriveView(
DriveType.MECANUM,
MecanumDrive.PARAMS.inPerTick,
MecanumDrive.PARAMS.maxWheelVel,
MecanumDrive.PARAMS.minProfileAccel,
MecanumDrive.PARAMS.maxProfileAccel,
hardwareMap.getAll(LynxModule.class),
Arrays.asList(
md.leftFront,
md.leftBack
),
Arrays.asList(
md.rightFront,
md.rightBack
),
leftEncs,
rightEncs,
parEncs,
perpEncs,
md.lazyImu,
md.voltageSensor,
() -> new MotorFeedforward(MecanumDrive.PARAMS.kS,
MecanumDrive.PARAMS.kV / MecanumDrive.PARAMS.inPerTick,
MecanumDrive.PARAMS.kA / MecanumDrive.PARAMS.inPerTick)
);
};
} else if (DRIVE_CLASS.equals(TankDrive.class)) {
dvf = hardwareMap -> {
TankDrive td = new TankDrive(hardwareMap, new Pose2d(0, 0, 0));
List<Encoder> leftEncs = new ArrayList<>(), rightEncs = new ArrayList<>();
List<Encoder> parEncs = new ArrayList<>(), perpEncs = new ArrayList<>();
if (td.localizer instanceof TankDrive.DriveLocalizer) {
TankDrive.DriveLocalizer dl = (TankDrive.DriveLocalizer) td.localizer;
leftEncs.addAll(dl.leftEncs);
rightEncs.addAll(dl.rightEncs);
} else if (td.localizer instanceof ThreeDeadWheelLocalizer) {
ThreeDeadWheelLocalizer dl = (ThreeDeadWheelLocalizer) td.localizer;
parEncs.add(dl.par0);
parEncs.add(dl.par1);
perpEncs.add(dl.perp);
} else if (td.localizer instanceof TwoDeadWheelLocalizer) {
TwoDeadWheelLocalizer dl = (TwoDeadWheelLocalizer) td.localizer;
parEncs.add(dl.par);
perpEncs.add(dl.perp);
} else {
throw new RuntimeException("unknown localizer: " + td.localizer.getClass().getName());
}
return new DriveView(
DriveType.TANK,
TankDrive.PARAMS.inPerTick,
TankDrive.PARAMS.maxWheelVel,
TankDrive.PARAMS.minProfileAccel,
TankDrive.PARAMS.maxProfileAccel,
hardwareMap.getAll(LynxModule.class),
td.leftMotors,
td.rightMotors,
leftEncs,
rightEncs,
parEncs,
perpEncs,
td.lazyImu,
td.voltageSensor,
() -> new MotorFeedforward(TankDrive.PARAMS.kS,
TankDrive.PARAMS.kV / TankDrive.PARAMS.inPerTick,
TankDrive.PARAMS.kA / TankDrive.PARAMS.inPerTick)
);
};
} else {
throw new RuntimeException();
}
manager.register(metaForClass(AngularRampLogger.class), new AngularRampLogger(dvf));
manager.register(metaForClass(ForwardPushTest.class), new ForwardPushTest(dvf));
manager.register(metaForClass(ForwardRampLogger.class), new ForwardRampLogger(dvf));
manager.register(metaForClass(LateralPushTest.class), new LateralPushTest(dvf));
manager.register(metaForClass(LateralRampLogger.class), new LateralRampLogger(dvf));
manager.register(metaForClass(ManualFeedforwardTuner.class), new ManualFeedforwardTuner(dvf));
manager.register(metaForClass(MecanumMotorDirectionDebugger.class), new MecanumMotorDirectionDebugger(dvf));
manager.register(metaForClass(DeadWheelDirectionDebugger.class), new DeadWheelDirectionDebugger(dvf));
manager.register(metaForClass(ManualFeedbackTuner.class), ManualFeedbackTuner.class);
manager.register(metaForClass(SplineTest.class), SplineTest.class);
manager.register(metaForClass(LocalizationTest.class), LocalizationTest.class);
FtcDashboard.getInstance().withConfigRoot(configRoot -> {
for (Class<?> c : Arrays.asList(
AngularRampLogger.class,
ForwardRampLogger.class,
LateralRampLogger.class,
ManualFeedforwardTuner.class,
MecanumMotorDirectionDebugger.class,
ManualFeedbackTuner.class
)) {
configRoot.putVariable(c.getSimpleName(), ReflectionConfig.createVariableFromClass(c));
}
});
}
}

View File

@ -4,15 +4,15 @@ repositories {
}
dependencies {
implementation 'org.firstinspires.ftc:Inspection:9.0.1'
implementation 'org.firstinspires.ftc:Blocks:9.0.1'
implementation 'org.firstinspires.ftc:Tfod:9.0.1'
implementation 'org.firstinspires.ftc:RobotCore:9.0.1'
implementation 'org.firstinspires.ftc:RobotServer:9.0.1'
implementation 'org.firstinspires.ftc:OnBotJava:9.0.1'
implementation 'org.firstinspires.ftc:Hardware:9.0.1'
implementation 'org.firstinspires.ftc:FtcCommon:9.0.1'
implementation 'org.firstinspires.ftc:Vision:9.0.1'
implementation 'org.firstinspires.ftc:Inspection:9.2.0'
implementation 'org.firstinspires.ftc:Blocks:9.2.0'
implementation 'org.firstinspires.ftc:Tfod:9.2.0'
implementation 'org.firstinspires.ftc:RobotCore:9.2.0'
implementation 'org.firstinspires.ftc:RobotServer:9.2.0'
implementation 'org.firstinspires.ftc:OnBotJava:9.2.0'
implementation 'org.firstinspires.ftc:Hardware:9.2.0'
implementation 'org.firstinspires.ftc:FtcCommon:9.2.0'
implementation 'org.firstinspires.ftc:Vision:9.2.0'
implementation 'org.firstinspires.ftc:gameAssets-CenterStage:1.0.0'
implementation 'org.tensorflow:tensorflow-lite-task-vision:0.4.3'
runtimeOnly 'org.tensorflow:tensorflow-lite:2.12.0'

View File

@ -3,8 +3,8 @@
# https://developer.android.com/topic/libraries/support-library/androidx-rn
android.useAndroidX=true
# Automatically convert third-party libraries to use AndroidX
android.enableJetifier=true
# We no longer need to auto-convert third-party libraries to use AndroidX, which slowed down the build
android.enableJetifier=false
# Allow Gradle to use up to 1 GB of RAM
org.gradle.jvmargs=-Xmx1024M