23 Commits

SHA1 Message Date
6d0eadecd1 Add list of IMUs on current robot roster 2024-06-30 10:30:57 -07:00
b779a61a7c Updating to use Roadrunner 2024-06-30 09:57:10 -07:00
1a1ef3b2dc Add first pass of Roadrunner class files 2024-06-22 11:10:36 -07:00
4ecd419511 Updated movement library to the updated orientation of the motors, added speed control via bumpers 2024-06-22 11:02:18 -07:00
9e8609e68d Complete refactoring of classes; added additional demo for 04/20/2024 event at SRJC 2024-04-20 10:45:09 -07:00
832de95fc1 Added dashboard to project and commented out a class that doesn't exist (yet) 2024-04-19 22:46:45 -07:00
63158215d0 Added sample Autonomous class file 2024-04-01 22:21:28 -07:00
6e6e626816 Add sample MovementLibrary file 2024-04-01 22:21:01 -07:00
b5aca8b93c Add sample Constants file 2024-04-01 22:20:44 -07:00
8a0aa126b6 Merge branch 'branch-carlos' of https://gitea.scdsrobotics.org/SCDS/FtcRobotControllerSDK into branch-carlos 2024-04-01 21:48:13 -07:00
6938e09b2b Add sample file 2024-04-01 21:45:45 -07:00
e86c31771e Added dashboard to project 2024-03-26 15:54:40 -07:00
0baf961680 Adding sample files 2024-03-26 15:52:51 -07:00
1da45a36c6 Merge pull request #941 from FIRST-Tech-Challenge/20240215-115542-release-candidate (FtcRobotController v9.1) 2024-02-16 14:13:01 -08:00
c303962469 FtcRobotController v9.1 2024-02-15 13:10:49 -08:00
f326c0d033 Merge pull request #731 from FIRST-Tech-Challenge/20230929-083754-release-candidate (FtcRobotController v9.0.1) 2023-09-29 19:13:38 +09:00
c023e97a6a FtcRobotController v9.0.1 2023-09-29 19:11:41 +09:00
f3a5a54f67 Merge pull request #674 from FIRST-Tech-Challenge/20230830-154348-release-candidate (FtcRobotController v9.0) 2023-09-10 01:34:35 +09:00
4de7b1e7c7 FtcRobotController v9.0 2023-08-30 17:44:45 +09:00
660a2f63bc Merge pull request #654 from FIRST-Tech-Challenge/20230707-131020-release-candidate (FtcRobotController v8.2) 2023-07-11 09:06:22 -07:00
5c8b4c448b FtcRobotController v8.2 2023-07-07 15:15:11 -07:00
0879b4797f Merge pull request #506 from FIRST-Tech-Challenge/20221201-150726-release-candidate (FtcRobotController v8.1.1) 2022-12-02 15:39:27 -08:00
3383440328 FtcRobotController v8.1.1 2022-12-02 15:13:28 -08:00
108 changed files with 4523 additions and 2430 deletions

View File

@@ -18,7 +18,7 @@ If what you've read so far makes little sense, there are some very good git lear
[Git Book](https://git-scm.com/book/en/v2)
[Interactive Git Tutorial](https://try.github.io)
##### Guidelines for experienced Git users.
### Guidelines for experienced Git users.
If you are absolutely certain that you want to push the big green button above, read on. Otherwise, back _slowly away from the keyboard_.
@@ -38,9 +38,9 @@ This section guides you through filing a bug report. The better the report the
#### Before submitting a bug report
- Check the [forums](http://ftcforum.usfirst.org/forum.php) to see if someone else has run into the problem and whether there is an official solution that doesn't require a new SDK.
- Check the [forums](http://ftcforum.firstinspires.org/forum.php) to see if someone else has run into the problem and whether there is an official solution that doesn't require a new SDK.
- Perform a search of current [issues](https://github.com/ftctechnh/ftc_app/issues) to see if the problem has already been reported. If so, add a comment to the existing issue instead of creating a new one.
- Perform a search of current [issues](https://github.com/FIRST-Tech-Challenge/FtcRobotController/issues) to see if the problem has already been reported. If so, add a comment to the existing issue instead of creating a new one.
#### How Do I Submit A (Good) Bug Report?
@@ -65,4 +65,4 @@ FIRST volunteers are awesome. You all have great ideas and we want to hear them
Enhancements should be broadly applicable to a large majority of teams, should not force teams to change their workflow, and should provide real value to the mission of FIRST as it relates to engaging youth in engineering activities.
The best way to get momentum behind new features is to post a description of your idea in the forums. Build community support for it. The FTC Technology Team monitors the forums. We'll hear you and if there's a large enough call for the feature it's very likely to get put on the list for a future release.
The best way to get momentum behind new features is to post a description of your idea in the discussions section of this repository. Build community support for it. The FTC Technology Team monitors the discussions. We'll hear you and if there's a large enough call for the feature it's very likely to get put on the list for a future release.

.gitignore vendored (18 changed lines)
View File

@@ -9,8 +9,9 @@
# Files for the ART/Dalvik VM
*.dex
# Java class files
# Java/JDK files
*.class
*.hprof
# Generated files
bin/
@@ -40,17 +41,10 @@ captures/
# IntelliJ
*.iml
.idea/workspace.xml
.idea/tasks.xml
.idea/gradle.xml
.idea/assetWizardSettings.xml
.idea/dictionaries
.idea/libraries
# Android Studio 3 in .gitignore file.
.idea/caches
.idea/modules.xml
# Comment next line if keeping position of elements in Navigation Editor is relevant for you
.idea/navEditor.xml
.idea/
# For Mac users
.DS_Store
# Keystore files
# Uncomment the following lines if you do not want to check your keystore files in.

View File

@@ -8,7 +8,7 @@ apply plugin: 'com.android.library'
android {
defaultConfig {
minSdkVersion 23
minSdkVersion 24
//noinspection ExpiredTargetSdkVersion
targetSdkVersion 28
buildConfigField "String", "APP_BUILD_TIME", '"' + (new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss.SSSZ", Locale.ROOT).format(new Date())) + '"'
@@ -17,8 +17,8 @@ android {
compileSdkVersion 29
compileOptions {
sourceCompatibility JavaVersion.VERSION_1_7
targetCompatibility JavaVersion.VERSION_1_7
sourceCompatibility JavaVersion.VERSION_1_8
targetCompatibility JavaVersion.VERSION_1_8
}
namespace = 'com.qualcomm.ftcrobotcontroller'
}
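The compileOptions bump from VERSION_1_7 to VERSION_1_8 raises the Java language level for the whole module, which is what allows lambdas and java.util.function types in team code. A minimal standalone sketch (hypothetical class name, not part of the SDK) of something that compiles under Java 8 but would be rejected under Java 7:

import java.util.function.DoubleUnaryOperator;

public class Java8Check {
    public static void main(String[] args) {
        // A lambda like this requires sourceCompatibility VERSION_1_8 or later.
        DoubleUnaryOperator clipPower = p -> Math.max(-1.0, Math.min(1.0, p));
        System.out.println(clipPower.applyAsDouble(1.7)); // prints 1.0
    }
}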

View File

@@ -1,8 +1,8 @@
<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:tools="http://schemas.android.com/tools"
android:versionCode="48"
android:versionName="8.1">
android:versionCode="53"
android:versionName="9.1">
<uses-permission android:name="android.permission.RECEIVE_BOOT_COMPLETED" />

View File

@@ -1,9 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<QCARConfig xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="qcar_config.xsd">
<Tracking>
<ImageTarget name="Wheels" size="254.000000 184.154922" />
<ImageTarget name="Tools" size="254.000000 184.154922" />
<ImageTarget name="Legos" size="254.000000 184.154922" />
<ImageTarget name="Gears" size="254.000000 184.154922" />
</Tracking>
</QCARConfig>

View File

@@ -1,6 +0,0 @@
<?xml version='1.0' encoding='UTF-8'?>
<QCARConfig>
<Tracking>
<VuMark name="RelicRecovery" size="304.80000376701355 223.630235354" />
</Tracking>
</QCARConfig>

View File

@@ -1,7 +0,0 @@
<?xml version="1.0"?>
<QCARConfig xsi:noNamespaceSchemaLocation="qcar_config.xsd" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<Tracking>
<ImageTarget name="stones" size="247 173"/>
<ImageTarget name="chips" size="247 173"/>
</Tracking>
</QCARConfig>

View File

@@ -1,10 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<QCARConfig xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="qcar_config.xsd">
<Tracking>
<ImageTarget name="BlueTowerGoal" size="257.299988 171.533325" />
<ImageTarget name="RedTowerGoal" size="257.299988 171.533325" />
<ImageTarget name="RedAlliance" size="242.600006 171.430405" />
<ImageTarget name="BlueAlliance" size="252.500000 171.466522" />
<ImageTarget name="FrontWall" size="177.800003 177.800003" />
</Tracking>
</QCARConfig>

View File

@@ -35,7 +35,7 @@ import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.util.ElapsedTime;
/**
/*
* This file contains an example of a Linear "OpMode".
* An OpMode is a 'program' that runs in either the autonomous or the teleop period of an FTC match.
* The names of OpModes appear on the menu of the FTC Driver Station.
@@ -60,10 +60,10 @@ import com.qualcomm.robotcore.util.ElapsedTime;
* the direction of all 4 motors (see code below).
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this opmode to the Driver Station OpMode list
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*/
@TeleOp(name="Basic: Omni Linear OpMode", group="Linear Opmode")
@TeleOp(name="Basic: Omni Linear OpMode", group="Linear OpMode")
@Disabled
public class BasicOmniOpMode_Linear extends LinearOpMode {

View File

@@ -36,7 +36,7 @@ import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.util.ElapsedTime;
import com.qualcomm.robotcore.util.Range;
/**
/*
* This file contains an example of an iterative (Non-Linear) "OpMode".
* An OpMode is a 'program' that runs in either the autonomous or the teleop period of an FTC match.
* The names of OpModes appear on the menu of the FTC Driver Station.
@@ -47,10 +47,10 @@ import com.qualcomm.robotcore.util.Range;
* It includes all the skeletal structure that all iterative OpModes contain.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this opmode to the Driver Station OpMode list
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*/
@TeleOp(name="Basic: Iterative OpMode", group="Iterative Opmode")
@TeleOp(name="Basic: Iterative OpMode", group="Iterative OpMode")
@Disabled
public class BasicOpMode_Iterative extends OpMode
{

View File

@@ -29,15 +29,15 @@
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.util.ElapsedTime;
import com.qualcomm.robotcore.util.Range;
/**
/*
* This file contains a minimal example of a Linear "OpMode". An OpMode is a 'program' that runs in either
* the autonomous or the teleop period of an FTC match. The names of OpModes appear on the menu
* of the FTC Driver Station. When a selection is made from the menu, the corresponding OpMode
@@ -47,10 +47,10 @@ import com.qualcomm.robotcore.util.Range;
* It includes all the skeletal structure that all linear OpModes contain.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this opmode to the Driver Station OpMode list
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*/
@TeleOp(name="Basic: Linear OpMode", group="Linear Opmode")
@TeleOp(name="Basic: Linear OpMode", group="Linear OpMode")
@Disabled
public class BasicOpMode_Linear extends LinearOpMode {

View File

@@ -0,0 +1,217 @@
/* Copyright (c) 2023 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import android.util.Size;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import org.firstinspires.ftc.robotcore.external.hardware.camera.BuiltinCameraDirection;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.firstinspires.ftc.vision.VisionPortal;
import org.firstinspires.ftc.vision.apriltag.AprilTagDetection;
import org.firstinspires.ftc.vision.apriltag.AprilTagProcessor;
import java.util.List;
/*
* This OpMode illustrates the basics of AprilTag recognition and pose estimation,
* including Java Builder structures for specifying Vision parameters.
*
* For an introduction to AprilTags, see the FTC-DOCS link below:
* https://ftc-docs.firstinspires.org/en/latest/apriltag/vision_portal/apriltag_intro/apriltag-intro.html
*
* In this sample, any visible tag ID will be detected and displayed, but only tags that are included in the default
* "TagLibrary" will have their position and orientation information displayed. This default TagLibrary contains
* the current Season's AprilTags and a small set of "test Tags" in the high number range.
*
* When an AprilTag in the TagLibrary is detected, the SDK provides location and orientation of the tag, relative to the camera.
* This information is provided in the "ftcPose" member of the returned "detection", and is explained in the ftc-docs page linked below.
* https://ftc-docs.firstinspires.org/apriltag-detection-values
*
* To experiment with using AprilTags to navigate, try out these two driving samples:
* RobotAutoDriveToAprilTagOmni and RobotAutoDriveToAprilTagTank
*
* There are many "default" VisionPortal and AprilTag configuration parameters that may be overridden if desired.
* These default parameters are shown as comments in the code below.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*/
@TeleOp(name = "Concept: AprilTag", group = "Concept")
@Disabled
public class ConceptAprilTag extends LinearOpMode {
private static final boolean USE_WEBCAM = true; // true for webcam, false for phone camera
/**
* The variable to store our instance of the AprilTag processor.
*/
private AprilTagProcessor aprilTag;
/**
* The variable to store our instance of the vision portal.
*/
private VisionPortal visionPortal;
@Override
public void runOpMode() {
initAprilTag();
// Wait for the DS start button to be touched.
telemetry.addData("DS preview on/off", "3 dots, Camera Stream");
telemetry.addData(">", "Touch Play to start OpMode");
telemetry.update();
waitForStart();
if (opModeIsActive()) {
while (opModeIsActive()) {
telemetryAprilTag();
// Push telemetry to the Driver Station.
telemetry.update();
// Save CPU resources; can resume streaming when needed.
if (gamepad1.dpad_down) {
visionPortal.stopStreaming();
} else if (gamepad1.dpad_up) {
visionPortal.resumeStreaming();
}
// Share the CPU.
sleep(20);
}
}
// Save more CPU resources when camera is no longer needed.
visionPortal.close();
} // end method runOpMode()
/**
* Initialize the AprilTag processor.
*/
private void initAprilTag() {
// Create the AprilTag processor.
aprilTag = new AprilTagProcessor.Builder()
// The following default settings are available to un-comment and edit as needed.
//.setDrawAxes(false)
//.setDrawCubeProjection(false)
//.setDrawTagOutline(true)
//.setTagFamily(AprilTagProcessor.TagFamily.TAG_36h11)
//.setTagLibrary(AprilTagGameDatabase.getCenterStageTagLibrary())
//.setOutputUnits(DistanceUnit.INCH, AngleUnit.DEGREES)
// == CAMERA CALIBRATION ==
// If you do not manually specify calibration parameters, the SDK will attempt
// to load a predefined calibration for your camera.
//.setLensIntrinsics(578.272, 578.272, 402.145, 221.506)
// ... these parameters are fx, fy, cx, cy.
.build();
// Adjust Image Decimation to trade-off detection-range for detection-rate.
// eg: Some typical detection data using a Logitech C920 WebCam
// Decimation = 1 .. Detect 2" Tag from 10 feet away at 10 Frames per second
// Decimation = 2 .. Detect 2" Tag from 6 feet away at 22 Frames per second
// Decimation = 3 .. Detect 2" Tag from 4 feet away at 30 Frames Per Second (default)
// Decimation = 3 .. Detect 5" Tag from 10 feet away at 30 Frames Per Second (default)
// Note: Decimation can be changed on-the-fly to adapt during a match.
//aprilTag.setDecimation(3);
// Create the vision portal by using a builder.
VisionPortal.Builder builder = new VisionPortal.Builder();
// Set the camera (webcam vs. built-in RC phone camera).
if (USE_WEBCAM) {
builder.setCamera(hardwareMap.get(WebcamName.class, "Webcam 1"));
} else {
builder.setCamera(BuiltinCameraDirection.BACK);
}
// Choose a camera resolution. Not all cameras support all resolutions.
//builder.setCameraResolution(new Size(640, 480));
// Enable the RC preview (LiveView). Set "false" to omit camera monitoring.
//builder.enableLiveView(true);
// Set the stream format; MJPEG uses less bandwidth than default YUY2.
//builder.setStreamFormat(VisionPortal.StreamFormat.YUY2);
// Choose whether or not LiveView stops if no processors are enabled.
// If set "true", monitor shows solid orange screen if no processors enabled.
// If set "false", monitor shows camera view without annotations.
//builder.setAutoStopLiveView(false);
// Set and enable the processor.
builder.addProcessor(aprilTag);
// Build the Vision Portal, using the above settings.
visionPortal = builder.build();
// Disable or re-enable the aprilTag processor at any time.
//visionPortal.setProcessorEnabled(aprilTag, true);
} // end method initAprilTag()
/**
* Add telemetry about AprilTag detections.
*/
private void telemetryAprilTag() {
List<AprilTagDetection> currentDetections = aprilTag.getDetections();
telemetry.addData("# AprilTags Detected", currentDetections.size());
// Step through the list of detections and display info for each one.
for (AprilTagDetection detection : currentDetections) {
if (detection.metadata != null) {
telemetry.addLine(String.format("\n==== (ID %d) %s", detection.id, detection.metadata.name));
telemetry.addLine(String.format("XYZ %6.1f %6.1f %6.1f (inch)", detection.ftcPose.x, detection.ftcPose.y, detection.ftcPose.z));
telemetry.addLine(String.format("PRY %6.1f %6.1f %6.1f (deg)", detection.ftcPose.pitch, detection.ftcPose.roll, detection.ftcPose.yaw));
telemetry.addLine(String.format("RBE %6.1f %6.1f %6.1f (inch, deg, deg)", detection.ftcPose.range, detection.ftcPose.bearing, detection.ftcPose.elevation));
} else {
telemetry.addLine(String.format("\n==== (ID %d) Unknown", detection.id));
telemetry.addLine(String.format("Center %6.0f %6.0f (pixels)", detection.center.x, detection.center.y));
}
} // end for() loop
// Add "key" information to telemetry
telemetry.addLine("\nkey:\nXYZ = X (Right), Y (Forward), Z (Up) dist.");
telemetry.addLine("PRY = Pitch, Roll & Yaw (XYZ Rotation)");
telemetry.addLine("RBE = Range, Bearing & Elevation");
} // end method telemetryAprilTag()
} // end class
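As a follow-on to the detection loop above, a team will often want to act on one specific tag rather than iterate over all of them. A hedged sketch of a helper that could sit alongside telemetryAprilTag() in this class; DESIRED_TAG_ID and the method name are illustrative assumptions, not SDK API:

private static final int DESIRED_TAG_ID = 5; // hypothetical tag of interest

// Return the requested tag if it is currently visible and in the TagLibrary, else null.
private AprilTagDetection findTag(int id) {
    for (AprilTagDetection detection : aprilTag.getDetections()) {
        // Only tags with metadata carry a usable ftcPose for navigation.
        if (detection.metadata != null && detection.id == id) {
            return detection;
        }
    }
    return null; // tag not in view this frame
}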

View File

@@ -0,0 +1,163 @@
/* Copyright (c) 2023 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import org.firstinspires.ftc.robotcore.external.hardware.camera.BuiltinCameraDirection;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.firstinspires.ftc.vision.VisionPortal;
import org.firstinspires.ftc.vision.apriltag.AprilTagDetection;
import org.firstinspires.ftc.vision.apriltag.AprilTagProcessor;
import java.util.List;
/*
* This OpMode illustrates the basics of AprilTag recognition and pose estimation, using
* the easy way.
*
* For an introduction to AprilTags, see the FTC-DOCS link below:
* https://ftc-docs.firstinspires.org/en/latest/apriltag/vision_portal/apriltag_intro/apriltag-intro.html
*
* In this sample, any visible tag ID will be detected and displayed, but only tags that are included in the default
* "TagLibrary" will have their position and orientation information displayed. This default TagLibrary contains
* the current Season's AprilTags and a small set of "test Tags" in the high number range.
*
* When an AprilTag in the TagLibrary is detected, the SDK provides location and orientation of the tag, relative to the camera.
* This information is provided in the "ftcPose" member of the returned "detection", and is explained in the ftc-docs page linked below.
* https://ftc-docs.firstinspires.org/apriltag-detection-values
*
* To experiment with using AprilTags to navigate, try out these two driving samples:
* RobotAutoDriveToAprilTagOmni and RobotAutoDriveToAprilTagTank
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*/
@TeleOp(name = "Concept: AprilTag Easy", group = "Concept")
@Disabled
public class ConceptAprilTagEasy extends LinearOpMode {
private static final boolean USE_WEBCAM = true; // true for webcam, false for phone camera
/**
* The variable to store our instance of the AprilTag processor.
*/
private AprilTagProcessor aprilTag;
/**
* The variable to store our instance of the vision portal.
*/
private VisionPortal visionPortal;
@Override
public void runOpMode() {
initAprilTag();
// Wait for the DS start button to be touched.
telemetry.addData("DS preview on/off", "3 dots, Camera Stream");
telemetry.addData(">", "Touch Play to start OpMode");
telemetry.update();
waitForStart();
if (opModeIsActive()) {
while (opModeIsActive()) {
telemetryAprilTag();
// Push telemetry to the Driver Station.
telemetry.update();
// Save CPU resources; can resume streaming when needed.
if (gamepad1.dpad_down) {
visionPortal.stopStreaming();
} else if (gamepad1.dpad_up) {
visionPortal.resumeStreaming();
}
// Share the CPU.
sleep(20);
}
}
// Save more CPU resources when camera is no longer needed.
visionPortal.close();
} // end method runOpMode()
/**
* Initialize the AprilTag processor.
*/
private void initAprilTag() {
// Create the AprilTag processor the easy way.
aprilTag = AprilTagProcessor.easyCreateWithDefaults();
// Create the vision portal the easy way.
if (USE_WEBCAM) {
visionPortal = VisionPortal.easyCreateWithDefaults(
hardwareMap.get(WebcamName.class, "Webcam 1"), aprilTag);
} else {
visionPortal = VisionPortal.easyCreateWithDefaults(
BuiltinCameraDirection.BACK, aprilTag);
}
} // end method initAprilTag()
/**
* Add telemetry about AprilTag detections.
*/
private void telemetryAprilTag() {
List<AprilTagDetection> currentDetections = aprilTag.getDetections();
telemetry.addData("# AprilTags Detected", currentDetections.size());
// Step through the list of detections and display info for each one.
for (AprilTagDetection detection : currentDetections) {
if (detection.metadata != null) {
telemetry.addLine(String.format("\n==== (ID %d) %s", detection.id, detection.metadata.name));
telemetry.addLine(String.format("XYZ %6.1f %6.1f %6.1f (inch)", detection.ftcPose.x, detection.ftcPose.y, detection.ftcPose.z));
telemetry.addLine(String.format("PRY %6.1f %6.1f %6.1f (deg)", detection.ftcPose.pitch, detection.ftcPose.roll, detection.ftcPose.yaw));
telemetry.addLine(String.format("RBE %6.1f %6.1f %6.1f (inch, deg, deg)", detection.ftcPose.range, detection.ftcPose.bearing, detection.ftcPose.elevation));
} else {
telemetry.addLine(String.format("\n==== (ID %d) Unknown", detection.id));
telemetry.addLine(String.format("Center %6.0f %6.0f (pixels)", detection.center.x, detection.center.y));
}
} // end for() loop
// Add "key" information to telemetry
telemetry.addLine("\nkey:\nXYZ = X (Right), Y (Forward), Z (Up) dist.");
telemetry.addLine("PRY = Pitch, Roll & Yaw (XYZ Rotation)");
telemetry.addLine("RBE = Range, Bearing & Elevation");
} // end method telemetryAprilTag()
} // end class
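The easy-create path still returns a full AprilTagProcessor, so the tuning hooks discussed in ConceptAprilTag above (such as decimation) remain available at runtime. A one-line sketch, reusing the aprilTag field from this sample:

// Lower decimation trades frame rate for detection range; see the decimation
// notes in ConceptAprilTag. This can be changed on-the-fly during a match.
aprilTag.setDecimation(2);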

View File

@@ -0,0 +1,246 @@
/* Copyright (c) 2023 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.util.Range;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.firstinspires.ftc.robotcore.external.hardware.camera.controls.ExposureControl;
import org.firstinspires.ftc.robotcore.external.hardware.camera.controls.GainControl;
import org.firstinspires.ftc.vision.VisionPortal;
import org.firstinspires.ftc.vision.apriltag.AprilTagDetection;
import org.firstinspires.ftc.vision.apriltag.AprilTagProcessor;
import java.util.List;
import java.util.concurrent.TimeUnit;
/*
* This OpMode determines the best Exposure for minimizing image motion-blur on a Webcam.
* Note that it is not possible to control the exposure for a Phone Camera, so if you are using a Phone for the Robot Controller,
* this OpMode/Feature only applies to an externally connected Webcam.
*
* The goal is to determine the smallest (shortest) Exposure value that still provides reliable Tag Detection.
* Starting with the minimum Exposure and maximum Gain, the exposure is slowly increased until the Tag is
* detected reliably from the likely operational distance.
*
*
* The best way to run this optimization is to view the camera preview screen while changing the exposure and gains.
*
* To do this, you need to view the RobotController screen directly (not from Driver Station)
* This can be done directly from a RC phone screen (if you are using an external Webcam), but for a Control Hub you must either plug an
* HDMI monitor into the Control Hub HDMI port, or use an external viewer program like ScrCpy (https://scrcpy.org/)
*
* Use Android Studio to Copy this Class, and Paste it into the TeamCode/src/main/java/org/firstinspires/ftc/teamcode folder.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*/
@TeleOp(name="Optimize AprilTag Exposure", group = "Concept")
@Disabled
public class ConceptAprilTagOptimizeExposure extends LinearOpMode
{
private VisionPortal visionPortal = null; // Used to manage the video source.
private AprilTagProcessor aprilTag; // Used for managing the AprilTag detection process.
private int myExposure ;
private int minExposure ;
private int maxExposure ;
private int myGain ;
private int minGain ;
private int maxGain ;
boolean thisExpUp = false;
boolean thisExpDn = false;
boolean thisGainUp = false;
boolean thisGainDn = false;
boolean lastExpUp = false;
boolean lastExpDn = false;
boolean lastGainUp = false;
boolean lastGainDn = false;
@Override public void runOpMode()
{
// Initialize the Apriltag Detection process
initAprilTag();
// Establish Min and Max Gains and Exposure. Then set a low exposure with high gain
getCameraSetting();
myExposure = Math.min(5, minExposure);
myGain = maxGain;
setManualExposure(myExposure, myGain);
// Wait for the match to begin.
telemetry.addData("Camera preview on/off", "3 dots, Camera Stream");
telemetry.addData(">", "Touch Play to start OpMode");
telemetry.update();
waitForStart();
while (opModeIsActive())
{
telemetry.addLine("Find lowest Exposure that gives reliable detection.");
telemetry.addLine("Use Left bump/trig to adjust Exposure.");
telemetry.addLine("Use Right bump/trig to adjust Gain.\n");
// Display how many Tags Detected
List<AprilTagDetection> currentDetections = aprilTag.getDetections();
int numTags = currentDetections.size();
if (numTags > 0 )
telemetry.addData("Tag", "####### %d Detected ######", currentDetections.size());
else
telemetry.addData("Tag", "----------- none - ----------");
telemetry.addData("Exposure","%d (%d - %d)", myExposure, minExposure, maxExposure);
telemetry.addData("Gain","%d (%d - %d)", myGain, minGain, maxGain);
telemetry.update();
// check to see if we need to change exposure or gain.
thisExpUp = gamepad1.left_bumper;
thisExpDn = gamepad1.left_trigger > 0.25;
thisGainUp = gamepad1.right_bumper;
thisGainDn = gamepad1.right_trigger > 0.25;
// look for clicks to change exposure
if (thisExpUp && !lastExpUp) {
myExposure = Range.clip(myExposure + 1, minExposure, maxExposure);
setManualExposure(myExposure, myGain);
} else if (thisExpDn && !lastExpDn) {
myExposure = Range.clip(myExposure - 1, minExposure, maxExposure);
setManualExposure(myExposure, myGain);
}
// look for clicks to change the gain
if (thisGainUp && !lastGainUp) {
myGain = Range.clip(myGain + 1, minGain, maxGain );
setManualExposure(myExposure, myGain);
} else if (thisGainDn && !lastGainDn) {
myGain = Range.clip(myGain - 1, minGain, maxGain );
setManualExposure(myExposure, myGain);
}
lastExpUp = thisExpUp;
lastExpDn = thisExpDn;
lastGainUp = thisGainUp;
lastGainDn = thisGainDn;
sleep(20);
}
}
/**
* Initialize the AprilTag processor.
*/
private void initAprilTag() {
// Create the AprilTag processor by using a builder.
aprilTag = new AprilTagProcessor.Builder().build();
// Create the WEBCAM vision portal by using a builder.
visionPortal = new VisionPortal.Builder()
.setCamera(hardwareMap.get(WebcamName.class, "Webcam 1"))
.addProcessor(aprilTag)
.build();
}
/*
Manually set the camera gain and exposure.
Can only be called AFTER calling initAprilTag();
Returns true if controls are set.
*/
private boolean setManualExposure(int exposureMS, int gain) {
// Ensure Vision Portal has been setup.
if (visionPortal == null) {
return false;
}
// Wait for the camera to be open
if (visionPortal.getCameraState() != VisionPortal.CameraState.STREAMING) {
telemetry.addData("Camera", "Waiting");
telemetry.update();
while (!isStopRequested() && (visionPortal.getCameraState() != VisionPortal.CameraState.STREAMING)) {
sleep(20);
}
telemetry.addData("Camera", "Ready");
telemetry.update();
}
// Set camera controls unless we are stopping.
if (!isStopRequested())
{
// Set exposure. Make sure we are in Manual Mode for these values to take effect.
ExposureControl exposureControl = visionPortal.getCameraControl(ExposureControl.class);
if (exposureControl.getMode() != ExposureControl.Mode.Manual) {
exposureControl.setMode(ExposureControl.Mode.Manual);
sleep(50);
}
exposureControl.setExposure((long)exposureMS, TimeUnit.MILLISECONDS);
sleep(20);
// Set Gain.
GainControl gainControl = visionPortal.getCameraControl(GainControl.class);
gainControl.setGain(gain);
sleep(20);
return (true);
} else {
return (false);
}
}
/*
Read this camera's minimum and maximum Exposure and Gain settings.
Can only be called AFTER calling initAprilTag();
*/
private void getCameraSetting() {
// Ensure Vision Portal has been setup.
if (visionPortal == null) {
return;
}
// Wait for the camera to be open
if (visionPortal.getCameraState() != VisionPortal.CameraState.STREAMING) {
telemetry.addData("Camera", "Waiting");
telemetry.update();
while (!isStopRequested() && (visionPortal.getCameraState() != VisionPortal.CameraState.STREAMING)) {
sleep(20);
}
telemetry.addData("Camera", "Ready");
telemetry.update();
}
// Get camera control values unless we are stopping.
if (!isStopRequested()) {
ExposureControl exposureControl = visionPortal.getCameraControl(ExposureControl.class);
minExposure = (int)exposureControl.getMinExposure(TimeUnit.MILLISECONDS) + 1;
maxExposure = (int)exposureControl.getMaxExposure(TimeUnit.MILLISECONDS);
GainControl gainControl = visionPortal.getCameraControl(GainControl.class);
minGain = gainControl.getMinGain();
maxGain = gainControl.getMaxGain();
}
}
}
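The thisX/lastX boolean pairs above implement rising-edge ("click") detection on the bumpers and triggers. A hedged sketch of the same idiom factored into a small reusable helper; the class and method names are illustrative, not SDK API:

public class RisingEdge {
    private boolean last = false;

    // Returns true only on the cycle where the input transitions false -> true.
    public boolean update(boolean now) {
        boolean clicked = now && !last;
        last = now;
        return clicked;
    }
}

// Usage inside the loop, replacing a thisExpUp/lastExpUp pair:
//   if (expUp.update(gamepad1.left_bumper)) { myExposure++; ... }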

View File

@@ -0,0 +1,196 @@
/* Copyright (c) 2023 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import org.firstinspires.ftc.robotcore.external.ClassFactory;
import org.firstinspires.ftc.robotcore.external.hardware.camera.CameraName;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.firstinspires.ftc.vision.VisionPortal;
import org.firstinspires.ftc.vision.VisionPortal.CameraState;
import org.firstinspires.ftc.vision.apriltag.AprilTagDetection;
import org.firstinspires.ftc.vision.apriltag.AprilTagProcessor;
import java.util.List;
/*
* This OpMode illustrates the basics of AprilTag recognition and pose estimation, using
* two webcams.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*/
@TeleOp(name = "Concept: AprilTag Switchable Cameras", group = "Concept")
@Disabled
public class ConceptAprilTagSwitchableCameras extends LinearOpMode {
/*
* Variables used for switching cameras.
*/
private WebcamName webcam1, webcam2;
private boolean oldLeftBumper;
private boolean oldRightBumper;
/**
* The variable to store our instance of the AprilTag processor.
*/
private AprilTagProcessor aprilTag;
/**
* The variable to store our instance of the vision portal.
*/
private VisionPortal visionPortal;
@Override
public void runOpMode() {
initAprilTag();
// Wait for the DS start button to be touched.
telemetry.addData("DS preview on/off", "3 dots, Camera Stream");
telemetry.addData(">", "Touch Play to start OpMode");
telemetry.update();
waitForStart();
if (opModeIsActive()) {
while (opModeIsActive()) {
telemetryCameraSwitching();
telemetryAprilTag();
// Push telemetry to the Driver Station.
telemetry.update();
// Save CPU resources; can resume streaming when needed.
if (gamepad1.dpad_down) {
visionPortal.stopStreaming();
} else if (gamepad1.dpad_up) {
visionPortal.resumeStreaming();
}
doCameraSwitching();
// Share the CPU.
sleep(20);
}
}
// Save more CPU resources when camera is no longer needed.
visionPortal.close();
} // end runOpMode()
/**
* Initialize the AprilTag processor.
*/
private void initAprilTag() {
// Create the AprilTag processor by using a builder.
aprilTag = new AprilTagProcessor.Builder().build();
webcam1 = hardwareMap.get(WebcamName.class, "Webcam 1");
webcam2 = hardwareMap.get(WebcamName.class, "Webcam 2");
CameraName switchableCamera = ClassFactory.getInstance()
.getCameraManager().nameForSwitchableCamera(webcam1, webcam2);
// Create the vision portal by using a builder.
visionPortal = new VisionPortal.Builder()
.setCamera(switchableCamera)
.addProcessor(aprilTag)
.build();
} // end method initAprilTag()
/**
* Add telemetry about camera switching.
*/
private void telemetryCameraSwitching() {
if (visionPortal.getActiveCamera().equals(webcam1)) {
telemetry.addData("activeCamera", "Webcam 1");
telemetry.addData("Press RightBumper", "to switch to Webcam 2");
} else {
telemetry.addData("activeCamera", "Webcam 2");
telemetry.addData("Press LeftBumper", "to switch to Webcam 1");
}
} // end method telemetryCameraSwitching()
/**
* Add telemetry about AprilTag detections.
*/
private void telemetryAprilTag() {
List<AprilTagDetection> currentDetections = aprilTag.getDetections();
telemetry.addData("# AprilTags Detected", currentDetections.size());
// Step through the list of detections and display info for each one.
for (AprilTagDetection detection : currentDetections) {
if (detection.metadata != null) {
telemetry.addLine(String.format("\n==== (ID %d) %s", detection.id, detection.metadata.name));
telemetry.addLine(String.format("XYZ %6.1f %6.1f %6.1f (inch)", detection.ftcPose.x, detection.ftcPose.y, detection.ftcPose.z));
telemetry.addLine(String.format("PRY %6.1f %6.1f %6.1f (deg)", detection.ftcPose.pitch, detection.ftcPose.roll, detection.ftcPose.yaw));
telemetry.addLine(String.format("RBE %6.1f %6.1f %6.1f (inch, deg, deg)", detection.ftcPose.range, detection.ftcPose.bearing, detection.ftcPose.elevation));
} else {
telemetry.addLine(String.format("\n==== (ID %d) Unknown", detection.id));
telemetry.addLine(String.format("Center %6.0f %6.0f (pixels)", detection.center.x, detection.center.y));
}
} // end for() loop
// Add "key" information to telemetry
telemetry.addLine("\nkey:\nXYZ = X (Right), Y (Forward), Z (Up) dist.");
telemetry.addLine("PRY = Pitch, Roll & Yaw (XYZ Rotation)");
telemetry.addLine("RBE = Range, Bearing & Elevation");
} // end method telemetryAprilTag()
/**
* Set the active camera according to input from the gamepad.
*/
private void doCameraSwitching() {
if (visionPortal.getCameraState() == CameraState.STREAMING) {
// If the left bumper is pressed, use Webcam 1.
// If the right bumper is pressed, use Webcam 2.
boolean newLeftBumper = gamepad1.left_bumper;
boolean newRightBumper = gamepad1.right_bumper;
if (newLeftBumper && !oldLeftBumper) {
visionPortal.setActiveCamera(webcam1);
} else if (newRightBumper && !oldRightBumper) {
visionPortal.setActiveCamera(webcam2);
}
oldLeftBumper = newLeftBumper;
oldRightBumper = newRightBumper;
}
} // end method doCameraSwitching()
} // end class

View File

@@ -1,125 +0,0 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.CompassSensor;
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.util.ElapsedTime;
/**
* This file illustrates the concept of calibrating an MR Compass.
* This code assumes there is a compass configured with the name "compass"
*
* This code will put the compass into calibration mode, wait three seconds and then attempt
* to rotate two full turns clockwise. This will allow the compass to do a magnetic calibration.
*
* Once complete, the program will put the compass back into measurement mode and check to see if the
* calibration was successful.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this opmode to the Driver Station OpMode list
*/
@TeleOp(name="Concept: Compass Calibration", group="Concept")
@Disabled
public class ConceptCompassCalibration extends LinearOpMode {
/* Declare OpMode members. */
public DcMotor leftDrive = null;
public DcMotor rightDrive = null;
private ElapsedTime runtime = new ElapsedTime();
CompassSensor compass;
final static double MOTOR_POWER = 0.2; // scale from 0 to 1
static final long HOLD_TIME_MS = 3000;
static final double CAL_TIME_SEC = 20;
@Override
public void runOpMode() {
// Initialize the drive system variables.
leftDrive = hardwareMap.get(DcMotor.class, "left_drive");
rightDrive = hardwareMap.get(DcMotor.class, "right_drive");
// To drive forward, most robots need the motor on one side to be reversed, because the axles point in opposite directions.
// Pushing the left stick forward MUST make robot go forward. So adjust these two lines based on your first test drive.
// Note: The settings here assume direct drive on left and right wheels. Gear Reduction or 90 Deg drives may require direction flips
leftDrive.setDirection(DcMotor.Direction.REVERSE);
rightDrive.setDirection(DcMotor.Direction.FORWARD);
// get a reference to our Compass Sensor object.
compass = hardwareMap.get(CompassSensor.class, "compass");
// Send telemetry message to signify robot waiting;
telemetry.addData("Status", "Ready to cal"); //
telemetry.update();
// Wait for the game to start (driver presses PLAY)
waitForStart();
// Set the compass to calibration mode
compass.setMode(CompassSensor.CompassMode.CALIBRATION_MODE);
telemetry.addData("Compass", "Compass in calibration mode");
telemetry.update();
sleep(HOLD_TIME_MS); // Just do a sleep while we switch modes
// Start the robot rotating clockwise
telemetry.addData("Compass", "Calibration mode. Turning the robot...");
telemetry.update();
leftDrive.setPower(MOTOR_POWER);
rightDrive.setPower(-MOTOR_POWER);
// run until time expires OR the driver presses STOP;
runtime.reset();
while (opModeIsActive() && (runtime.time() < CAL_TIME_SEC)) {
idle();
}
// Stop all motors and turn off calibration
leftDrive.setPower(0);
rightDrive.setPower(0);
compass.setMode(CompassSensor.CompassMode.MEASUREMENT_MODE);
telemetry.addData("Compass", "Returning to measurement mode");
telemetry.update();
sleep(HOLD_TIME_MS); // Just do a sleep while we switch modes
// Report whether the Calibration was successful or not.
if (compass.calibrationFailed())
telemetry.addData("Compass", "Calibrate Failed. Try Again!");
else
telemetry.addData("Compass", "Calibrate Passed.");
telemetry.update();
}
}

View File

@@ -0,0 +1,202 @@
/* Copyright (c) 2023 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import org.firstinspires.ftc.robotcore.external.hardware.camera.BuiltinCameraDirection;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.firstinspires.ftc.robotcore.external.tfod.Recognition;
import org.firstinspires.ftc.vision.VisionPortal;
import org.firstinspires.ftc.vision.apriltag.AprilTagDetection;
import org.firstinspires.ftc.vision.apriltag.AprilTagProcessor;
import org.firstinspires.ftc.vision.tfod.TfodProcessor;
import java.util.List;
/*
* This OpMode illustrates the basics of using both AprilTag recognition and TensorFlow
* Object Detection.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*/
@TeleOp(name = "Concept: Double Vision", group = "Concept")
@Disabled
public class ConceptDoubleVision extends LinearOpMode {
private static final boolean USE_WEBCAM = true; // true for webcam, false for phone camera
/**
* The variable to store our instance of the AprilTag processor.
*/
private AprilTagProcessor aprilTag;
/**
* The variable to store our instance of the TensorFlow Object Detection processor.
*/
private TfodProcessor tfod;
/**
* The variable to store our instance of the vision portal.
*/
private VisionPortal myVisionPortal;
@Override
public void runOpMode() {
initDoubleVision();
// This OpMode loops continuously, allowing the user to switch between
// AprilTag and TensorFlow Object Detection (TFOD) image processors.
while (!isStopRequested()) {
if (opModeInInit()) {
telemetry.addData("DS preview on/off","3 dots, Camera Stream");
telemetry.addLine();
telemetry.addLine("----------------------------------------");
}
if (myVisionPortal.getProcessorEnabled(aprilTag)) {
// User instructions: Dpad left or Dpad right.
telemetry.addLine("Dpad Left to disable AprilTag");
telemetry.addLine();
telemetryAprilTag();
} else {
telemetry.addLine("Dpad Right to enable AprilTag");
}
telemetry.addLine();
telemetry.addLine("----------------------------------------");
if (myVisionPortal.getProcessorEnabled(tfod)) {
telemetry.addLine("Dpad Down to disable TFOD");
telemetry.addLine();
telemetryTfod();
} else {
telemetry.addLine("Dpad Up to enable TFOD");
}
// Push telemetry to the Driver Station.
telemetry.update();
if (gamepad1.dpad_left) {
myVisionPortal.setProcessorEnabled(aprilTag, false);
} else if (gamepad1.dpad_right) {
myVisionPortal.setProcessorEnabled(aprilTag, true);
}
if (gamepad1.dpad_down) {
myVisionPortal.setProcessorEnabled(tfod, false);
} else if (gamepad1.dpad_up) {
myVisionPortal.setProcessorEnabled(tfod, true);
}
sleep(20);
} // end while loop
} // end method runOpMode()
/**
* Initialize AprilTag and TFOD.
*/
private void initDoubleVision() {
// -----------------------------------------------------------------------------------------
// AprilTag Configuration
// -----------------------------------------------------------------------------------------
aprilTag = new AprilTagProcessor.Builder()
.build();
// -----------------------------------------------------------------------------------------
// TFOD Configuration
// -----------------------------------------------------------------------------------------
tfod = new TfodProcessor.Builder()
.build();
// -----------------------------------------------------------------------------------------
// Camera Configuration
// -----------------------------------------------------------------------------------------
if (USE_WEBCAM) {
myVisionPortal = new VisionPortal.Builder()
.setCamera(hardwareMap.get(WebcamName.class, "Webcam 1"))
.addProcessors(tfod, aprilTag)
.build();
} else {
myVisionPortal = new VisionPortal.Builder()
.setCamera(BuiltinCameraDirection.BACK)
.addProcessors(tfod, aprilTag)
.build();
}
} // end initDoubleVision()
/**
* Add telemetry about AprilTag detections.
*/
private void telemetryAprilTag() {
List<AprilTagDetection> currentDetections = aprilTag.getDetections();
telemetry.addData("# AprilTags Detected", currentDetections.size());
// Step through the list of detections and display info for each one.
for (AprilTagDetection detection : currentDetections) {
if (detection.metadata != null) {
telemetry.addLine(String.format("\n==== (ID %d) %s", detection.id, detection.metadata.name));
telemetry.addLine(String.format("XYZ %6.1f %6.1f %6.1f (inch)", detection.ftcPose.x, detection.ftcPose.y, detection.ftcPose.z));
telemetry.addLine(String.format("PRY %6.1f %6.1f %6.1f (deg)", detection.ftcPose.pitch, detection.ftcPose.roll, detection.ftcPose.yaw));
telemetry.addLine(String.format("RBE %6.1f %6.1f %6.1f (inch, deg, deg)", detection.ftcPose.range, detection.ftcPose.bearing, detection.ftcPose.elevation));
} else {
telemetry.addLine(String.format("\n==== (ID %d) Unknown", detection.id));
telemetry.addLine(String.format("Center %6.0f %6.0f (pixels)", detection.center.x, detection.center.y));
}
} // end for() loop
} // end method telemetryAprilTag()
/**
* Add telemetry about TensorFlow Object Detection (TFOD) recognitions.
*/
private void telemetryTfod() {
List<Recognition> currentRecognitions = tfod.getRecognitions();
telemetry.addData("# Objects Detected", currentRecognitions.size());
// Step through the list of recognitions and display info for each one.
for (Recognition recognition : currentRecognitions) {
double x = (recognition.getLeft() + recognition.getRight()) / 2 ;
double y = (recognition.getTop() + recognition.getBottom()) / 2 ;
telemetry.addData(""," ");
telemetry.addData("Image", "%s (%.0f %% Conf.)", recognition.getLabel(), recognition.getConfidence() * 100);
telemetry.addData("- Position", "%.0f / %.0f", x, y);
telemetry.addData("- Size", "%.0f x %.0f", recognition.getWidth(), recognition.getHeight());
} // end for() loop
} // end method telemetryTfod()
} // end class
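telemetryTfod() above reports every recognition; in practice teams usually filter by confidence before acting. A sketch using the same Recognition getters shown in this sample; MIN_CONFIDENCE is an illustrative assumption, not an SDK constant:

static final float MIN_CONFIDENCE = 0.75f; // hypothetical cutoff

for (Recognition recognition : tfod.getRecognitions()) {
    if (recognition.getConfidence() >= MIN_CONFIDENCE) {
        double x = (recognition.getLeft() + recognition.getRight()) / 2;
        telemetry.addData(recognition.getLabel(), "x = %.0f px", x);
    }
}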

View File

@@ -37,36 +37,35 @@ import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.IMU;
import org.firstinspires.ftc.robotcore.external.navigation.AngleUnit;
import org.firstinspires.ftc.robotcore.external.navigation.AngularVelocity;
import org.firstinspires.ftc.robotcore.external.navigation.YawPitchRollAngles;
/**
* This file demonstrates the impact of setting the IMU orientation correctly or incorrectly. This
/*
* This OpMode demonstrates the impact of setting the IMU orientation correctly or incorrectly. This
* code assumes there is an IMU configured with the name "imu".
* <p>
*
* Note: This OpMode is more of a tool than a code sample. The User Interface portion of this code
* goes beyond simply showing how to interface to the IMU.<br>
* For a minimal example of interfacing to an IMU, please see the SensorIMUOrthogonal or SensorIMUNonOrthogonal sample OpModes.
* <p>
* This sample enables you to re-specify the Hub Mounting orientation dynamically by using gamepad controls.
*
* This OpMode enables you to re-specify the Hub Mounting orientation dynamically by using gamepad controls.
* While doing so, the sample will display how Pitch, Roll and Yaw angles change as the hub is moved.
* <p>
*
* The gamepad controls let you change the two parameters that specify how the Control/Expansion Hub is mounted. <br>
* The first parameter specifies which direction the printed logo on the Hub is pointing. <br>
* The second parameter specifies which direction the USB connector on the Hub is pointing. <br>
* All directions are relative to the robot, and left/right is as viewed from behind the robot.
* <p>
*
* How will you know if you have chosen the correct Orientation? With the correct orientation
* parameters selected, pitch/roll/yaw should act as follows:
* <p>
*
* Pitch value should INCREASE as the robot is tipped UP at the front. (Rotation about X) <br>
* Roll value should INCREASE as the robot is tipped UP at the left side. (Rotation about Y) <br>
* Yaw value should INCREASE as the robot is rotated Counter Clockwise. (Rotation about Z) <br>
* <p>
*
* The Yaw can be reset (to zero) by pressing the Y button on the gamepad (Triangle on a PS4 controller)
* <p>
*
* The rotational velocities should follow the change in corresponding axes.
*/
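For reference, a minimal sketch of how a chosen orientation is applied when initializing the IMU (the logo-UP / USB-FORWARD directions below are illustrative assumptions; substitute the directions that match your own Hub mounting):

IMU imu = hardwareMap.get(IMU.class, "imu");
// Assumed mounting for illustration: logo facing UP, USB ports facing FORWARD.
RevHubOrientationOnRobot orientationOnRobot = new RevHubOrientationOnRobot(
        RevHubOrientationOnRobot.LogoFacingDirection.UP,
        RevHubOrientationOnRobot.UsbFacingDirection.FORWARD);
imu.initialize(new IMU.Parameters(orientationOnRobot));
// With the orientation parameters correct, yaw/pitch/roll behave as described above.
YawPitchRollAngles angles = imu.getRobotYawPitchRollAngles();
telemetry.addData("Yaw (deg)", angles.getYaw(AngleUnit.DEGREES));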

View File

@ -34,8 +34,8 @@ import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.util.Range;
/**
* This OpMode Sample illustrates how to use an external "hardware" class to modularize all the robot's sensors and actuators.
/*
* This OpMode illustrates how to use an external "hardware" class to modularize all the robot's sensors and actuators.
* This approach is very efficient because the same hardware class can be used by all of your teleop and autonomous OpModes
* without requiring many copy & paste operations. Once you have defined and tested the hardware class with one OpMode,
* it is instantly available to other OpModes.
@ -53,15 +53,15 @@ import com.qualcomm.robotcore.util.Range;
* must also be copied to the same location (maintaining its name).
*
* For comparison purposes, this sample and its accompanying hardware class duplicates the functionality of the
* RobotTelopPOV_Linear opmode. It assumes three motors (left_drive, right_drive and arm) and two servos (left_hand and right_hand)
* RobotTelopPOV_Linear OpMode. It assumes three motors (left_drive, right_drive and arm) and two servos (left_hand and right_hand)
*
* View the RobotHardware.java class file for more details
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this opmode to the Driver Station OpMode list
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*
* In OnBot Java, add a new OpMode, drawing from this Sample; select TeleOp.
* Also add another new file named RobotHardware.java, drawing from the Sample with that name; select Not an OpMode.
* In OnBot Java, add a new OpMode, select this sample, and select TeleOp.
* Also add another new file named RobotHardware.java, select the sample with that name, and select Not an OpMode.
*/
@TeleOp(name="Concept: Robot Hardware Class", group="Robot")

View File

@ -6,13 +6,13 @@ import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.Gamepad;
import com.qualcomm.robotcore.util.ElapsedTime;
/**
* This sample illustrates using the rumble feature of many gamepads.
/*
* This OpMode illustrates using the rumble feature of many gamepads.
*
* Note: Some gamepads "rumble" better than others.
* The Xbox & PS4 have a left (rumble1) and right (rumble2) rumble motor.
These two gamepads have two distinct rumble modes: large on the left, and small on the right.
* The ETpark gamepad may only respond to rumble1, and may only run at full power.
* The Etpark gamepad may only respond to rumble1, and may only run at full power.
* The Logitech F310 gamepad does not have *any* rumble ability.
*
* Moral: You should use this sample to experiment with your specific gamepads to explore their rumble features.
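A minimal sketch of the rumble calls discussed above (strengths and durations are illustrative):

gamepad1.rumble(1.0, 0.5, 500);   // left motor full power, right motor half power, for 500 ms
gamepad1.rumbleBlips(3);          // three short rumble blips
if (gamepad1.isRumbling()) {
    // a previously requested rumble effect is still in progress
}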

View File

@ -1,13 +1,12 @@
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import org.firstinspires.ftc.robotcore.external.Telemetry;
/**
* This sample illustrates using the touchpad feature found on some gamepads.
/*
* This OpMode illustrates using the touchpad feature found on some gamepads.
*
* The Sony PS4 gamepad can detect two distinct touches on the central touchpad.
* Other gamepads with different touchpads may provide mixed results.
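A minimal sketch of reading the touchpad fields mentioned above (a PS4-style gamepad is assumed):

if (gamepad1.touchpad_finger_1) {
    // Coordinates run roughly -1.0 to 1.0, with (0,0) at the center of the pad.
    telemetry.addData("Finger 1", "x=%.2f y=%.2f",
            gamepad1.touchpad_finger_1_x, gamepad1.touchpad_finger_1_y);
}
if (gamepad1.touchpad_finger_2) {
    telemetry.addData("Finger 2", "x=%.2f y=%.2f",
            gamepad1.touchpad_finger_2_x, gamepad1.touchpad_finger_2_y);
}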

View File

@ -36,50 +36,48 @@ import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.DcMotorEx;
import com.qualcomm.robotcore.util.ElapsedTime;
import java.util.Iterator;
import java.util.List;
/*
This sample illustrates how to use the Expansion Hub's Bulk-Read feature to speed up control cycle times.
In this example there are 4 motors that need their encoder positions and velocities read.
The sample is written to work with one or two expansion hubs, with no assumption as to where the motors are located.
Three scenarios are tested:
Cache Mode = OFF This is the normal default, where no cache is used, and every read produces a discrete transaction with
an expansion hub, which is the slowest approach, but guarantees that the value is as fresh (recent) as possible.
Cache Mode = AUTO This mode will attempt to minimize the number of discrete read commands, by performing bulk-reads
and then returning values that have been cached. The cache is updated automatically whenever any specific encoder is re-read.
This mode will always return new data, but it may perform more bulk-reads than absolutely required.
Extra reads will be performed if multiple encoder/velocity reads are performed on the same encoder in one control cycle.
This mode is a good compromise between the OFF and MANUAL modes.
Note: If there are significant user-program delays between encoder reads, the cached value may not be fresh (recent).
You can issue a clearBulkCache() call at any time to force a fresh bulk-read on the next encoder read.
Cache Mode = MANUAL This mode requires the user's code to determine the best time to clear the cached bulk-read data.
Well organized code will reset the cache once at the beginning of the control cycle, and then immediately read and store all the encoder values.
This approach will produce the shortest cycle times, but it does require the user to manually clear the cache.
Since NO automatic Bulk-Reads are performed, neglecting to clear the bulk cache will result in the same values being returned
each time an encoder read is performed.
-------------------------------------
General tip to speed up your control cycles:
No matter what method you use to read encoders and other inputs, you should try to
avoid reading the same encoder input multiple times around a control loop.
Under normal conditions, this will slow down the control loop.
The preferred method is to read all the required inputs ONCE at the beginning of the loop,
and save the values in variables that can be used by other parts of the control code.
eg: if you are sending encoder positions to your telemetry display, putting a getCurrentPosition()
call in the telemetry statement will force the code to go and get another copy which will take time.
It's much better to read the position into a variable once, and use that variable for control AND display.
Reading saved variables takes no time at all.
Once you put all your sensor reads at the beginning of the control cycle, it's very easy to use
the bulk-read AUTO mode to streamline your cycle timing.
* This OpMode illustrates how to use the Expansion Hub's Bulk-Read feature to speed up control cycle times.
* In this example there are 4 motors that need their encoder positions and velocities read.
* The sample is written to work with one or two expansion hubs, with no assumption as to where the motors are located.
*
* Three scenarios are tested:
* Cache Mode = OFF This is the normal default, where no cache is used, and every read produces a discrete transaction with
* an expansion hub, which is the slowest approach, but guarantees that the value is as fresh (recent) as possible.
*
* Cache Mode = AUTO This mode will attempt to minimize the number of discrete read commands, by performing bulk-reads
* and then returning values that have been cached. The cache is updated automatically whenever any specific encoder is re-read.
* This mode will always return new data, but it may perform more bulk-reads than absolutely required.
* Extra reads will be performed if multiple encoder/velocity reads are performed on the same encoder in one control cycle.
* This mode is a good compromise between the OFF and MANUAL modes.
* Note: If there are significant user-program delays between encoder reads, the cached value may not be fresh (recent).
* You can issue a clearBulkCache() call at any time to force a fresh bulk-read on the next encoder read.
*
* Cache Mode = MANUAL This mode requires the user's code to determine the best time to clear the cached bulk-read data.
* Well organized code will reset the cache once at the beginning of the control cycle, and then immediately read and store all the encoder values.
* This approach will produce the shortest cycle times, but it does require the user to manually clear the cache.
* Since NO automatic Bulk-Reads are performed, neglecting to clear the bulk cache will result in the same values being returned
* each time an encoder read is performed.
*
* -------------------------------------
*
* General tip to speed up your control cycles:
*
* No matter what method you use to read encoders and other inputs, you should try to
* avoid reading the same encoder input multiple times around a control loop.
* Under normal conditions, this will slow down the control loop.
* The preferred method is to read all the required inputs ONCE at the beginning of the loop,
* and save the values in variables that can be used by other parts of the control code.
*
* eg: if you are sending encoder positions to your telemetry display, putting a getCurrentPosition()
* call in the telemetry statement will force the code to go and get another copy which will take time.
* It's much better to read the position into a variable once, and use that variable for control AND display.
* Reading saved variables takes no time at all.
*
* Once you put all your sensor reads at the beginning of the control cycle, it's very easy to use
* the bulk-read AUTO mode to streamline your cycle timing.
*/
@TeleOp (name = "Motor Bulk Reads", group = "Tests")
@Disabled
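A minimal sketch of the MANUAL mode described above (the motor name is an illustrative assumption): clear the cache once per cycle, then every encoder read in that cycle is served from a single bulk transaction.

List<LynxModule> allHubs = hardwareMap.getAll(LynxModule.class);
for (LynxModule hub : allHubs) {
    hub.setBulkCachingMode(LynxModule.BulkCachingMode.MANUAL);
}
DcMotorEx motor = hardwareMap.get(DcMotorEx.class, "left_drive");  // illustrative name
while (opModeIsActive()) {
    for (LynxModule hub : allHubs) {
        hub.clearBulkCache();                 // force one fresh bulk read per cycle
    }
    int pos    = motor.getCurrentPosition();  // served from the cached bulk read
    double vel = motor.getVelocity();         // no extra hub transaction
}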

View File

@ -34,11 +34,8 @@ import com.qualcomm.robotcore.eventloop.opmode.OpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.util.ElapsedTime;
import java.text.SimpleDateFormat;
import java.util.Date;
/**
* Demonstrates empty OpMode
/*
* Demonstrates an empty iterative OpMode
*/
@TeleOp(name = "Concept: NullOp", group = "Concept")
@Disabled
@ -46,34 +43,47 @@ public class ConceptNullOp extends OpMode {
private ElapsedTime runtime = new ElapsedTime();
/**
* This method will be called once, when the INIT button is pressed.
*/
@Override
public void init() {
telemetry.addData("Status", "Initialized");
}
/*
* Code to run when the op mode is first enabled goes here
* @see com.qualcomm.robotcore.eventloop.opmode.OpMode#start()
/**
* This method will be called repeatedly during the period between when
* the init button is pressed and when the play button is pressed (or the
* OpMode is stopped).
*/
@Override
public void init_loop() {
}
/*
* This method will be called ONCE when start is pressed
* @see com.qualcomm.robotcore.eventloop.opmode.OpMode#loop()
/**
* This method will be called once, when the play button is pressed.
*/
@Override
public void start() {
runtime.reset();
}
/*
* This method will be called repeatedly in a loop
* @see com.qualcomm.robotcore.eventloop.opmode.OpMode#loop()
/**
* This method will be called repeatedly during the period between when
* the play button is pressed and when the OpMode is stopped.
*/
@Override
public void loop() {
telemetry.addData("Status", "Run Time: " + runtime.toString());
}
/**
* This method will be called once, when this OpMode is stopped.
* <p>
* Your ability to control hardware from this method will be limited.
*/
@Override
public void stop() {
}
}

View File

@ -34,7 +34,7 @@ import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.DcMotor;
/**
/*
* This OpMode ramps a single motor speed up and down repeatedly until Stop is pressed.
* The code is structured as a LinearOpMode
*
@ -44,7 +44,7 @@ import com.qualcomm.robotcore.hardware.DcMotor;
* CYCLE_MS sets the update period.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this opmode to the Driver Station OpMode list
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*/
@TeleOp(name = "Concept: Ramp Motor Speed", group = "Concept")
@Disabled
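The core ramp logic reduces to a sketch like this (the constants and motor name are illustrative):

DcMotor motor = hardwareMap.get(DcMotor.class, "left_drive");
final double INCREMENT = 0.01;   // amount to ramp power each cycle
final double MAX_FWD   =  1.0;
final double MAX_REV   = -1.0;
final int    CYCLE_MS  = 50;     // period of each cycle

double power = 0;
double delta = INCREMENT;
while (opModeIsActive()) {
    power += delta;
    if (power >= MAX_FWD || power <= MAX_REV) {
        power = Range.clip(power, MAX_REV, MAX_FWD);
        delta = -delta;          // reverse the ramp direction at either limit
    }
    motor.setPower(power);
    sleep(CYCLE_MS);
}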

View File

@ -37,8 +37,7 @@ import com.qualcomm.robotcore.util.ElapsedTime;
import com.qualcomm.robotcore.util.Range;
/**
*
/*
* This OpMode executes a basic Tank Drive Teleop for a two wheeled robot using two REV SPARKminis.
* To use this example, connect two REV SPARKminis into servo ports on the Expansion Hub. On the
* robot configuration, use the drop down list under 'Servos' to select 'REV SPARKmini Controller'
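Because the SPARKminis plug into servo ports, the drive motors are retrieved as DcMotorSimple rather than DcMotor; a minimal tank-drive sketch (the port names are illustrative assumptions):

DcMotorSimple leftDrive  = hardwareMap.get(DcMotorSimple.class, "left_drive");
DcMotorSimple rightDrive = hardwareMap.get(DcMotorSimple.class, "right_drive");
rightDrive.setDirection(DcMotorSimple.Direction.REVERSE);  // mirror the right side
while (opModeIsActive()) {
    leftDrive.setPower(-gamepad1.left_stick_y);    // stick forward reads negative
    rightDrive.setPower(-gamepad1.right_stick_y);
}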

View File

@ -34,7 +34,7 @@ import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.Servo;
/**
/*
* This OpMode scans a single servo back and forward until Stop is pressed.
* The code is structured as a LinearOpMode
* INCREMENT sets how much to increase/decrease the servo position each cycle
@ -46,7 +46,7 @@ import com.qualcomm.robotcore.hardware.Servo;
* connected servos are able to move freely before running this test.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this opmode to the Driver Station OpMode list
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*/
@TeleOp(name = "Concept: Scan Servo", group = "Concept")
@Disabled
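The scan itself reduces to a sketch like this (INCREMENT, CYCLE_MS, and the servo name are illustrative):

Servo servo = hardwareMap.get(Servo.class, "left_hand");
final double INCREMENT = 0.01;   // amount to step the servo each cycle
final int    CYCLE_MS  = 50;     // period of each cycle

double position = 0.5;           // start at mid travel
double delta = INCREMENT;
while (opModeIsActive()) {
    position += delta;
    if (position >= Servo.MAX_POSITION || position <= Servo.MIN_POSITION) {
        delta = -delta;          // reverse direction at the travel limits
    }
    servo.setPosition(position);
    sleep(CYCLE_MS);
}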

View File

@ -34,17 +34,15 @@ import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import java.io.File;
/**
* This file demonstrates how to play simple sounds on both the RC and DS phones.
/*
* This OpMode demonstrates how to play simple sounds on both the RC and DS phones.
* It illustrates how to build sounds into your application as a resource.
* This technique is best suited for use with Android Studio since it assumes you will be creating a new application
*
* If you are using OnBotJava, please see the ConceptSoundsOnBotJava sample
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this opmode to the Driver Station OpMode list
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*
* Operation:
*

View File

@ -33,10 +33,11 @@ import com.qualcomm.ftccommon.SoundPlayer;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import java.io.File;
/**
* This file demonstrates how to play simple sounds on both the RC and DS phones.
/*
* This OpMode demonstrates how to play simple sounds on both the RC and DS phones.
* It illustrates how to play sound files that have been copied to the RC Phone
* This technique is best suited for use with OnBotJava since it does not require the app to be modified.
*

View File

@ -30,14 +30,13 @@
package org.firstinspires.ftc.robotcontroller.external.samples;
import android.content.Context;
import com.qualcomm.ftccommon.SoundPlayer;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
/**
* This file demonstrates how to play one of the several SKYSTONE/Star Wars sounds loaded into the SDK.
/*
* This OpMode demonstrates how to play one of the several SKYSTONE/Star Wars sounds loaded into the SDK.
* It does this by creating a simple "chooser" controlled by the gamepad Up Down buttons.
* This code also prevents sounds from stacking up by setting a "playing" flag, which is cleared when the sound finishes playing.
*

View File

@ -34,26 +34,25 @@ import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.VoltageSensor;
import com.qualcomm.robotcore.util.ElapsedTime;
import org.firstinspires.ftc.robotcore.external.Func;
import org.firstinspires.ftc.robotcore.external.Telemetry;
/**
* {@link ConceptTelemetry} illustrates various ways in which telemetry can be
/*
* This OpMode illustrates various ways in which telemetry can be
* transmitted from the robot controller to the driver station. The sample illustrates
* numeric and text data, formatted output, and optimized evaluation of expensive-to-acquire
* information. The telemetry {@link Telemetry#log() log} is illustrated by scrolling a poem
* information. The telemetry log is illustrated by scrolling a poem
* to the driver station.
*
* @see Telemetry
* Also see the Telemetry javadocs.
*/
@TeleOp(name = "Concept: Telemetry", group = "Concept")
@Disabled
public class ConceptTelemetry extends LinearOpMode {
/** keeps track of the line of the poem which is to be emitted next */
/** Keeps track of the line of the poem which is to be emitted next */
int poemLine = 0;
/** keeps track of how long it's been since we last emitted a line of poetry */
/** Keeps track of how long it's been since we last emitted a line of poetry */
ElapsedTime poemElapsed = new ElapsedTime();
static final String[] poem = new String[] {
@ -94,10 +93,10 @@ public class ConceptTelemetry extends LinearOpMode {
// The interval between lines of poetry, in seconds
double sPoemInterval = 0.6;
/**
/*
* Wait until we've been given the ok to go. For something to do, we emit the
* elapsed time as we sit here and wait. If we didn't want to do anything while
* we waited, we would just call {@link #waitForStart()}.
* we waited, we would just call waitForStart().
*/
while (!isStarted()) {
telemetry.addData("time", "%.1f seconds", opmodeRunTime.seconds());
@ -107,10 +106,10 @@ public class ConceptTelemetry extends LinearOpMode {
// Ok, we've been given the ok to go
/**
/*
* As an illustration, the first line on our telemetry display will display the battery voltage.
* The idea here is that it's expensive to compute the voltage (at least for purposes of illustration)
* so you don't want to do it unless the data is <em>actually</em> going to make it to the
* so you don't want to do it unless the data is _actually_ going to make it to the
* driver station (recall that telemetry transmission is throttled to reduce bandwidth use).
* Note that getBatteryVoltage() below returns 'Infinity' if there's no voltage sensor attached.
*
@ -122,7 +121,7 @@ public class ConceptTelemetry extends LinearOpMode {
}
});
// Reset to keep some timing stats for the post-'start' part of the opmode
// Reset to keep some timing stats for the post-'start' part of the OpMode
opmodeRunTime.reset();
int loopCount = 1;
@ -146,13 +145,13 @@ public class ConceptTelemetry extends LinearOpMode {
.addData("x", gamepad1.right_stick_x)
.addData("y", gamepad1.right_stick_y);
/**
/*
* Transmit the telemetry to the driver station, subject to throttling.
* @see Telemetry#getMsTransmissionInterval()
* See the documentation for Telemetry.getMsTransmissionInterval() for more information.
*/
telemetry.update();
/** Update loop info and play nice with the rest of the {@link Thread}s in the system */
// Update loop info
loopCount++;
}
}
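The deferred-evaluation idea above boils down to handing telemetry a Func that is only invoked if the line is actually transmitted; a minimal sketch (assumes a getBatteryVoltage() helper like the one in this sample):

telemetry.addData("voltage", "%.1f volts", new Func<Double>() {
    @Override public Double value() {
        return getBatteryVoltage();   // only computed when telemetry is actually sent
    }
});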

View File

@ -29,161 +29,171 @@
package org.firstinspires.ftc.robotcontroller.external.samples;
import android.util.Size;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import java.util.List;
import org.firstinspires.ftc.robotcore.external.ClassFactory;
import org.firstinspires.ftc.robotcore.external.navigation.VuforiaLocalizer;
import org.firstinspires.ftc.robotcore.external.navigation.VuforiaLocalizer.CameraDirection;
import org.firstinspires.ftc.robotcore.external.tfod.TFObjectDetector;
import org.firstinspires.ftc.robotcore.external.hardware.camera.BuiltinCameraDirection;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.firstinspires.ftc.robotcore.external.tfod.Recognition;
import org.firstinspires.ftc.vision.VisionPortal;
import org.firstinspires.ftc.vision.tfod.TfodProcessor;
/**
* This 2022-2023 OpMode illustrates the basics of using the TensorFlow Object Detection API to
* determine which image is being presented to the robot.
import java.util.List;
/*
* This OpMode illustrates the basics of TensorFlow Object Detection,
* including Java Builder structures for specifying Vision parameters.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*
* IMPORTANT: In order to use this OpMode, you need to obtain your own Vuforia license key as
* is explained below.
*/
@TeleOp(name = "Concept: TensorFlow Object Detection", group = "Concept")
@Disabled
public class ConceptTensorFlowObjectDetection extends LinearOpMode {
/*
* Specify the source for the Tensor Flow Model.
* If the TensorFlowLite object model is included in the Robot Controller App as an "asset",
* the OpMode must load it using loadModelFromAsset(). However, if a team-generated model
* has been downloaded to the Robot Controller's SD FLASH memory, it must be loaded using loadModelFromFile().
* Here we assume it's an Asset. Also see method initTfod() below.
*/
private static final String TFOD_MODEL_ASSET = "PowerPlay.tflite";
// private static final String TFOD_MODEL_FILE = "/sdcard/FIRST/tflitemodels/CustomTeamModel.tflite";
private static final boolean USE_WEBCAM = true; // true for webcam, false for phone camera
// TFOD_MODEL_ASSET points to a model file stored in the project Asset location,
// this is only used for Android Studio when using models in Assets.
private static final String TFOD_MODEL_ASSET = "MyModelStoredAsAsset.tflite";
// TFOD_MODEL_FILE points to a model file stored onboard the Robot Controller's storage,
// this is used when uploading models directly to the RC using the model upload interface.
private static final String TFOD_MODEL_FILE = "/sdcard/FIRST/tflitemodels/myCustomModel.tflite";
// Define the labels recognized in the model for TFOD (must be in training order!)
private static final String[] LABELS = {
"1 Bolt",
"2 Bulb",
"3 Panel"
"Pixel",
};
/*
* IMPORTANT: You need to obtain your own license key to use Vuforia. The string below with which
* 'parameters.vuforiaLicenseKey' is initialized is for illustration only, and will not function.
* A Vuforia 'Development' license key can be obtained free of charge from the Vuforia developer
* web site at https://developer.vuforia.com/license-manager.
*
* Vuforia license keys are always 380 characters long, and look as if they contain mostly
* random data. As an example, here is a fragment of a valid key:
* ... yIgIzTqZ4mWjk9wd3cZO9T1axEqzuhxoGlfOOI2dRzKS4T0hQ8kT ...
* Once you've obtained a license key, copy the string from the Vuforia web site
* and paste it into your code on the next line, between the double quotes.
/**
* The variable to store our instance of the TensorFlow Object Detection processor.
*/
private static final String VUFORIA_KEY =
" -- YOUR NEW VUFORIA KEY GOES HERE --- ";
private TfodProcessor tfod;
/**
* {@link #vuforia} is the variable we will use to store our instance of the Vuforia
* localization engine.
* The variable to store our instance of the vision portal.
*/
private VuforiaLocalizer vuforia;
/**
* {@link #tfod} is the variable we will use to store our instance of the TensorFlow Object
* Detection engine.
*/
private TFObjectDetector tfod;
private VisionPortal visionPortal;
@Override
public void runOpMode() {
// The TFObjectDetector uses the camera frames from the VuforiaLocalizer, so we create that
// first.
initVuforia();
initTfod();
/**
* Activate TensorFlow Object Detection before we wait for the start command.
* Do it here so that the Camera Stream window will have the TensorFlow annotations visible.
**/
if (tfod != null) {
tfod.activate();
// The TensorFlow software will scale the input images from the camera to a lower resolution.
// This can result in lower detection accuracy at longer distances (> 55cm or 22").
// If your target is at a distance greater than 50 cm (20"), you can increase the magnification value
// to artificially zoom in to the center of image. For best results, the "aspectRatio" argument
// should be set to the value of the images used to create the TensorFlow Object Detection model
// (typically 16/9).
tfod.setZoom(1.0, 16.0/9.0);
}
/** Wait for the game to begin */
telemetry.addData(">", "Press Play to start op mode");
// Wait for the DS start button to be touched.
telemetry.addData("DS preview on/off", "3 dots, Camera Stream");
telemetry.addData(">", "Touch Play to start OpMode");
telemetry.update();
waitForStart();
if (opModeIsActive()) {
while (opModeIsActive()) {
if (tfod != null) {
// getUpdatedRecognitions() will return null if no new information is available since
// the last time that call was made.
List<Recognition> updatedRecognitions = tfod.getUpdatedRecognitions();
if (updatedRecognitions != null) {
telemetry.addData("# Objects Detected", updatedRecognitions.size());
// step through the list of recognitions and display image position/size information for each one
// Note: "Image number" refers to the randomized image orientation/number
for (Recognition recognition : updatedRecognitions) {
double col = (recognition.getLeft() + recognition.getRight()) / 2 ;
double row = (recognition.getTop() + recognition.getBottom()) / 2 ;
double width = Math.abs(recognition.getRight() - recognition.getLeft()) ;
double height = Math.abs(recognition.getTop() - recognition.getBottom()) ;
telemetryTfod();
// Push telemetry to the Driver Station.
telemetry.update();
// Save CPU resources; can resume streaming when needed.
if (gamepad1.dpad_down) {
visionPortal.stopStreaming();
} else if (gamepad1.dpad_up) {
visionPortal.resumeStreaming();
}
// Share the CPU.
sleep(20);
}
}
// Save more CPU resources when camera is no longer needed.
visionPortal.close();
} // end runOpMode()
/**
* Initialize the TensorFlow Object Detection processor.
*/
private void initTfod() {
// Create the TensorFlow processor by using a builder.
tfod = new TfodProcessor.Builder()
// With the following lines commented out, the default TfodProcessor Builder
// will load the default model for the season. To define a custom model to load,
// choose one of the following:
// Use setModelAssetName() if the custom TF Model is built in as an asset (AS only).
// Use setModelFileName() if you have downloaded a custom team model to the Robot Controller.
//.setModelAssetName(TFOD_MODEL_ASSET)
//.setModelFileName(TFOD_MODEL_FILE)
// The following default settings are available to un-comment and edit as needed to
// set parameters for custom models.
//.setModelLabels(LABELS)
//.setIsModelTensorFlow2(true)
//.setIsModelQuantized(true)
//.setModelInputSize(300)
//.setModelAspectRatio(16.0 / 9.0)
.build();
// Create the vision portal by using a builder.
VisionPortal.Builder builder = new VisionPortal.Builder();
// Set the camera (webcam vs. built-in RC phone camera).
if (USE_WEBCAM) {
builder.setCamera(hardwareMap.get(WebcamName.class, "Webcam 1"));
} else {
builder.setCamera(BuiltinCameraDirection.BACK);
}
// Choose a camera resolution. Not all cameras support all resolutions.
//builder.setCameraResolution(new Size(640, 480));
// Enable the RC preview (LiveView). Set "false" to omit camera monitoring.
//builder.enableLiveView(true);
// Set the stream format; MJPEG uses less bandwidth than default YUY2.
//builder.setStreamFormat(VisionPortal.StreamFormat.YUY2);
// Choose whether or not LiveView stops if no processors are enabled.
// If set "true", monitor shows solid orange screen if no processors enabled.
// If set "false", monitor shows camera view without annotations.
//builder.setAutoStopLiveView(false);
// Set and enable the processor.
builder.addProcessor(tfod);
// Build the Vision Portal, using the above settings.
visionPortal = builder.build();
// Set confidence threshold for TFOD recognitions, at any time.
//tfod.setMinResultConfidence(0.75f);
// Disable or re-enable the TFOD processor at any time.
//visionPortal.setProcessorEnabled(tfod, true);
} // end method initTfod()
/**
* Add telemetry about TensorFlow Object Detection (TFOD) recognitions.
*/
private void telemetryTfod() {
List<Recognition> currentRecognitions = tfod.getRecognitions();
telemetry.addData("# Objects Detected", currentRecognitions.size());
// Step through the list of recognitions and display info for each one.
for (Recognition recognition : currentRecognitions) {
double x = (recognition.getLeft() + recognition.getRight()) / 2 ;
double y = (recognition.getTop() + recognition.getBottom()) / 2 ;
telemetry.addData(""," ");
telemetry.addData("Image", "%s (%.0f %% Conf.)", recognition.getLabel(), recognition.getConfidence() * 100);
telemetry.addData("- Position (Row/Col)","%.0f / %.0f", row, col);
telemetry.addData("- Size (Width/Height)","%.0f / %.0f", width, height);
}
telemetry.update();
}
}
}
}
}
telemetry.addData("- Position", "%.0f / %.0f", x, y);
telemetry.addData("- Size", "%.0f x %.0f", recognition.getWidth(), recognition.getHeight());
} // end for() loop
/**
* Initialize the Vuforia localization engine.
*/
private void initVuforia() {
/*
* Configure Vuforia by creating a Parameter object, and passing it to the Vuforia engine.
*/
VuforiaLocalizer.Parameters parameters = new VuforiaLocalizer.Parameters();
} // end method telemetryTfod()
parameters.vuforiaLicenseKey = VUFORIA_KEY;
parameters.cameraDirection = CameraDirection.BACK;
// Instantiate the Vuforia engine
vuforia = ClassFactory.getInstance().createVuforia(parameters);
}
/**
* Initialize the TensorFlow Object Detection engine.
*/
private void initTfod() {
int tfodMonitorViewId = hardwareMap.appContext.getResources().getIdentifier(
"tfodMonitorViewId", "id", hardwareMap.appContext.getPackageName());
TFObjectDetector.Parameters tfodParameters = new TFObjectDetector.Parameters(tfodMonitorViewId);
tfodParameters.minResultConfidence = 0.75f;
tfodParameters.isModelTensorFlow2 = true;
tfodParameters.inputSize = 300;
tfod = ClassFactory.getInstance().createTFObjectDetector(tfodParameters, vuforia);
// Use loadModelFromAsset() if the TF Model is built in as an asset by Android Studio
// Use loadModelFromFile() if you have downloaded a custom team model to the Robot Controller's FLASH.
tfod.loadModelFromAsset(TFOD_MODEL_ASSET, LABELS);
// tfod.loadModelFromFile(TFOD_MODEL_FILE, LABELS);
}
}
} // end class

View File

@ -0,0 +1,142 @@
/* Copyright (c) 2019 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import org.firstinspires.ftc.robotcore.external.hardware.camera.BuiltinCameraDirection;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.firstinspires.ftc.robotcore.external.tfod.Recognition;
import org.firstinspires.ftc.vision.VisionPortal;
import org.firstinspires.ftc.vision.tfod.TfodProcessor;
import java.util.List;
/*
* This OpMode illustrates the basics of TensorFlow Object Detection, using
* the simplest possible setup.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*/
@TeleOp(name = "Concept: TensorFlow Object Detection Easy", group = "Concept")
@Disabled
public class ConceptTensorFlowObjectDetectionEasy extends LinearOpMode {
private static final boolean USE_WEBCAM = true; // true for webcam, false for phone camera
/**
* The variable to store our instance of the TensorFlow Object Detection processor.
*/
private TfodProcessor tfod;
/**
* The variable to store our instance of the vision portal.
*/
private VisionPortal visionPortal;
@Override
public void runOpMode() {
initTfod();
// Wait for the DS start button to be touched.
telemetry.addData("DS preview on/off", "3 dots, Camera Stream");
telemetry.addData(">", "Touch Play to start OpMode");
telemetry.update();
waitForStart();
if (opModeIsActive()) {
while (opModeIsActive()) {
telemetryTfod();
// Push telemetry to the Driver Station.
telemetry.update();
// Save CPU resources; can resume streaming when needed.
if (gamepad1.dpad_down) {
visionPortal.stopStreaming();
} else if (gamepad1.dpad_up) {
visionPortal.resumeStreaming();
}
// Share the CPU.
sleep(20);
}
}
// Save more CPU resources when camera is no longer needed.
visionPortal.close();
} // end runOpMode()
/**
* Initialize the TensorFlow Object Detection processor.
*/
private void initTfod() {
// Create the TensorFlow processor the easy way.
tfod = TfodProcessor.easyCreateWithDefaults();
// Create the vision portal the easy way.
if (USE_WEBCAM) {
visionPortal = VisionPortal.easyCreateWithDefaults(
hardwareMap.get(WebcamName.class, "Webcam 1"), tfod);
} else {
visionPortal = VisionPortal.easyCreateWithDefaults(
BuiltinCameraDirection.BACK, tfod);
}
} // end method initTfod()
/**
* Add telemetry about TensorFlow Object Detection (TFOD) recognitions.
*/
private void telemetryTfod() {
List<Recognition> currentRecognitions = tfod.getRecognitions();
telemetry.addData("# Objects Detected", currentRecognitions.size());
// Step through the list of recognitions and display info for each one.
for (Recognition recognition : currentRecognitions) {
double x = (recognition.getLeft() + recognition.getRight()) / 2 ;
double y = (recognition.getTop() + recognition.getBottom()) / 2 ;
telemetry.addData(""," ");
telemetry.addData("Image", "%s (%.0f %% Conf.)", recognition.getLabel(), recognition.getConfidence() * 100);
telemetry.addData("- Position", "%.0f / %.0f", x, y);
telemetry.addData("- Size", "%.0f x %.0f", recognition.getWidth(), recognition.getHeight());
} // end for() loop
} // end method telemetryTfod()
} // end class

View File

@ -32,193 +32,155 @@ package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import java.util.List;
import org.firstinspires.ftc.robotcore.external.ClassFactory;
import org.firstinspires.ftc.robotcore.external.hardware.camera.SwitchableCamera;
import org.firstinspires.ftc.robotcore.external.hardware.camera.CameraName;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.firstinspires.ftc.robotcore.external.navigation.VuforiaLocalizer;
import org.firstinspires.ftc.robotcore.external.tfod.TFObjectDetector;
import org.firstinspires.ftc.robotcore.external.tfod.Recognition;
import org.firstinspires.ftc.vision.VisionPortal;
import org.firstinspires.ftc.vision.VisionPortal.CameraState;
import org.firstinspires.ftc.vision.tfod.TfodProcessor;
/**
* This 2022-2023 OpMode illustrates the basics of using the TensorFlow Object Detection API to
* determine which image is being presented to the robot.
import java.util.List;
/*
* This OpMode illustrates the basics of TensorFlow Object Detection, using
* two webcams.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this opmode to the Driver Station OpMode list.
*
* IMPORTANT: In order to use this OpMode, you need to obtain your own Vuforia license key as
* is explained below.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*/
@TeleOp(name = "Concept: TensorFlow Object Detection Switchable Cameras", group = "Concept")
@Disabled
public class ConceptTensorFlowObjectDetectionSwitchableCameras extends LinearOpMode {
/*
* Specify the source for the Tensor Flow Model.
* If the TensorFlowLite object model is included in the Robot Controller App as an "asset",
* the OpMode must load it using loadModelFromAsset(). However, if a team-generated model
* has been downloaded to the Robot Controller's SD FLASH memory, it must be loaded using loadModelFromFile().
* Here we assume it's an Asset. Also see method initTfod() below.
*/
private static final String TFOD_MODEL_ASSET = "PowerPlay.tflite";
// private static final String TFOD_MODEL_FILE = "/sdcard/FIRST/tflitemodels/CustomTeamModel.tflite";
private static final String[] LABELS = {
"1 Bolt",
"2 Bulb",
"3 Panel"
};
/*
* IMPORTANT: You need to obtain your own license key to use Vuforia. The string below with which
* 'parameters.vuforiaLicenseKey' is initialized is for illustration only, and will not function.
* A Vuforia 'Development' license key can be obtained free of charge from the Vuforia developer
* web site at https://developer.vuforia.com/license-manager.
*
* Vuforia license keys are always 380 characters long, and look as if they contain mostly
* random data. As an example, here is a fragment of a valid key:
* ... yIgIzTqZ4mWjk9wd3cZO9T1axEqzuhxoGlfOOI2dRzKS4T0hQ8kT ...
* Once you've obtained a license key, copy the string from the Vuforia web site
* and paste it into your code on the next line, between the double quotes.
*/
private static final String VUFORIA_KEY =
" -- YOUR NEW VUFORIA KEY GOES HERE --- ";
/**
* {@link #vuforia} is the variable we will use to store our instance of the Vuforia
* localization engine.
*/
private VuforiaLocalizer vuforia;
/**
* Variables used for switching cameras.
*/
private WebcamName webcam1, webcam2;
private SwitchableCamera switchableCamera;
private boolean oldLeftBumper;
private boolean oldRightBumper;
/**
* {@link #tfod} is the variable we will use to store our instance of the TensorFlow Object
* Detection engine.
* The variable to store our instance of the TensorFlow Object Detection processor.
*/
private TFObjectDetector tfod;
private TfodProcessor tfod;
/**
* The variable to store our instance of the vision portal.
*/
private VisionPortal visionPortal;
@Override
public void runOpMode() {
// The TFObjectDetector uses the camera frames from the VuforiaLocalizer, so we create that
// first.
initVuforia();
initTfod();
/**
* Activate TensorFlow Object Detection before we wait for the start command.
* Do it here so that the Camera Stream window will have the TensorFlow annotations visible.
**/
if (tfod != null) {
tfod.activate();
// The TensorFlow software will scale the input images from the camera to a lower resolution.
// This can result in lower detection accuracy at longer distances (> 55cm or 22").
// If your target is at a distance greater than 50 cm (20"), you can increase the magnification value
// to artificially zoom in to the center of image. For best results, the "aspectRatio" argument
// should be set to the value of the images used to create the TensorFlow Object Detection model
// (typically 16/9).
tfod.setZoom(1.0, 16.0/9.0);
}
/** Wait for the game to begin */
telemetry.addData(">", "Press Play to start op mode");
// Wait for the DS start button to be touched.
telemetry.addData("DS preview on/off", "3 dots, Camera Stream");
telemetry.addData(">", "Touch Play to start OpMode");
telemetry.update();
waitForStart();
if (opModeIsActive()) {
while (opModeIsActive()) {
if (tfod != null) {
doCameraSwitching();
List<Recognition> recognitions = tfod.getRecognitions();
telemetry.addData("# Objects Detected", recognitions.size());
// step through the list of recognitions and display image size and position
// Note: "Image number" refers to the randomized image orientation/number
for (Recognition recognition : recognitions) {
double col = (recognition.getLeft() + recognition.getRight()) / 2 ;
double row = (recognition.getTop() + recognition.getBottom()) / 2 ;
double width = Math.abs(recognition.getRight() - recognition.getLeft()) ;
double height = Math.abs(recognition.getTop() - recognition.getBottom()) ;
telemetry.addData(""," ");
telemetry.addData("Image", "%s (%.0f %% Conf.)", recognition.getLabel(), recognition.getConfidence() * 100 );
telemetry.addData("- Position (Row/Col)","%.0f / %.0f", row, col);
telemetry.addData("- Size (Width/Height)","%.0f / %.0f", width, height);
}
telemetryCameraSwitching();
telemetryTfod();
// Push telemetry to the Driver Station.
telemetry.update();
// Save CPU resources; can resume streaming when needed.
if (gamepad1.dpad_down) {
visionPortal.stopStreaming();
} else if (gamepad1.dpad_up) {
visionPortal.resumeStreaming();
}
doCameraSwitching();
// Share the CPU.
sleep(20);
}
}
}
}
// Save more CPU resources when camera is no longer needed.
visionPortal.close();
} // end runOpMode()
/**
* Initialize the Vuforia localization engine.
*/
private void initVuforia() {
/*
* Configure Vuforia by creating a Parameter object, and passing it to the Vuforia engine.
*/
VuforiaLocalizer.Parameters parameters = new VuforiaLocalizer.Parameters();
parameters.vuforiaLicenseKey = VUFORIA_KEY;
// Indicate that we wish to be able to switch cameras.
webcam1 = hardwareMap.get(WebcamName.class, "Webcam 1");
webcam2 = hardwareMap.get(WebcamName.class, "Webcam 2");
parameters.cameraName = ClassFactory.getInstance().getCameraManager().nameForSwitchableCamera(webcam1, webcam2);
// Instantiate the Vuforia engine
vuforia = ClassFactory.getInstance().createVuforia(parameters);
// Set the active camera to Webcam 1.
switchableCamera = (SwitchableCamera) vuforia.getCamera();
switchableCamera.setActiveCamera(webcam1);
}
/**
* Initialize the TensorFlow Object Detection engine.
* Initialize the TensorFlow Object Detection processor.
*/
private void initTfod() {
int tfodMonitorViewId = hardwareMap.appContext.getResources().getIdentifier(
"tfodMonitorViewId", "id", hardwareMap.appContext.getPackageName());
TFObjectDetector.Parameters tfodParameters = new TFObjectDetector.Parameters(tfodMonitorViewId);
tfodParameters.minResultConfidence = 0.75f;
tfodParameters.isModelTensorFlow2 = true;
tfodParameters.inputSize = 300;
tfod = ClassFactory.getInstance().createTFObjectDetector(tfodParameters, vuforia);
// Use loadModelFromAsset() if the TF Model is built in as an asset by Android Studio
// Use loadModelFromFile() if you have downloaded a custom team model to the Robot Controller's FLASH.
tfod.loadModelFromAsset(TFOD_MODEL_ASSET, LABELS);
// tfod.loadModelFromFile(TFOD_MODEL_FILE, LABELS);
}
// Create the TensorFlow processor by using a builder.
tfod = new TfodProcessor.Builder().build();
private void doCameraSwitching() {
// If the left bumper is pressed, use Webcam 1.
// If the right bumper is pressed, use Webcam 2.
boolean newLeftBumper = gamepad1.left_bumper;
boolean newRightBumper = gamepad1.right_bumper;
if (newLeftBumper && !oldLeftBumper) {
switchableCamera.setActiveCamera(webcam1);
} else if (newRightBumper && !oldRightBumper) {
switchableCamera.setActiveCamera(webcam2);
}
oldLeftBumper = newLeftBumper;
oldRightBumper = newRightBumper;
webcam1 = hardwareMap.get(WebcamName.class, "Webcam 1");
webcam2 = hardwareMap.get(WebcamName.class, "Webcam 2");
CameraName switchableCamera = ClassFactory.getInstance()
.getCameraManager().nameForSwitchableCamera(webcam1, webcam2);
if (switchableCamera.getActiveCamera().equals(webcam1)) {
// Create the vision portal by using a builder.
visionPortal = new VisionPortal.Builder()
.setCamera(switchableCamera)
.addProcessor(tfod)
.build();
} // end method initTfod()
/**
* Add telemetry about camera switching.
*/
private void telemetryCameraSwitching() {
if (visionPortal.getActiveCamera().equals(webcam1)) {
telemetry.addData("activeCamera", "Webcam 1");
telemetry.addData("Press RightBumper", "to switch to Webcam 2");
} else {
telemetry.addData("activeCamera", "Webcam 2");
telemetry.addData("Press LeftBumper", "to switch to Webcam 1");
}
} // end method telemetryCameraSwitching()
/**
* Add telemetry about TensorFlow Object Detection (TFOD) recognitions.
*/
private void telemetryTfod() {
List<Recognition> currentRecognitions = tfod.getRecognitions();
telemetry.addData("# Objects Detected", currentRecognitions.size());
// Step through the list of recognitions and display info for each one.
for (Recognition recognition : currentRecognitions) {
double x = (recognition.getLeft() + recognition.getRight()) / 2 ;
double y = (recognition.getTop() + recognition.getBottom()) / 2 ;
telemetry.addData(""," ");
telemetry.addData("Image", "%s (%.0f %% Conf.)", recognition.getLabel(), recognition.getConfidence() * 100);
telemetry.addData("- Position", "%.0f / %.0f", x, y);
telemetry.addData("- Size", "%.0f x %.0f", recognition.getWidth(), recognition.getHeight());
} // end for() loop
} // end method telemetryTfod()
/**
* Set the active camera according to input from the gamepad.
*/
private void doCameraSwitching() {
if (visionPortal.getCameraState() == CameraState.STREAMING) {
// If the left bumper is pressed, use Webcam 1.
// If the right bumper is pressed, use Webcam 2.
boolean newLeftBumper = gamepad1.left_bumper;
boolean newRightBumper = gamepad1.right_bumper;
if (newLeftBumper && !oldLeftBumper) {
visionPortal.setActiveCamera(webcam1);
} else if (newRightBumper && !oldRightBumper) {
visionPortal.setActiveCamera(webcam2);
}
oldLeftBumper = newLeftBumper;
oldRightBumper = newRightBumper;
}
} // end method doCameraSwitching()
} // end class

View File

@ -1,190 +0,0 @@
/* Copyright (c) 2019 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import java.util.List;
import org.firstinspires.ftc.robotcore.external.ClassFactory;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.firstinspires.ftc.robotcore.external.navigation.VuforiaLocalizer;
import org.firstinspires.ftc.robotcore.external.tfod.TFObjectDetector;
import org.firstinspires.ftc.robotcore.external.tfod.Recognition;
/**
* This 2022-2023 OpMode illustrates the basics of using the TensorFlow Object Detection API to
* determine which image is being presented to the robot.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this opmode to the Driver Station OpMode list.
*
* IMPORTANT: In order to use this OpMode, you need to obtain your own Vuforia license key as
* is explained below.
*/
@TeleOp(name = "Concept: TensorFlow Object Detection Webcam", group = "Concept")
@Disabled
public class ConceptTensorFlowObjectDetectionWebcam extends LinearOpMode {
/*
* Specify the source for the Tensor Flow Model.
* If the TensorFlowLite object model is included in the Robot Controller App as an "asset",
* the OpMode must load it using loadModelFromAsset(). However, if a team-generated model
* has been downloaded to the Robot Controller's SD FLASH memory, it must be loaded using loadModelFromFile().
* Here we assume it's an Asset. Also see method initTfod() below.
*/
private static final String TFOD_MODEL_ASSET = "PowerPlay.tflite";
// private static final String TFOD_MODEL_FILE = "/sdcard/FIRST/tflitemodels/CustomTeamModel.tflite";
private static final String[] LABELS = {
"1 Bolt",
"2 Bulb",
"3 Panel"
};
/*
* IMPORTANT: You need to obtain your own license key to use Vuforia. The string below with which
* 'parameters.vuforiaLicenseKey' is initialized is for illustration only, and will not function.
* A Vuforia 'Development' license key can be obtained free of charge from the Vuforia developer
* web site at https://developer.vuforia.com/license-manager.
*
* Vuforia license keys are always 380 characters long, and look as if they contain mostly
* random data. As an example, here is a fragment of a valid key:
* ... yIgIzTqZ4mWjk9wd3cZO9T1axEqzuhxoGlfOOI2dRzKS4T0hQ8kT ...
* Once you've obtained a license key, copy the string from the Vuforia web site
* and paste it into your code on the next line, between the double quotes.
*/
private static final String VUFORIA_KEY =
" -- YOUR NEW VUFORIA KEY GOES HERE --- ";
/**
* {@link #vuforia} is the variable we will use to store our instance of the Vuforia
* localization engine.
*/
private VuforiaLocalizer vuforia;
/**
* {@link #tfod} is the variable we will use to store our instance of the TensorFlow Object
* Detection engine.
*/
private TFObjectDetector tfod;
@Override
public void runOpMode() {
// The TFObjectDetector uses the camera frames from the VuforiaLocalizer, so we create that
// first.
initVuforia();
initTfod();
/**
* Activate TensorFlow Object Detection before we wait for the start command.
* Do it here so that the Camera Stream window will have the TensorFlow annotations visible.
**/
if (tfod != null) {
tfod.activate();
// The TensorFlow software will scale the input images from the camera to a lower resolution.
// This can result in lower detection accuracy at longer distances (> 55cm or 22").
// If your target is at a distance greater than 50 cm (20"), you can increase the magnification value
// to artificially zoom in to the center of image. For best results, the "aspectRatio" argument
// should be set to the value of the images used to create the TensorFlow Object Detection model
// (typically 16/9).
tfod.setZoom(1.0, 16.0/9.0);
}
/** Wait for the game to begin */
telemetry.addData(">", "Press Play to start op mode");
telemetry.update();
waitForStart();
if (opModeIsActive()) {
while (opModeIsActive()) {
if (tfod != null) {
// getUpdatedRecognitions() will return null if no new information is available since
// the last time that call was made.
List<Recognition> updatedRecognitions = tfod.getUpdatedRecognitions();
if (updatedRecognitions != null) {
telemetry.addData("# Objects Detected", updatedRecognitions.size());
// step through the list of recognitions and display image position/size information for each one
// Note: "Image number" refers to the randomized image orientation/number
for (Recognition recognition : updatedRecognitions) {
double col = (recognition.getLeft() + recognition.getRight()) / 2 ;
double row = (recognition.getTop() + recognition.getBottom()) / 2 ;
double width = Math.abs(recognition.getRight() - recognition.getLeft()) ;
double height = Math.abs(recognition.getTop() - recognition.getBottom()) ;
telemetry.addData(""," ");
telemetry.addData("Image", "%s (%.0f %% Conf.)", recognition.getLabel(), recognition.getConfidence() * 100 );
telemetry.addData("- Position (Row/Col)","%.0f / %.0f", row, col);
telemetry.addData("- Size (Width/Height)","%.0f / %.0f", width, height);
}
telemetry.update();
}
}
}
}
}
/**
* Initialize the Vuforia localization engine.
*/
private void initVuforia() {
/*
* Configure Vuforia by creating a Parameter object, and passing it to the Vuforia engine.
*/
VuforiaLocalizer.Parameters parameters = new VuforiaLocalizer.Parameters();
parameters.vuforiaLicenseKey = VUFORIA_KEY;
parameters.cameraName = hardwareMap.get(WebcamName.class, "Webcam 1");
// Instantiate the Vuforia engine
vuforia = ClassFactory.getInstance().createVuforia(parameters);
}
/**
* Initialize the TensorFlow Object Detection engine.
*/
private void initTfod() {
int tfodMonitorViewId = hardwareMap.appContext.getResources().getIdentifier(
"tfodMonitorViewId", "id", hardwareMap.appContext.getPackageName());
TFObjectDetector.Parameters tfodParameters = new TFObjectDetector.Parameters(tfodMonitorViewId);
tfodParameters.minResultConfidence = 0.75f;
tfodParameters.isModelTensorFlow2 = true;
tfodParameters.inputSize = 300;
tfod = ClassFactory.getInstance().createTFObjectDetector(tfodParameters, vuforia);
// Use loadModelFromAsset() if the TF Model is built in as an asset by Android Studio
// Use loadModelFromFile() if you have downloaded a custom team model to the Robot Controller's FLASH.
tfod.loadModelFromAsset(TFOD_MODEL_ASSET, LABELS);
// tfod.loadModelFromFile(TFOD_MODEL_FILE, LABELS);
}
}

View File

@ -1,186 +0,0 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import org.firstinspires.ftc.robotcore.external.ClassFactory;
import org.firstinspires.ftc.robotcore.external.matrices.OpenGLMatrix;
import org.firstinspires.ftc.robotcore.external.matrices.VectorF;
import org.firstinspires.ftc.robotcore.external.navigation.AngleUnit;
import org.firstinspires.ftc.robotcore.external.navigation.AxesOrder;
import org.firstinspires.ftc.robotcore.external.navigation.AxesReference;
import org.firstinspires.ftc.robotcore.external.navigation.Orientation;
import org.firstinspires.ftc.robotcore.external.navigation.RelicRecoveryVuMark;
import org.firstinspires.ftc.robotcore.external.navigation.VuMarkInstanceId;
import org.firstinspires.ftc.robotcore.external.navigation.VuforiaLocalizer;
import org.firstinspires.ftc.robotcore.external.navigation.VuforiaTrackable;
import org.firstinspires.ftc.robotcore.external.navigation.VuforiaTrackableDefaultListener;
import org.firstinspires.ftc.robotcore.external.navigation.VuforiaTrackables;
/**
* This OpMode illustrates the basics of using the Vuforia engine to determine
* the identity of Vuforia VuMarks encountered on the field. The code is structured as
* a LinearOpMode. It shares much structure with {@link ConceptVuforiaFieldNavigation}; we do not
* duplicate the core Vuforia documentation found there, but instead focus on the
* differences between the use of Vuforia for navigation vs VuMark identification.
*
* @see ConceptVuforiaFieldNavigation
* @see VuforiaLocalizer
* @see VuforiaTrackableDefaultListener
* see ftc_app/doc/tutorial/FTC_FieldCoordinateSystemDefinition.pdf
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*
* IMPORTANT: In order to use this OpMode, you need to obtain your own Vuforia license key as
* is explained below.
*/
@TeleOp(name="Concept: VuMark Id", group ="Concept")
@Disabled
public class ConceptVuMarkIdentification extends LinearOpMode {
public static final String TAG = "Vuforia VuMark Sample";
OpenGLMatrix lastLocation = null;
/**
* {@link #vuforia} is the variable we will use to store our instance of the Vuforia
* localization engine.
*/
VuforiaLocalizer vuforia;
@Override public void runOpMode() {
/*
* To start up Vuforia, tell it the view that we wish to use for camera monitor (on the RC phone);
*/
int cameraMonitorViewId = hardwareMap.appContext.getResources().getIdentifier("cameraMonitorViewId", "id", hardwareMap.appContext.getPackageName());
VuforiaLocalizer.Parameters parameters = new VuforiaLocalizer.Parameters(cameraMonitorViewId);
// OR... Do Not Activate the Camera Monitor View, to save power
// VuforiaLocalizer.Parameters parameters = new VuforiaLocalizer.Parameters();
/*
* IMPORTANT: You need to obtain your own license key to use Vuforia. The string below with which
* 'parameters.vuforiaLicenseKey' is initialized is for illustration only, and will not function.
* A Vuforia 'Development' license key can be obtained free of charge from the Vuforia developer
* web site at https://developer.vuforia.com/license-manager.
*
* Vuforia license keys are always 380 characters long, and look as if they contain mostly
* random data. As an example, here is a fragment of a valid key:
* ... yIgIzTqZ4mWjk9wd3cZO9T1axEqzuhxoGlfOOI2dRzKS4T0hQ8kT ...
* Once you've obtained a license key, copy the string from the Vuforia web site
* and paste it in to your code on the next line, between the double quotes.
*/
parameters.vuforiaLicenseKey = " -- YOUR NEW VUFORIA KEY GOES HERE --- ";
/*
* We also indicate which camera on the RC we wish to use.
* Here we chose the back (HiRes) camera (for greater range), but
* for a competition robot, the front camera might be more convenient.
*/
parameters.cameraDirection = VuforiaLocalizer.CameraDirection.BACK;
/**
* Instantiate the Vuforia engine
*/
vuforia = ClassFactory.getInstance().createVuforia(parameters);
/**
* Load the data set containing the VuMarks for Relic Recovery. There's only one trackable
* in this data set: all three of the VuMarks in the game were created from this one template,
* but differ in their instance id information.
* @see VuMarkInstanceId
*/
VuforiaTrackables relicTrackables = this.vuforia.loadTrackablesFromAsset("RelicVuMark");
VuforiaTrackable relicTemplate = relicTrackables.get(0);
relicTemplate.setName("relicVuMarkTemplate"); // can help in debugging; otherwise not necessary
telemetry.addData(">", "Press Play to start");
telemetry.update();
waitForStart();
relicTrackables.activate();
while (opModeIsActive()) {
/**
* See if any of the instances of {@link relicTemplate} are currently visible.
* {@link RelicRecoveryVuMark} is an enum which can have the following values:
* UNKNOWN, LEFT, CENTER, and RIGHT. When a VuMark is visible, something other than
* UNKNOWN will be returned by {@link RelicRecoveryVuMark#from(VuforiaTrackable)}.
*/
RelicRecoveryVuMark vuMark = RelicRecoveryVuMark.from(relicTemplate);
if (vuMark != RelicRecoveryVuMark.UNKNOWN) {
/* Found an instance of the template. In the actual game, you will probably
* loop until this condition occurs, then move on to act accordingly depending
* on which VuMark was visible. */
telemetry.addData("VuMark", "%s visible", vuMark);
/* For fun, we also exhibit the navigational pose. In the Relic Recovery game,
* it is perhaps unlikely that you will actually need to act on this pose information, but
* we illustrate it nevertheless, for completeness. */
OpenGLMatrix pose = ((VuforiaTrackableDefaultListener)relicTemplate.getListener()).getPose();
telemetry.addData("Pose", format(pose));
/* We further illustrate how to decompose the pose into useful rotational and
* translational components */
if (pose != null) {
VectorF trans = pose.getTranslation();
Orientation rot = Orientation.getOrientation(pose, AxesReference.EXTRINSIC, AxesOrder.XYZ, AngleUnit.DEGREES);
// Extract the X, Y, and Z components of the offset of the target relative to the robot
double tX = trans.get(0);
double tY = trans.get(1);
double tZ = trans.get(2);
// Extract the rotational components of the target relative to the robot
double rX = rot.firstAngle;
double rY = rot.secondAngle;
double rZ = rot.thirdAngle;
}
}
else {
telemetry.addData("VuMark", "not visible");
}
telemetry.update();
}
}
String format(OpenGLMatrix transformationMatrix) {
return (transformationMatrix != null) ? transformationMatrix.formatAsTransform() : "null";
}
}

View File

@ -1,194 +0,0 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import org.firstinspires.ftc.robotcore.external.ClassFactory;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.firstinspires.ftc.robotcore.external.matrices.OpenGLMatrix;
import org.firstinspires.ftc.robotcore.external.matrices.VectorF;
import org.firstinspires.ftc.robotcore.external.navigation.AngleUnit;
import org.firstinspires.ftc.robotcore.external.navigation.AxesOrder;
import org.firstinspires.ftc.robotcore.external.navigation.AxesReference;
import org.firstinspires.ftc.robotcore.external.navigation.Orientation;
import org.firstinspires.ftc.robotcore.external.navigation.RelicRecoveryVuMark;
import org.firstinspires.ftc.robotcore.external.navigation.VuMarkInstanceId;
import org.firstinspires.ftc.robotcore.external.navigation.VuforiaLocalizer;
import org.firstinspires.ftc.robotcore.external.navigation.VuforiaTrackable;
import org.firstinspires.ftc.robotcore.external.navigation.VuforiaTrackableDefaultListener;
import org.firstinspires.ftc.robotcore.external.navigation.VuforiaTrackables;
/**
* This OpMode illustrates the basics of using the Vuforia engine to determine
* the identity of Vuforia VuMarks encountered on the field. The code is structured as
* a LinearOpMode. It shares much structure with {@link ConceptVuforiaFieldNavigationWebcam}; we do not
* duplicate the core Vuforia documentation found there, but instead focus on the
* differences between the use of Vuforia for navigation vs VuMark identification.
*
* @see ConceptVuforiaFieldNavigationWebcam
* @see VuforiaLocalizer
* @see VuforiaTrackableDefaultListener
* see ftc_app/doc/tutorial/FTC_FieldCoordinateSystemDefinition.pdf
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*
* IMPORTANT: In order to use this OpMode, you need to obtain your own Vuforia license key as
* is explained below.
*/
@TeleOp(name="Concept: VuMark Id Webcam", group ="Concept")
@Disabled
public class ConceptVuMarkIdentificationWebcam extends LinearOpMode {
public static final String TAG = "Vuforia VuMark Sample";
OpenGLMatrix lastLocation = null;
/**
* {@link #vuforia} is the variable we will use to store our instance of the Vuforia
* localization engine.
*/
VuforiaLocalizer vuforia;
/**
* This is the webcam we are to use. As with other hardware devices such as motors and
* servos, this device is identified using the robot configuration tool in the FTC application.
*/
WebcamName webcamName;
@Override public void runOpMode() {
/*
* Retrieve the camera we are to use.
*/
webcamName = hardwareMap.get(WebcamName.class, "Webcam 1");
/*
* To start up Vuforia, tell it the view that we wish to use for camera monitor (on the RC phone);
* If no camera monitor is desired, use the parameterless constructor instead (commented out below).
*/
int cameraMonitorViewId = hardwareMap.appContext.getResources().getIdentifier("cameraMonitorViewId", "id", hardwareMap.appContext.getPackageName());
VuforiaLocalizer.Parameters parameters = new VuforiaLocalizer.Parameters(cameraMonitorViewId);
// OR... Do Not Activate the Camera Monitor View, to save power
// VuforiaLocalizer.Parameters parameters = new VuforiaLocalizer.Parameters();
/*
* IMPORTANT: You need to obtain your own license key to use Vuforia. The string below with which
* 'parameters.vuforiaLicenseKey' is initialized is for illustration only, and will not function.
* A Vuforia 'Development' license key can be obtained free of charge from the Vuforia developer
* web site at https://developer.vuforia.com/license-manager.
*
* Vuforia license keys are always 380 characters long, and look as if they contain mostly
* random data. As an example, here is a fragment of a valid key:
* ... yIgIzTqZ4mWjk9wd3cZO9T1axEqzuhxoGlfOOI2dRzKS4T0hQ8kT ...
* Once you've obtained a license key, copy the string from the Vuforia web site
* and paste it in to your code on the next line, between the double quotes.
*/
parameters.vuforiaLicenseKey = " -- YOUR NEW VUFORIA KEY GOES HERE --- ";
/**
* We also indicate which camera on the RC we wish to use. For pedagogical purposes,
* we use the same logic as in {@link ConceptVuforiaNavigationWebcam}.
*/
parameters.cameraName = webcamName;
this.vuforia = ClassFactory.getInstance().createVuforia(parameters);
/**
* Load the data set containing the VuMarks for Relic Recovery. There's only one trackable
* in this data set: all three of the VuMarks in the game were created from this one template,
* but differ in their instance id information.
* @see VuMarkInstanceId
*/
VuforiaTrackables relicTrackables = this.vuforia.loadTrackablesFromAsset("RelicVuMark");
VuforiaTrackable relicTemplate = relicTrackables.get(0);
relicTemplate.setName("relicVuMarkTemplate"); // can help in debugging; otherwise not necessary
telemetry.addData(">", "Press Play to start");
telemetry.update();
waitForStart();
relicTrackables.activate();
while (opModeIsActive()) {
/**
* See if any of the instances of {@link relicTemplate} are currently visible.
* {@link RelicRecoveryVuMark} is an enum which can have the following values:
* UNKNOWN, LEFT, CENTER, and RIGHT. When a VuMark is visible, something other than
* UNKNOWN will be returned by {@link RelicRecoveryVuMark#from(VuforiaTrackable)}.
*/
RelicRecoveryVuMark vuMark = RelicRecoveryVuMark.from(relicTemplate);
if (vuMark != RelicRecoveryVuMark.UNKNOWN) {
/* Found an instance of the template. In the actual game, you will probably
* loop until this condition occurs, then move on to act accordingly depending
* on which VuMark was visible. */
telemetry.addData("VuMark", "%s visible", vuMark);
/* For fun, we also exhibit the navigational pose. In the Relic Recovery game,
* it is perhaps unlikely that you will actually need to act on this pose information, but
* we illustrate it nevertheless, for completeness. */
OpenGLMatrix pose = ((VuforiaTrackableDefaultListener)relicTemplate.getListener()).getFtcCameraFromTarget();
telemetry.addData("Pose", format(pose));
/* We further illustrate how to decompose the pose into useful rotational and
* translational components */
if (pose != null) {
VectorF trans = pose.getTranslation();
Orientation rot = Orientation.getOrientation(pose, AxesReference.EXTRINSIC, AxesOrder.XYZ, AngleUnit.DEGREES);
// Extract the X, Y, and Z components of the offset of the target relative to the robot
double tX = trans.get(0);
double tY = trans.get(1);
double tZ = trans.get(2);
// Extract the rotational components of the target relative to the robot
double rX = rot.firstAngle;
double rY = rot.secondAngle;
double rZ = rot.thirdAngle;
}
}
else {
telemetry.addData("VuMark", "not visible");
}
telemetry.update();
}
}
String format(OpenGLMatrix transformationMatrix) {
return (transformationMatrix != null) ? transformationMatrix.formatAsTransform() : "null";
}
}

View File

@ -1,203 +0,0 @@
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.util.Range;
import org.firstinspires.ftc.robotcore.external.ClassFactory;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.firstinspires.ftc.robotcore.external.matrices.OpenGLMatrix;
import org.firstinspires.ftc.robotcore.external.matrices.VectorF;
import org.firstinspires.ftc.robotcore.external.navigation.VuforiaLocalizer;
import org.firstinspires.ftc.robotcore.external.navigation.VuforiaTrackable;
import org.firstinspires.ftc.robotcore.external.navigation.VuforiaTrackableDefaultListener;
import org.firstinspires.ftc.robotcore.external.navigation.VuforiaTrackables;
/**
* This OpMode illustrates using a webcam to locate and drive towards ANY Vuforia target.
* The code assumes a basic two-wheel Robot Configuration with motors named left_drive and right_drive.
* The motor directions must be set so a positive drive goes forward and a positive turn rotates to the right.
*
* Under manual control, the left stick will move forward/back, and the right stick will turn left/right.
* This is called POV Joystick mode, different from Tank Drive (where each joystick controls a wheel).
* Manually drive the robot until it displays Target data on the Driver Station.
* Press and hold the *Left Bumper* to enable the automatic "Drive to target" mode.
* Release the Left Bumper to return to manual driving mode.
*
* Use DESIRED_DISTANCE to set how close you want the robot to get to the target.
* Speed and Turn sensitivity can be adjusted using the SPEED_GAIN and TURN_GAIN constants.
*
* For more Vuforia details, or to adapt this OpMode for a phone camera, view the
* ConceptVuforiaFieldNavigation and ConceptVuforiaFieldNavigationWebcam samples.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this opmode to the Driver Station OpMode list.
*
* IMPORTANT: In order to use this OpMode, you need to obtain your own Vuforia license key as
* is explained below.
*/
@TeleOp(name="Drive To Target", group = "Concept")
@Disabled
public class ConceptVuforiaDriveToTargetWebcam extends LinearOpMode
{
// Adjust these numbers to suit your robot.
final double DESIRED_DISTANCE = 8.0; // this is how close the camera should get to the target (inches)
// The GAIN constants set the relationship between the measured position error,
// and how much power is applied to the drive motors. Drive = Error * Gain
// Make these values smaller for smoother control.
final double SPEED_GAIN = 0.02 ; // Speed Control "Gain". eg: Ramp up to 50% power at a 25 inch error. (0.50 / 25.0)
final double TURN_GAIN = 0.01 ; // Turn Control "Gain". eg: Ramp up to 25% power at a 25 degree error. (0.25 / 25.0)
final double MM_PER_INCH = 25.40 ; // Metric conversion
/*
* IMPORTANT: You need to obtain your own license key to use Vuforia. The string below with which
* 'parameters.vuforiaLicenseKey' is initialized is for illustration only, and will not function.
* A Vuforia 'Development' license key can be obtained free of charge from the Vuforia developer
* web site at https://developer.vuforia.com/license-manager.
*
* Vuforia license keys are always 380 characters long, and look as if they contain mostly
* random data. As an example, here is a fragment of a valid key:
* ... yIgIzTqZ4mWjk9wd3cZO9T1axEqzuhxoGlfOOI2dRzKS4T0hQ8kT ...
* Once you've obtained a license key, copy the string from the Vuforia web site
* and paste it in to your code on the next line, between the double quotes.
*/
private static final String VUFORIA_KEY =
" --- YOUR NEW VUFORIA KEY GOES HERE --- ";
VuforiaLocalizer vuforia = null;
OpenGLMatrix targetPose = null;
String targetName = "";
private DcMotor leftDrive = null;
private DcMotor rightDrive = null;
@Override public void runOpMode()
{
/*
* Configure Vuforia by creating a Parameter object, and passing it to the Vuforia engine.
* To get an on-phone camera preview, use the code below.
* If no camera preview is desired, use the parameter-less constructor instead (commented out below).
*/
int cameraMonitorViewId = hardwareMap.appContext.getResources().getIdentifier("cameraMonitorViewId", "id", hardwareMap.appContext.getPackageName());
VuforiaLocalizer.Parameters parameters = new VuforiaLocalizer.Parameters(cameraMonitorViewId);
// VuforiaLocalizer.Parameters parameters = new VuforiaLocalizer.Parameters();
parameters.vuforiaLicenseKey = VUFORIA_KEY;
// Turn off Extended tracking. Set this true if you want Vuforia to track beyond the target.
parameters.useExtendedTracking = false;
// Connect to the camera we are to use. This name must match what is set up in Robot Configuration
parameters.cameraName = hardwareMap.get(WebcamName.class, "Webcam 1");
this.vuforia = ClassFactory.getInstance().createVuforia(parameters);
// Load the trackable objects from the Assets file, and give them meaningful names
VuforiaTrackables targetsPowerPlay = this.vuforia.loadTrackablesFromAsset("PowerPlay");
targetsPowerPlay.get(0).setName("Red Audience Wall");
targetsPowerPlay.get(1).setName("Red Rear Wall");
targetsPowerPlay.get(2).setName("Blue Audience Wall");
targetsPowerPlay.get(3).setName("Blue Rear Wall");
// Start tracking targets in the background
targetsPowerPlay.activate();
// Initialize the hardware variables. Note that the strings used here as parameters
// to 'get' must correspond to the names assigned during the robot configuration
// step (using the FTC Robot Controller app on the phone).
leftDrive = hardwareMap.get(DcMotor.class, "left_drive");
rightDrive = hardwareMap.get(DcMotor.class, "right_drive");
// To drive forward, most robots need the motor on one side to be reversed, because the axles point in opposite directions.
// When run, this OpMode should start both motors driving forward. So adjust these two lines based on your first test drive.
// Note: The settings here assume direct drive on left and right wheels. Gear Reduction or 90 Deg drives may require direction flips
leftDrive.setDirection(DcMotor.Direction.REVERSE);
rightDrive.setDirection(DcMotor.Direction.FORWARD);
telemetry.addData(">", "Press Play to start");
telemetry.update();
waitForStart();
boolean targetFound = false; // Set to true when a target is detected by Vuforia
double targetRange = 0; // Distance from camera to target in Inches
double targetBearing = 0; // Robot Heading, relative to target. Positive degrees means target is to the right.
double drive = 0; // Desired forward power (-1 to +1)
double turn = 0; // Desired turning power (-1 to +1)
while (opModeIsActive())
{
// Look for first visible target, and save its pose.
targetFound = false;
for (VuforiaTrackable trackable : targetsPowerPlay)
{
if (((VuforiaTrackableDefaultListener) trackable.getListener()).isVisible())
{
targetPose = ((VuforiaTrackableDefaultListener)trackable.getListener()).getVuforiaCameraFromTarget();
// if we have a target, process the "pose" to determine the position of the target relative to the robot.
if (targetPose != null)
{
targetFound = true;
targetName = trackable.getName();
VectorF trans = targetPose.getTranslation();
// Extract the X & Y components of the offset of the target relative to the robot
double targetX = trans.get(0) / MM_PER_INCH; // Image X axis
double targetY = trans.get(2) / MM_PER_INCH; // Image Z axis
// Target range is the straight-line distance from the camera to the target (the hypotenuse of a right triangle).
targetRange = Math.hypot(targetX, targetY);
// Target bearing is the angle between the camera's forward axis and the line to the target.
targetBearing = Math.toDegrees(Math.asin(targetX / targetRange));
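// Worked example (illustrative): targetX = 12" and targetY = 16" give
// targetRange = hypot(12, 16) = 20" and targetBearing = asin(12/20) = 36.9 degrees
// (positive, so the target lies to the right of the camera axis).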
break; // jump out of target tracking loop if we find a target.
}
}
}
// Tell the driver what we see, and what to do.
if (targetFound) {
telemetry.addData(">","HOLD Left-Bumper to Drive to Target\n");
telemetry.addData("Target", " %s", targetName);
telemetry.addData("Range", "%5.1f inches", targetRange);
telemetry.addData("Bearing","%3.0f degrees", targetBearing);
} else {
telemetry.addData(">","Drive using joystick to find target\n");
}
// Drive to target Automatically if Left Bumper is being pressed, AND we have found a target.
if (gamepad1.left_bumper && targetFound) {
// Determine heading and range error so we can use them to control the robot automatically.
double rangeError = (targetRange - DESIRED_DISTANCE);
double headingError = targetBearing;
// Use the speed and turn "gains" to calculate how we want the robot to move.
drive = rangeError * SPEED_GAIN;
turn = headingError * TURN_GAIN ;
telemetry.addData("Auto","Drive %5.2f, Turn %5.2f", drive, turn);
} else {
// drive using manual POV Joystick mode.
drive = -gamepad1.left_stick_y / 2.0; // Reduce drive rate to 50%.
turn = gamepad1.right_stick_x / 4.0; // Reduce turn rate to 25%.
telemetry.addData("Manual","Drive %5.2f, Turn %5.2f", drive, turn);
}
telemetry.update();
// Calculate left and right wheel powers and send them to the motors.
double leftPower = Range.clip(drive + turn, -1.0, 1.0) ;
double rightPower = Range.clip(drive - turn, -1.0, 1.0) ;
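// Worked example (illustrative): drive = 0.50 and turn = 0.25 give leftPower = 0.75
// and rightPower = 0.25, so the robot arcs forward while turning to the right.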
leftDrive.setPower(leftPower);
rightDrive.setPower(rightPower);
sleep(10);
}
}
}

View File

@ -1,283 +0,0 @@
/* Copyright (c) 2019 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import org.firstinspires.ftc.robotcore.external.ClassFactory;
import org.firstinspires.ftc.robotcore.external.matrices.OpenGLMatrix;
import org.firstinspires.ftc.robotcore.external.matrices.VectorF;
import org.firstinspires.ftc.robotcore.external.navigation.Orientation;
import org.firstinspires.ftc.robotcore.external.navigation.VuforiaLocalizer;
import org.firstinspires.ftc.robotcore.external.navigation.VuforiaTrackable;
import org.firstinspires.ftc.robotcore.external.navigation.VuforiaTrackableDefaultListener;
import org.firstinspires.ftc.robotcore.external.navigation.VuforiaTrackables;
import java.util.ArrayList;
import java.util.List;
import static org.firstinspires.ftc.robotcore.external.navigation.AngleUnit.DEGREES;
import static org.firstinspires.ftc.robotcore.external.navigation.AxesOrder.XYZ;
import static org.firstinspires.ftc.robotcore.external.navigation.AxesOrder.YZX;
import static org.firstinspires.ftc.robotcore.external.navigation.AxesReference.EXTRINSIC;
import static org.firstinspires.ftc.robotcore.external.navigation.VuforiaLocalizer.CameraDirection.BACK;
/**
* This OpMode illustrates using the Vuforia localizer to determine positioning and orientation of
* the robot on the FTC field using the RC phone's camera. The code is structured as a LinearOpMode.
*
* Note: If you are using a WEBCAM see ConceptVuforiaFieldNavigationWebcam.java
*
* When images are located, Vuforia is able to determine the position and orientation of the
* image relative to the camera. This sample code then combines that information with a
* knowledge of where the target images are on the field, to determine the location of the camera.
*
* Finally, the location of the camera on the robot is used to determine the
* robot's location and orientation on the field.
*
* To learn more about the FTC field coordinate model, see FTC_FieldCoordinateSystemDefinition.pdf in this folder
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*
* IMPORTANT: In order to use this OpMode, you need to obtain your own Vuforia license key as
* is explained below.
*/
@TeleOp(name="Vuforia Field Nav", group ="Concept")
@Disabled
public class ConceptVuforiaFieldNavigation extends LinearOpMode {
// IMPORTANT: For Phone Camera, set 1) the camera source and 2) the orientation, based on how your phone is mounted:
// 1) Camera Source. Valid choices are: BACK (behind screen) or FRONT (selfie side)
// 2) Phone Orientation. Choices are: PHONE_IS_PORTRAIT = true (portrait) or PHONE_IS_PORTRAIT = false (landscape)
private static final VuforiaLocalizer.CameraDirection CAMERA_CHOICE = BACK;
private static final boolean PHONE_IS_PORTRAIT = false ;
/*
* IMPORTANT: You need to obtain your own license key to use Vuforia. The string below with which
* 'parameters.vuforiaLicenseKey' is initialized is for illustration only, and will not function.
* A Vuforia 'Development' license key can be obtained free of charge from the Vuforia developer
* web site at https://developer.vuforia.com/license-manager.
*
* Vuforia license keys are always 380 characters long, and look as if they contain mostly
* random data. As an example, here is a fragment of a valid key:
* ... yIgIzTqZ4mWjk9wd3cZO9T1axEqzuhxoGlfOOI2dRzKS4T0hQ8kT ...
* Once you've obtained a license key, copy the string from the Vuforia web site
* and paste it in to your code on the next line, between the double quotes.
*/
private static final String VUFORIA_KEY =
" -- YOUR NEW VUFORIA KEY GOES HERE --- ";
// Since ImageTarget trackables use mm to specify their dimensions, we must use mm for all the physical dimensions.
// We will define some constants and conversions here. These are useful for the FTC competition field.
private static final float mmPerInch = 25.4f;
private static final float mmTargetHeight = 6 * mmPerInch; // the height of the center of the target image above the floor
private static final float halfField = 72 * mmPerInch;
private static final float halfTile = 12 * mmPerInch;
private static final float oneAndHalfTile = 36 * mmPerInch;
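// Worked example (illustrative): halfField = 72 * 25.4 = 1828.8 mm, i.e. the field is
// 144 inches (3657.6 mm) across; mmTargetHeight = 6 * 25.4 = 152.4 mm above the floor.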
// Class Members
private OpenGLMatrix lastLocation = null;
private VuforiaLocalizer vuforia = null;
private VuforiaTrackables targets = null ;
private boolean targetVisible = false;
private float phoneXRotate = 0;
private float phoneYRotate = 0;
private float phoneZRotate = 0;
@Override public void runOpMode() {
/*
* Configure Vuforia by creating a Parameter object, and passing it to the Vuforia engine.
* To get an on-phone camera preview, use the code below.
* If no camera preview is desired, use the parameter-less constructor instead (commented out below).
*/
int cameraMonitorViewId = hardwareMap.appContext.getResources().getIdentifier("cameraMonitorViewId", "id", hardwareMap.appContext.getPackageName());
VuforiaLocalizer.Parameters parameters = new VuforiaLocalizer.Parameters(cameraMonitorViewId);
// VuforiaLocalizer.Parameters parameters = new VuforiaLocalizer.Parameters();
parameters.vuforiaLicenseKey = VUFORIA_KEY;
parameters.cameraDirection = CAMERA_CHOICE;
// Turn off Extended tracking. Set this true if you want Vuforia to track beyond the target.
parameters.useExtendedTracking = false;
// Instantiate the Vuforia engine
vuforia = ClassFactory.getInstance().createVuforia(parameters);
// Load the trackable assets.
targets = this.vuforia.loadTrackablesFromAsset("PowerPlay");
// For convenience, gather together all the trackable objects in one easily-iterable collection
List<VuforiaTrackable> allTrackables = new ArrayList<VuforiaTrackable>();
allTrackables.addAll(targets);
/**
* In order for localization to work, we need to tell the system where each target is on the field, and
* where the phone resides on the robot. These specifications are in the form of <em>transformation matrices.</em>
* Transformation matrices are a central, important concept in the math involved in localization.
* See <a href="https://en.wikipedia.org/wiki/Transformation_matrix">Transformation Matrix</a>
* for detailed information. Commonly, you'll encounter transformation matrices as instances
* of the {@link OpenGLMatrix} class.
*
* If you are standing in the Red Alliance Station looking towards the center of the field,
* - The X axis runs from your left to the right. (positive from the center to the right)
* - The Y axis runs from the Red Alliance Station towards the other side of the field
* where the Blue Alliance Station is. (Positive is from the center, towards the BlueAlliance station)
* - The Z axis runs from the floor, upwards towards the ceiling. (Positive is above the floor)
*
* Before being transformed, each target image is conceptually located at the origin of the field's
* coordinate system (the center of the field), facing up.
*/
// Name and locate each trackable object
identifyTarget(0, "Red Audience Wall", -halfField, -oneAndHalfTile, mmTargetHeight, 90, 0, 90);
identifyTarget(1, "Red Rear Wall", halfField, -oneAndHalfTile, mmTargetHeight, 90, 0, -90);
identifyTarget(2, "Blue Audience Wall", -halfField, oneAndHalfTile, mmTargetHeight, 90, 0, 90);
identifyTarget(3, "Blue Rear Wall", halfField, oneAndHalfTile, mmTargetHeight, 90, 0, -90);
/*
* Create a transformation matrix describing where the phone is on the robot.
*
* NOTE !!!! It's very important that you turn OFF your phone's Auto-Screen-Rotation option.
* Lock it into Portrait for these numbers to work.
*
* Info: The coordinate frame for the robot looks the same as the field.
* The robot's "forward" direction is facing out along X axis, with the LEFT side facing out along the Y axis.
* Z is UP on the robot. This equates to a heading angle of Zero degrees.
*
* The phone starts out lying flat, with the screen facing Up and with the physical top of the phone
* pointing to the LEFT side of the Robot.
* The two examples below assume that the camera is facing forward out the front of the robot.
*/
// We need to rotate the camera around its long axis to bring the correct camera forward.
if (CAMERA_CHOICE == BACK) {
phoneYRotate = -90;
} else {
phoneYRotate = 90;
}
// Rotate the phone vertical about the X axis if it's in portrait mode
if (PHONE_IS_PORTRAIT) {
phoneXRotate = 90 ;
}
// Next, translate the camera lens to where it is on the robot.
// In this example, it is centered on the robot (left-to-right and front-to-back), and 6 inches above ground level.
final float CAMERA_FORWARD_DISPLACEMENT = 0.0f * mmPerInch; // eg: Enter the forward distance from the center of the robot to the camera lens
final float CAMERA_VERTICAL_DISPLACEMENT = 6.0f * mmPerInch; // eg: Camera is 6 Inches above ground
final float CAMERA_LEFT_DISPLACEMENT = 0.0f * mmPerInch; // eg: Enter the left distance from the center of the robot to the camera lens
OpenGLMatrix robotFromCamera = OpenGLMatrix
.translation(CAMERA_FORWARD_DISPLACEMENT, CAMERA_LEFT_DISPLACEMENT, CAMERA_VERTICAL_DISPLACEMENT)
.multiplied(Orientation.getRotationMatrix(EXTRINSIC, YZX, DEGREES, phoneYRotate, phoneZRotate, phoneXRotate));
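// Worked example: with the defaults above (CAMERA_CHOICE = BACK, PHONE_IS_PORTRAIT = false),
// the rotation reduces to YZX(-90, 0, 0), a single -90 degree turn about Y that points
// the phone's back camera out the front of the robot.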
/** Let all the trackable listeners know where the phone is. */
for (VuforiaTrackable trackable : allTrackables) {
((VuforiaTrackableDefaultListener) trackable.getListener()).setPhoneInformation(robotFromCamera, parameters.cameraDirection);
}
/*
* WARNING:
* In this sample, we do not wait for PLAY to be pressed. Target Tracking is started immediately when INIT is pressed.
* This sequence is used to enable the new remote DS Camera Stream feature to be used with this sample.
* CONSEQUENTLY do not put any driving commands in this loop.
* To restore the normal opmode structure, just un-comment the following line:
*/
// waitForStart();
/* Note: To use the remote camera preview:
* AFTER you hit Init on the Driver Station, use the "options menu" to select "Camera Stream"
* Tap the preview window to receive a fresh image.
* It is not permitted to transition to RUN while the camera preview window is active.
* Either press STOP to exit the OpMode, or use the "options menu" again, and select "Camera Stream" to close the preview window.
*/
targets.activate();
while (!isStopRequested()) {
// check all the trackable targets to see which one (if any) is visible.
targetVisible = false;
for (VuforiaTrackable trackable : allTrackables) {
if (((VuforiaTrackableDefaultListener)trackable.getListener()).isVisible()) {
telemetry.addData("Visible Target", trackable.getName());
targetVisible = true;
// getUpdatedRobotLocation() will return null if no new information is available since
// the last time that call was made, or if the trackable is not currently visible.
OpenGLMatrix robotLocationTransform = ((VuforiaTrackableDefaultListener)trackable.getListener()).getUpdatedRobotLocation();
if (robotLocationTransform != null) {
lastLocation = robotLocationTransform;
}
break;
}
}
// Provide feedback as to where the robot is located (if we know).
if (targetVisible) {
// express position (translation) of robot in inches.
VectorF translation = lastLocation.getTranslation();
telemetry.addData("Pos (inches)", "{X, Y, Z} = %.1f, %.1f, %.1f",
translation.get(0) / mmPerInch, translation.get(1) / mmPerInch, translation.get(2) / mmPerInch);
// express the rotation of the robot in degrees.
Orientation rotation = Orientation.getOrientation(lastLocation, EXTRINSIC, XYZ, DEGREES);
telemetry.addData("Rot (deg)", "{Roll, Pitch, Heading} = %.0f, %.0f, %.0f", rotation.firstAngle, rotation.secondAngle, rotation.thirdAngle);
}
else {
telemetry.addData("Visible Target", "none");
}
telemetry.update();
}
// Disable Tracking when we are done;
targets.deactivate();
}
/**
* Identify a target by naming it, and setting its position and orientation on the field
* @param targetIndex index of the target within the loaded trackable set
* @param targetName name to assign to the target
* @param dx, dy, dz Target offsets in x,y,z axes
* @param rx, ry, rz Target rotations in x,y,z axes
*/
void identifyTarget(int targetIndex, String targetName, float dx, float dy, float dz, float rx, float ry, float rz) {
VuforiaTrackable aTarget = targets.get(targetIndex);
aTarget.setName(targetName);
aTarget.setLocation(OpenGLMatrix.translation(dx, dy, dz)
.multiplied(Orientation.getRotationMatrix(EXTRINSIC, XYZ, DEGREES, rx, ry, rz)));
}
}

View File

@ -1,274 +0,0 @@
/* Copyright (c) 2019 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import org.firstinspires.ftc.robotcore.external.ClassFactory;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.firstinspires.ftc.robotcore.external.matrices.OpenGLMatrix;
import org.firstinspires.ftc.robotcore.external.matrices.VectorF;
import org.firstinspires.ftc.robotcore.external.navigation.Orientation;
import org.firstinspires.ftc.robotcore.external.navigation.VuforiaLocalizer;
import org.firstinspires.ftc.robotcore.external.navigation.VuforiaTrackable;
import org.firstinspires.ftc.robotcore.external.navigation.VuforiaTrackableDefaultListener;
import org.firstinspires.ftc.robotcore.external.navigation.VuforiaTrackables;
import java.util.ArrayList;
import java.util.List;
import static org.firstinspires.ftc.robotcore.external.navigation.AngleUnit.DEGREES;
import static org.firstinspires.ftc.robotcore.external.navigation.AxesOrder.XYZ;
import static org.firstinspires.ftc.robotcore.external.navigation.AxesOrder.XZY;
import static org.firstinspires.ftc.robotcore.external.navigation.AxesReference.EXTRINSIC;
/**
* This OpMode illustrates using the Vuforia localizer to determine positioning and orientation of
* the robot on the FTC field using a WEBCAM. The code is structured as a LinearOpMode.
*
* NOTE: If you are running on a Phone with a built-in camera, use the ConceptVuforiaFieldNavigation example instead of this one.
* NOTE: It is possible to switch between multiple WebCams (eg: one for the left side and one for the right).
* For a related example of how to do this, see ConceptTensorFlowObjectDetectionSwitchableCameras
*
* When images are located, Vuforia is able to determine the position and orientation of the
* image relative to the camera. This sample code then combines that information with a
* knowledge of where the target images are on the field, to determine the location of the camera.
*
* Finally, the location of the camera on the robot is used to determine the
* robot's location and orientation on the field.
*
* To learn more about the FTC field coordinate model, see FTC_FieldCoordinateSystemDefinition.pdf in this folder
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*
* IMPORTANT: In order to use this OpMode, you need to obtain your own Vuforia license key as
* is explained below.
*/
@TeleOp(name="Vuforia Field Nav Webcam", group ="Concept")
@Disabled
public class ConceptVuforiaFieldNavigationWebcam extends LinearOpMode {
/*
* IMPORTANT: You need to obtain your own license key to use Vuforia. The string below with which
* 'parameters.vuforiaLicenseKey' is initialized is for illustration only, and will not function.
* A Vuforia 'Development' license key can be obtained free of charge from the Vuforia developer
* web site at https://developer.vuforia.com/license-manager.
*
* Vuforia license keys are always 380 characters long, and look as if they contain mostly
* random data. As an example, here is a fragment of a valid key:
* ... yIgIzTqZ4mWjk9wd3cZO9T1axEqzuhxoGlfOOI2dRzKS4T0hQ8kT ...
* Once you've obtained a license key, copy the string from the Vuforia web site
* and paste it in to your code on the next line, between the double quotes.
*/
private static final String VUFORIA_KEY =
" --- YOUR NEW VUFORIA KEY GOES HERE --- ";
// Since ImageTarget trackables use mm to specify their dimensions, we must use mm for all the physical dimensions.
// We will define some constants and conversions here
private static final float mmPerInch = 25.4f;
private static final float mmTargetHeight = 6 * mmPerInch; // the height of the center of the target image above the floor
private static final float halfField = 72 * mmPerInch;
private static final float halfTile = 12 * mmPerInch;
private static final float oneAndHalfTile = 36 * mmPerInch;
// Class Members
private OpenGLMatrix lastLocation = null;
private VuforiaLocalizer vuforia = null;
private VuforiaTrackables targets = null ;
private WebcamName webcamName = null;
private boolean targetVisible = false;
@Override public void runOpMode() {
// Connect to the camera we are to use. This name must match what is set up in Robot Configuration
webcamName = hardwareMap.get(WebcamName.class, "Webcam 1");
/*
* Configure Vuforia by creating a Parameter object, and passing it to the Vuforia engine.
* We can pass Vuforia the handle to a camera preview resource (on the RC screen);
* If no camera-preview is desired, use the parameter-less constructor instead (commented out below).
* Note: A preview window is required if you want to view the camera stream on the Driver Station Phone.
*/
int cameraMonitorViewId = hardwareMap.appContext.getResources().getIdentifier("cameraMonitorViewId", "id", hardwareMap.appContext.getPackageName());
VuforiaLocalizer.Parameters parameters = new VuforiaLocalizer.Parameters(cameraMonitorViewId);
// VuforiaLocalizer.Parameters parameters = new VuforiaLocalizer.Parameters();
parameters.vuforiaLicenseKey = VUFORIA_KEY;
// We also indicate which camera we wish to use.
parameters.cameraName = webcamName;
// Turn off Extended tracking. Set this true if you want Vuforia to track beyond the target.
parameters.useExtendedTracking = false;
// Instantiate the Vuforia engine
vuforia = ClassFactory.getInstance().createVuforia(parameters);
// Load the data sets for the trackable objects. These particular data
// sets are stored in the 'assets' part of our application.
targets = this.vuforia.loadTrackablesFromAsset("PowerPlay");
// For convenience, gather together all the trackable objects in one easily-iterable collection
List<VuforiaTrackable> allTrackables = new ArrayList<VuforiaTrackable>();
allTrackables.addAll(targets);
/**
* In order for localization to work, we need to tell the system where each target is on the field, and
* where the phone resides on the robot. These specifications are in the form of <em>transformation matrices.</em>
* Transformation matrices are a central, important concept in the math involved in localization.
* See <a href="https://en.wikipedia.org/wiki/Transformation_matrix">Transformation Matrix</a>
* for detailed information. Commonly, you'll encounter transformation matrices as instances
* of the {@link OpenGLMatrix} class.
*
* If you are standing in the Red Alliance Station looking towards the center of the field,
* - The X axis runs from your left to the right. (positive from the center to the right)
* - The Y axis runs from the Red Alliance Station towards the other side of the field
* where the Blue Alliance Station is. (Positive is from the center, towards the BlueAlliance station)
* - The Z axis runs from the floor, upwards towards the ceiling. (Positive is above the floor)
*
* Before being transformed, each target image is conceptually located at the origin of the field's
* coordinate system (the center of the field), facing up.
*/
// Name and locate each trackable object
identifyTarget(0, "Red Audience Wall", -halfField, -oneAndHalfTile, mmTargetHeight, 90, 0, 90);
identifyTarget(1, "Red Rear Wall", halfField, -oneAndHalfTile, mmTargetHeight, 90, 0, -90);
identifyTarget(2, "Blue Audience Wall", -halfField, oneAndHalfTile, mmTargetHeight, 90, 0, 90);
identifyTarget(3, "Blue Rear Wall", halfField, oneAndHalfTile, mmTargetHeight, 90, 0, -90);
/*
* Create a transformation matrix describing where the camera is on the robot.
*
* Info: The coordinate frame for the robot looks the same as the field.
* The robot's "forward" direction is facing out along X axis, with the LEFT side facing out along the Y axis.
* Z is UP on the robot. This equates to a bearing angle of Zero degrees.
*
* For a WebCam, the default starting orientation of the camera is looking UP (pointing in the Z direction),
* with the wide (horizontal) axis of the camera aligned with the X axis, and
* the Narrow (vertical) axis of the camera aligned with the Y axis
*
* But, this example assumes that the camera is actually facing forward out the front of the robot.
* So, the "default" camera position requires two rotations to get it oriented correctly.
* 1) First it must be rotated +90 degrees around the X axis to get it horizontal (its now facing out the right side of the robot)
* 2) Next it must be rotated +90 degrees (counter-clockwise) around the Z axis to face forward.
*
* Finally the camera can be translated to its actual mounting position on the robot.
* In this example, it is centered on the robot (left-to-right and front-to-back), and 6 inches above ground level.
*/
final float CAMERA_FORWARD_DISPLACEMENT = 0.0f * mmPerInch; // eg: Enter the forward distance from the center of the robot to the camera lens
final float CAMERA_VERTICAL_DISPLACEMENT = 6.0f * mmPerInch; // eg: Camera is 6 Inches above ground
final float CAMERA_LEFT_DISPLACEMENT = 0.0f * mmPerInch; // eg: Enter the left distance from the center of the robot to the camera lens
OpenGLMatrix cameraLocationOnRobot = OpenGLMatrix
.translation(CAMERA_FORWARD_DISPLACEMENT, CAMERA_LEFT_DISPLACEMENT, CAMERA_VERTICAL_DISPLACEMENT)
.multiplied(Orientation.getRotationMatrix(EXTRINSIC, XZY, DEGREES, 90, 90, 0));
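// The XZY(90, 90, 0) rotation above encodes the two steps just described: +90 degrees
// about X (camera horizontal, facing the robot's right side), then +90 degrees about Z
// (camera facing forward out the front of the robot).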
/** Let all the trackable listeners know where the camera is. */
for (VuforiaTrackable trackable : allTrackables) {
((VuforiaTrackableDefaultListener) trackable.getListener()).setCameraLocationOnRobot(parameters.cameraName, cameraLocationOnRobot);
}
/*
* WARNING:
* In this sample, we do not wait for PLAY to be pressed. Target Tracking is started immediately when INIT is pressed.
* This sequence is used to enable the new remote DS Camera Preview feature to be used with this sample.
* CONSEQUENTLY do not put any driving commands in this loop.
* To restore the normal opmode structure, just un-comment the following line:
*/
// waitForStart();
/* Note: To use the remote camera preview:
* AFTER you hit Init on the Driver Station, use the "options menu" to select "Camera Stream"
* Tap the preview window to receive a fresh image.
* It is not permitted to transition to RUN while the camera preview window is active.
* Either press STOP to exit the OpMode, or use the "options menu" again, and select "Camera Stream" to close the preview window.
*/
targets.activate();
while (!isStopRequested()) {
// check all the trackable targets to see which one (if any) is visible.
targetVisible = false;
for (VuforiaTrackable trackable : allTrackables) {
if (((VuforiaTrackableDefaultListener)trackable.getListener()).isVisible()) {
telemetry.addData("Visible Target", trackable.getName());
targetVisible = true;
// getUpdatedRobotLocation() will return null if no new information is available since
// the last time that call was made, or if the trackable is not currently visible.
OpenGLMatrix robotLocationTransform = ((VuforiaTrackableDefaultListener)trackable.getListener()).getUpdatedRobotLocation();
if (robotLocationTransform != null) {
lastLocation = robotLocationTransform;
}
break;
}
}
// Provide feedback as to where the robot is located (if we know).
if (targetVisible) {
// express position (translation) of robot in inches.
VectorF translation = lastLocation.getTranslation();
telemetry.addData("Pos (inches)", "{X, Y, Z} = %.1f, %.1f, %.1f",
translation.get(0) / mmPerInch, translation.get(1) / mmPerInch, translation.get(2) / mmPerInch);
// express the rotation of the robot in degrees.
Orientation rotation = Orientation.getOrientation(lastLocation, EXTRINSIC, XYZ, DEGREES);
telemetry.addData("Rot (deg)", "{Roll, Pitch, Heading} = %.0f, %.0f, %.0f", rotation.firstAngle, rotation.secondAngle, rotation.thirdAngle);
}
else {
telemetry.addData("Visible Target", "none");
}
telemetry.update();
}
// Disable Tracking when we are done;
targets.deactivate();
}
/**
* Identify a target by naming it, and setting its position and orientation on the field
* @param targetIndex index of the target within the loaded trackable set
* @param targetName name to assign to the target
* @param dx, dy, dz Target offsets in x,y,z axes
* @param rx, ry, rz Target rotations in x,y,z axes
*/
void identifyTarget(int targetIndex, String targetName, float dx, float dy, float dz, float rx, float ry, float rz) {
VuforiaTrackable aTarget = targets.get(targetIndex);
aTarget.setName(targetName);
aTarget.setLocation(OpenGLMatrix.translation(dx, dy, dz)
.multiplied(Orientation.getRotationMatrix(EXTRINSIC, XYZ, DEGREES, rx, ry, rz)));
}
}

View File

@ -33,11 +33,10 @@ import com.qualcomm.robotcore.eventloop.opmode.Autonomous;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.hardware.Servo;
import com.qualcomm.robotcore.util.ElapsedTime;
/**
* This file illustrates the concept of driving a path based on encoder counts.
/*
* This OpMode illustrates the concept of driving a path based on encoder counts.
* The code is structured as a LinearOpMode
*
* The code REQUIRES that you DO have encoders on the wheels,
@ -59,7 +58,7 @@ import com.qualcomm.robotcore.util.ElapsedTime;
* This code uses the RUN_TO_POSITION mode to enable the Motor controllers to generate the run profile
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this opmode to the Driver Station OpMode list
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*/
@Autonomous(name="Robot: Auto Drive By Encoder", group="Robot")
@ -131,7 +130,7 @@ public class RobotAutoDriveByEncoder_Linear extends LinearOpMode {
* Move will stop if any of three conditions occur:
* 1) Move gets to the desired position
* 2) Move runs out of time
* 3) Driver stops the opmode running.
* 3) Driver stops the OpMode running.
*/
public void encoderDrive(double speed,
double leftInches, double rightInches,
@ -139,7 +138,7 @@ public class RobotAutoDriveByEncoder_Linear extends LinearOpMode {
int newLeftTarget;
int newRightTarget;
// Ensure that the opmode is still active
// Ensure that the OpMode is still active
if (opModeIsActive()) {
// Determine new target position, and pass to motor controller

View File

@ -29,21 +29,19 @@
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.hardware.bosch.BNO055IMU;
import com.qualcomm.hardware.rev.RevHubOrientationOnRobot;
import com.qualcomm.robotcore.eventloop.opmode.Autonomous;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.hardware.IMU;
import com.qualcomm.robotcore.util.ElapsedTime;
import com.qualcomm.robotcore.util.Range;
import org.firstinspires.ftc.robotcore.external.navigation.AngleUnit;
import org.firstinspires.ftc.robotcore.external.navigation.AxesOrder;
import org.firstinspires.ftc.robotcore.external.navigation.AxesReference;
import org.firstinspires.ftc.robotcore.external.navigation.Orientation;
import org.firstinspires.ftc.robotcore.external.navigation.YawPitchRollAngles;
/**
* This file illustrates the concept of driving an autonomous path based on Gyro heading and encoder counts.
/*
* This OpMode illustrates the concept of driving an autonomous path based on Gyro (IMU) heading and encoder counts.
* The code is structured as a LinearOpMode
*
* The path to be followed by the robot is built from a series of drive, turn or pause steps.
@ -51,9 +49,10 @@ import org.firstinspires.ftc.robotcore.external.navigation.Orientation;
*
* The code REQUIRES that you have encoders on the drive motors, otherwise you should use: RobotAutoDriveByTime;
*
* This code ALSO requires that you have a BOSCH BNO055 IMU, otherwise you would use: RobotAutoDriveByEncoder;
* This IMU is found in REV Control/Expansion Hubs shipped prior to July 2022, and possibly also on later models.
* This code uses the Universal IMU interface so it will work with either the BNO055 or BHI260 IMU.
* To run as written, the Control/Expansion hub should be mounted horizontally on a flat part of the robot chassis.
* The REV Logo should be facing UP, and the USB port should be facing forward.
* If this is not the configuration of your REV Control Hub, then the code should be modified to reflect the correct orientation.
*
* This sample requires that the drive Motors have been configured with names : left_drive and right_drive.
* It also requires that a positive power command moves both motors forward, and causes the encoders to count UP.
@ -61,7 +60,7 @@ import org.firstinspires.ftc.robotcore.external.navigation.Orientation;
* See the beginning of runOpMode() to set the FORWARD/REVERSE option for each motor.
*
* This code uses RUN_TO_POSITION mode for driving straight, and RUN_USING_ENCODER mode for turning and holding.
* Note: You must call setTargetPosition() at least once before switching to RUN_TO_POSITION mode.
* Note: This code implements the requirement of calling setTargetPosition() at least once before switching to RUN_TO_POSITION mode.
*
* Notes:
*
@ -72,7 +71,7 @@ import org.firstinspires.ftc.robotcore.external.navigation.Orientation;
* The angle of movement/rotation is assumed to be a standardized rotation around the robot Z axis,
* which means that a Positive rotation is Counter Clockwise, looking down on the field.
* This is consistent with the FTC field coordinate conventions set out in the document:
* ftc_app\doc\tutorial\FTC_FieldCoordinateSystemDefinition.pdf
* https://ftc-docs.firstinspires.org/field-coordinate-system
*
* Control Approach.
*
@ -96,10 +95,8 @@ public class RobotAutoDriveByGyro_Linear extends LinearOpMode {
/* Declare OpMode members. */
private DcMotor leftDrive = null;
private DcMotor rightDrive = null;
private BNO055IMU imu = null; // Control/Expansion Hub IMU
private IMU imu = null; // Control/Expansion Hub IMU
private double robotHeading = 0;
private double headingOffset = 0;
private double headingError = 0;
// These variable are declared here (as class members) so they can be updated in various methods,
@ -151,11 +148,19 @@ public class RobotAutoDriveByGyro_Linear extends LinearOpMode {
leftDrive.setDirection(DcMotor.Direction.REVERSE);
rightDrive.setDirection(DcMotor.Direction.FORWARD);
// define initialization values for IMU, and then initialize it.
BNO055IMU.Parameters parameters = new BNO055IMU.Parameters();
parameters.angleUnit = BNO055IMU.AngleUnit.DEGREES;
imu = hardwareMap.get(BNO055IMU.class, "imu");
imu.initialize(parameters);
/* The next two lines define Hub orientation.
* The Default Orientation (shown) is when a hub is mounted horizontally with the printed logo pointing UP and the USB port pointing FORWARD.
*
* To Do: EDIT these two lines to match YOUR mounting configuration.
*/
RevHubOrientationOnRobot.LogoFacingDirection logoDirection = RevHubOrientationOnRobot.LogoFacingDirection.UP;
RevHubOrientationOnRobot.UsbFacingDirection usbDirection = RevHubOrientationOnRobot.UsbFacingDirection.FORWARD;
RevHubOrientationOnRobot orientationOnRobot = new RevHubOrientationOnRobot(logoDirection, usbDirection);
// Now initialize the IMU with this mounting orientation
// This sample expects the IMU to be in a REV Hub and named "imu".
imu = hardwareMap.get(IMU.class, "imu");
imu.initialize(new IMU.Parameters(orientationOnRobot));
// Ensure the robot is stationary. Reset the encoders and set the motors to BRAKE mode
leftDrive.setMode(DcMotor.RunMode.STOP_AND_RESET_ENCODER);
@ -165,14 +170,14 @@ public class RobotAutoDriveByGyro_Linear extends LinearOpMode {
// Wait for the game to start (Display Gyro value while waiting)
while (opModeInInit()) {
telemetry.addData(">", "Robot Heading = %4.0f", getRawHeading());
telemetry.addData(">", "Robot Heading = %4.0f", getHeading());
telemetry.update();
}
// Set the encoders for closed loop speed control, and reset the heading.
leftDrive.setMode(DcMotor.RunMode.RUN_USING_ENCODER);
rightDrive.setMode(DcMotor.RunMode.RUN_USING_ENCODER);
resetHeading();
imu.resetYaw();
// Step through each leg of the path,
// Notes: Reverse movement is obtained by setting a negative distance (not speed)
@ -208,10 +213,10 @@ public class RobotAutoDriveByGyro_Linear extends LinearOpMode {
// ********** HIGH Level driving functions. ********************
/**
* Method to drive in a straight line, on a fixed compass heading (angle), based on encoder counts.
* Drive in a straight line, on a fixed compass heading (angle), based on encoder counts.
* Move will stop if either of these conditions occur:
* 1) Move gets to the desired position
* 2) Driver stops the opmode running.
* 2) Driver stops the OpMode running.
*
* @param maxDriveSpeed MAX Speed for forward/rev motion (range 0 to +1.0) .
* @param distance Distance (in inches) to move from current position. Negative distance means move backward.
@ -223,7 +228,7 @@ public class RobotAutoDriveByGyro_Linear extends LinearOpMode {
double distance,
double heading) {
// Ensure that the opmode is still active
// Ensure that the OpMode is still active
if (opModeIsActive()) {
// Determine new target position, and pass to motor controller
@ -269,10 +274,13 @@ public class RobotAutoDriveByGyro_Linear extends LinearOpMode {
}
/**
* Method to spin on central axis to point in a new direction.
* Spin on the central axis to point in a new direction.
* <p>
* Move will stop if either of these conditions occur:
* <p>
* 1) Move gets to the heading (angle)
* 2) Driver stops the opmode running.
* <p>
* 2) Driver stops the OpMode running.
*
* @param maxTurnSpeed Desired MAX speed of turn. (range 0 to +1.0)
* @param heading Absolute Heading Angle (in Degrees) relative to last gyro reset.
@ -305,8 +313,10 @@ public class RobotAutoDriveByGyro_Linear extends LinearOpMode {
}
/**
* Method to obtain & hold a heading for a finite amount of time
* Obtain & hold a heading for a finite amount of time
* <p>
* Move will stop once the requested time has elapsed
* <p>
* This function is useful for giving the robot a moment to stabilize its heading between movements.
*
* @param maxTurnSpeed Maximum differential turn speed (range 0 to +1.0)
@ -342,7 +352,7 @@ public class RobotAutoDriveByGyro_Linear extends LinearOpMode {
// ********** LOW Level driving functions. ********************
/**
* This method uses a Proportional Controller to determine how much steering correction is required.
* Use a Proportional Controller to determine how much steering correction is required.
*
* @param desiredHeading The desired absolute heading (relative to last heading reset)
* @param proportionalGain Gain factor applied to heading error to obtain turning power.
@ -351,11 +361,8 @@ public class RobotAutoDriveByGyro_Linear extends LinearOpMode {
public double getSteeringCorrection(double desiredHeading, double proportionalGain) {
targetHeading = desiredHeading; // Save for telemetry
// Get the robot heading by applying an offset to the IMU heading
robotHeading = getRawHeading() - headingOffset;
// Determine the heading current error
headingError = targetHeading - robotHeading;
headingError = targetHeading - getHeading();
// Normalize the error to be within +/- 180 degrees
while (headingError > 180) headingError -= 360;
@ -366,7 +373,7 @@ public class RobotAutoDriveByGyro_Linear extends LinearOpMode {
}
/**
* This method takes separate drive (fwd/rev) and turn (right/left) requests,
* Take separate drive (fwd/rev) and turn (right/left) requests,
* combine them, and apply the appropriate speed commands to the left and right wheel motors.
* @param drive forward motor speed
* @param turn clockwise turning motor speed.
@ -406,26 +413,17 @@ public class RobotAutoDriveByGyro_Linear extends LinearOpMode {
telemetry.addData("Motion", "Turning");
}
telemetry.addData("Angle Target:Current", "%5.2f:%5.0f", targetHeading, robotHeading);
telemetry.addData("Error:Steer", "%5.1f:%5.1f", headingError, turnSpeed);
telemetry.addData("Wheel Speeds L:R.", "%5.2f : %5.2f", leftSpeed, rightSpeed);
telemetry.addData("Heading- Target : Current", "%5.2f : %5.0f", targetHeading, getHeading());
telemetry.addData("Error : Steer Pwr", "%5.1f : %5.1f", headingError, turnSpeed);
telemetry.addData("Wheel Speeds L : R", "%5.2f : %5.2f", leftSpeed, rightSpeed);
telemetry.update();
}
/**
* read the raw (un-offset Gyro heading) directly from the IMU
* read the Robot heading directly from the IMU (in degrees)
*/
public double getRawHeading() {
Orientation angles = imu.getAngularOrientation(AxesReference.INTRINSIC, AxesOrder.ZYX, AngleUnit.DEGREES);
return angles.firstAngle;
}
/**
* Reset the "offset" heading back to zero
*/
public void resetHeading() {
// Save a new heading offset equal to the current raw heading.
headingOffset = getRawHeading();
robotHeading = 0;
public double getHeading() {
YawPitchRollAngles orientation = imu.getRobotYawPitchRollAngles();
return orientation.getYaw(AngleUnit.DEGREES);
}
}
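The getSteeringCorrection()/getHeading() pair above is a plain proportional controller on a normalized heading error. The same logic condensed into a standalone helper is sketched below (the class and method names are hypothetical); AngleUnit.normalizeDegrees() performs the same +/-180 degree wrap as the sample's while loops.

```java
import com.qualcomm.robotcore.util.Range;
import org.firstinspires.ftc.robotcore.external.navigation.AngleUnit;

/** Hypothetical condensation of the sample's proportional heading controller. */
public class HeadingController {
    /** Turn power from desired/current headings (degrees) and a proportional gain. */
    public static double steeringCorrection(double desiredHeading, double currentHeading,
                                            double proportionalGain) {
        // Wrap the error into +/-180 degrees so the robot turns the short way around.
        double headingError = AngleUnit.normalizeDegrees(desiredHeading - currentHeading);
        // P controller: output proportional to error, clipped to full-scale power.
        return Range.clip(headingError * proportionalGain, -1.0, 1.0);
    }
}
```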

View File

@ -35,8 +35,8 @@ import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.util.ElapsedTime;
/**
* This file illustrates the concept of driving a path based on time.
/*
* This OpMode illustrates the concept of driving a path based on time.
* The code is structured as a LinearOpMode
*
* The code assumes that you do NOT have encoders on the wheels,
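For reference, the drive-by-time approach this header describes comes down to holding a power level until an ElapsedTime deadline passes. A minimal sketch, with assumed motor names and an illustrative 3-second leg:

```java
import com.qualcomm.robotcore.eventloop.opmode.Autonomous;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.util.ElapsedTime;

@Autonomous(name="Sketch: Drive By Time", group="Robot")
@Disabled
public class DriveByTimeSketch extends LinearOpMode {
    static final double FORWARD_SPEED = 0.6;   // illustrative power level

    @Override public void runOpMode() {
        DcMotor leftDrive  = hardwareMap.get(DcMotor.class, "left_drive");   // assumed names
        DcMotor rightDrive = hardwareMap.get(DcMotor.class, "right_drive");
        leftDrive.setDirection(DcMotor.Direction.REVERSE);
        rightDrive.setDirection(DcMotor.Direction.FORWARD);
        ElapsedTime runtime = new ElapsedTime();

        waitForStart();

        // No encoders: hold a power level until the clock runs out.
        runtime.reset();
        leftDrive.setPower(FORWARD_SPEED);
        rightDrive.setPower(FORWARD_SPEED);
        while (opModeIsActive() && (runtime.seconds() < 3.0)) {
            telemetry.addData("Path", "Leg 1: %4.1f S Elapsed", runtime.seconds());
            telemetry.update();
        }
        leftDrive.setPower(0);
        rightDrive.setPower(0);
    }
}
```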

View File

@ -0,0 +1,321 @@
/* Copyright (c) 2023 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.util.Range;
import org.firstinspires.ftc.robotcore.external.hardware.camera.BuiltinCameraDirection;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.firstinspires.ftc.robotcore.external.hardware.camera.controls.ExposureControl;
import org.firstinspires.ftc.robotcore.external.hardware.camera.controls.GainControl;
import org.firstinspires.ftc.vision.VisionPortal;
import org.firstinspires.ftc.vision.apriltag.AprilTagDetection;
import org.firstinspires.ftc.vision.apriltag.AprilTagProcessor;
import java.util.List;
import java.util.concurrent.TimeUnit;
/*
* This OpMode illustrates using a camera to locate and drive towards a specific AprilTag.
* The code assumes a Holonomic (Mecanum or X Drive) Robot.
*
* For an introduction to AprilTags, see the ftc-docs link below:
* https://ftc-docs.firstinspires.org/en/latest/apriltag/vision_portal/apriltag_intro/apriltag-intro.html
*
* When an AprilTag in the TagLibrary is detected, the SDK provides location and orientation of the tag, relative to the camera.
* This information is provided in the "ftcPose" member of the returned "detection", and is explained in the ftc-docs page linked below.
* https://ftc-docs.firstinspires.org/apriltag-detection-values
*
* The drive goal is to rotate to keep the Tag centered in the camera, while strafing to be directly in front of the tag, and
* driving towards the tag to achieve the desired distance.
* To reduce any motion blur (which will interrupt the detection process) the Camera exposure is reduced to a very low value (5mS)
* You can determine the best Exposure and Gain values by using the ConceptAprilTagOptimizeExposure OpMode in this Samples folder.
*
* The code assumes a Robot Configuration with motors named: leftfront_drive and rightfront_drive, leftback_drive and rightback_drive.
* The motor directions must be set so a positive power goes forward on all wheels.
* This sample assumes that the current game AprilTag Library (usually for the current season) is being loaded by default,
* so you should choose to approach a valid tag ID (usually starting at 0)
*
* Under manual control, the left stick will move forward/back & left/right. The right stick will rotate the robot.
* Manually drive the robot until it displays Target data on the Driver Station.
*
* Press and hold the *Left Bumper* to enable the automatic "Drive to target" mode.
* Release the Left Bumper to return to manual driving mode.
*
* Under "Drive To Target" mode, the robot has three goals:
* 1) Turn the robot to always keep the Tag centered on the camera frame. (Use the Target Bearing to turn the robot.)
* 2) Strafe the robot towards the centerline of the Tag, so it approaches directly in front of the tag. (Use the Target Yaw to strafe the robot)
* 3) Drive towards the Tag to get to the desired distance. (Use Tag Range to drive the robot forward/backward)
*
* Use DESIRED_DISTANCE to set how close you want the robot to get to the target.
* Speed and Turn sensitivity can be adjusted using the SPEED_GAIN, STRAFE_GAIN and TURN_GAIN constants.
*
* Use Android Studio to Copy this Class, and Paste it into the TeamCode/src/main/java/org/firstinspires/ftc/teamcode folder.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*
*/
@TeleOp(name="Omni Drive To AprilTag", group = "Concept")
@Disabled
public class RobotAutoDriveToAprilTagOmni extends LinearOpMode
{
// Adjust these numbers to suit your robot.
final double DESIRED_DISTANCE = 12.0; // this is how close the camera should get to the target (inches)
// Set the GAIN constants to control the relationship between the measured position error, and how much power is
// applied to the drive motors to correct the error.
// Drive = Error * Gain Make these values smaller for smoother control, or larger for a more aggressive response.
final double SPEED_GAIN = 0.02 ; // Forward Speed Control "Gain". eg: Ramp up to 50% power at a 25 inch error. (0.50 / 25.0)
final double STRAFE_GAIN = 0.015 ; // Strafe Speed Control "Gain". eg: Ramp up to 25% power at a 25 degree Yaw error. (0.25 / 25.0)
final double TURN_GAIN = 0.01 ; // Turn Control "Gain". eg: Ramp up to 25% power at a 25 degree error. (0.25 / 25.0)
final double MAX_AUTO_SPEED = 0.5; // Clip the approach speed to this max value (adjust for your robot)
final double MAX_AUTO_STRAFE= 0.5; // Clip the approach speed to this max value (adjust for your robot)
final double MAX_AUTO_TURN = 0.3; // Clip the turn speed to this max value (adjust for your robot)
private DcMotor leftFrontDrive = null; // Used to control the left front drive wheel
private DcMotor rightFrontDrive = null; // Used to control the right front drive wheel
private DcMotor leftBackDrive = null; // Used to control the left back drive wheel
private DcMotor rightBackDrive = null; // Used to control the right back drive wheel
private static final boolean USE_WEBCAM = true; // Set true to use a webcam, or false for a phone camera
private static final int DESIRED_TAG_ID = -1; // Choose the tag you want to approach or set to -1 for ANY tag.
private VisionPortal visionPortal; // Used to manage the video source.
private AprilTagProcessor aprilTag; // Used for managing the AprilTag detection process.
private AprilTagDetection desiredTag = null; // Used to hold the data for a detected AprilTag
@Override public void runOpMode()
{
boolean targetFound = false; // Set to true when an AprilTag target is detected
double drive = 0; // Desired forward power/speed (-1 to +1)
double strafe = 0; // Desired strafe power/speed (-1 to +1)
double turn = 0; // Desired turning power/speed (-1 to +1)
// Initialize the Apriltag Detection process
initAprilTag();
// Initialize the hardware variables. Note that the strings used here as parameters
// to 'get' must match the names assigned during the robot configuration
// step (using the FTC Robot Controller app on the phone).
leftFrontDrive = hardwareMap.get(DcMotor.class, "leftfront_drive");
rightFrontDrive = hardwareMap.get(DcMotor.class, "rightfront_drive");
leftBackDrive = hardwareMap.get(DcMotor.class, "leftback_drive");
rightBackDrive = hardwareMap.get(DcMotor.class, "rightback_drive");
// To drive forward, most robots need the motor on one side to be reversed, because the axles point in opposite directions.
// When run, this OpMode should start both motors driving forward. So adjust these two lines based on your first test drive.
// Note: The settings here assume direct drive on left and right wheels. Gear Reduction or 90 Deg drives may require direction flips
leftFrontDrive.setDirection(DcMotor.Direction.REVERSE);
leftBackDrive.setDirection(DcMotor.Direction.REVERSE);
rightFrontDrive.setDirection(DcMotor.Direction.FORWARD);
rightBackDrive.setDirection(DcMotor.Direction.FORWARD);
if (USE_WEBCAM)
setManualExposure(6, 250); // Use low exposure time to reduce motion blur
// Wait for driver to press start
telemetry.addData("Camera preview on/off", "3 dots, Camera Stream");
telemetry.addData(">", "Touch Play to start OpMode");
telemetry.update();
waitForStart();
while (opModeIsActive())
{
targetFound = false;
desiredTag = null;
// Step through the list of detected tags and look for a matching tag
List<AprilTagDetection> currentDetections = aprilTag.getDetections();
for (AprilTagDetection detection : currentDetections) {
// Look to see if we have size info on this tag.
if (detection.metadata != null) {
// Check to see if we want to track towards this tag.
if ((DESIRED_TAG_ID < 0) || (detection.id == DESIRED_TAG_ID)) {
// Yes, we want to use this tag.
targetFound = true;
desiredTag = detection;
break; // don't look any further.
} else {
// This tag is in the library, but we do not want to track it right now.
telemetry.addData("Skipping", "Tag ID %d is not desired", detection.id);
}
} else {
// This tag is NOT in the library, so we don't have enough information to track to it.
telemetry.addData("Unknown", "Tag ID %d is not in TagLibrary", detection.id);
}
}
// Tell the driver what we see, and what to do.
if (targetFound) {
telemetry.addData("\n>","HOLD Left-Bumper to Drive to Target\n");
telemetry.addData("Found", "ID %d (%s)", desiredTag.id, desiredTag.metadata.name);
telemetry.addData("Range", "%5.1f inches", desiredTag.ftcPose.range);
telemetry.addData("Bearing","%3.0f degrees", desiredTag.ftcPose.bearing);
telemetry.addData("Yaw","%3.0f degrees", desiredTag.ftcPose.yaw);
} else {
telemetry.addData("\n>","Drive using joysticks to find valid target\n");
}
// If Left Bumper is being pressed, AND we have found the desired target, Drive to target Automatically .
if (gamepad1.left_bumper && targetFound) {
// Determine heading, range and Yaw (tag image rotation) error so we can use them to control the robot automatically.
double rangeError = (desiredTag.ftcPose.range - DESIRED_DISTANCE);
double headingError = desiredTag.ftcPose.bearing;
double yawError = desiredTag.ftcPose.yaw;
// Use the speed and turn "gains" to calculate how we want the robot to move.
drive = Range.clip(rangeError * SPEED_GAIN, -MAX_AUTO_SPEED, MAX_AUTO_SPEED);
turn = Range.clip(headingError * TURN_GAIN, -MAX_AUTO_TURN, MAX_AUTO_TURN) ;
strafe = Range.clip(-yawError * STRAFE_GAIN, -MAX_AUTO_STRAFE, MAX_AUTO_STRAFE);
telemetry.addData("Auto","Drive %5.2f, Strafe %5.2f, Turn %5.2f ", drive, strafe, turn);
} else {
// drive using manual POV Joystick mode. Slow things down to make the robot more controllable.
drive = -gamepad1.left_stick_y / 2.0; // Reduce drive rate to 50%.
strafe = -gamepad1.left_stick_x / 2.0; // Reduce strafe rate to 50%.
turn = -gamepad1.right_stick_x / 3.0; // Reduce turn rate to 33%.
telemetry.addData("Manual","Drive %5.2f, Strafe %5.2f, Turn %5.2f ", drive, strafe, turn);
}
telemetry.update();
// Apply desired axes motions to the drivetrain.
moveRobot(drive, strafe, turn);
sleep(10);
}
}
/**
* Move robot according to desired axes motions
* <p>
* Positive X is forward
* <p>
* Positive Y is strafe left
* <p>
* Positive Yaw is counter-clockwise
*/
public void moveRobot(double x, double y, double yaw) {
// Calculate wheel powers.
double leftFrontPower = x -y -yaw;
double rightFrontPower = x +y +yaw;
double leftBackPower = x +y -yaw;
double rightBackPower = x -y +yaw;
// Normalize wheel powers to be less than 1.0
double max = Math.max(Math.abs(leftFrontPower), Math.abs(rightFrontPower));
max = Math.max(max, Math.abs(leftBackPower));
max = Math.max(max, Math.abs(rightBackPower));
if (max > 1.0) {
leftFrontPower /= max;
rightFrontPower /= max;
leftBackPower /= max;
rightBackPower /= max;
}
// Send powers to the wheels.
leftFrontDrive.setPower(leftFrontPower);
rightFrontDrive.setPower(rightFrontPower);
leftBackDrive.setPower(leftBackPower);
rightBackDrive.setPower(rightBackPower);
}
/**
* Initialize the AprilTag processor.
*/
private void initAprilTag() {
// Create the AprilTag processor by using a builder.
aprilTag = new AprilTagProcessor.Builder().build();
// Adjust Image Decimation to trade-off detection-range for detection-rate.
// eg: Some typical detection data using a Logitech C920 WebCam
// Decimation = 1 .. Detect 2" Tag from 10 feet away at 10 Frames per second
// Decimation = 2 .. Detect 2" Tag from 6 feet away at 22 Frames per second
// Decimation = 3 .. Detect 2" Tag from 4 feet away at 30 Frames Per Second
// Decimation = 3 .. Detect 5" Tag from 10 feet away at 30 Frames Per Second
// Note: Decimation can be changed on-the-fly to adapt during a match.
aprilTag.setDecimation(2);
// Create the vision portal by using a builder.
if (USE_WEBCAM) {
visionPortal = new VisionPortal.Builder()
.setCamera(hardwareMap.get(WebcamName.class, "Webcam 1"))
.addProcessor(aprilTag)
.build();
} else {
visionPortal = new VisionPortal.Builder()
.setCamera(BuiltinCameraDirection.BACK)
.addProcessor(aprilTag)
.build();
}
}
/*
Manually set the camera gain and exposure.
This can only be called AFTER calling initAprilTag(), and only works for Webcams.
*/
private void setManualExposure(int exposureMS, int gain) {
// Wait for the camera to be open, then use the controls
if (visionPortal == null) {
return;
}
// Make sure camera is streaming before we try to set the exposure controls
if (visionPortal.getCameraState() != VisionPortal.CameraState.STREAMING) {
telemetry.addData("Camera", "Waiting");
telemetry.update();
while (!isStopRequested() && (visionPortal.getCameraState() != VisionPortal.CameraState.STREAMING)) {
sleep(20);
}
telemetry.addData("Camera", "Ready");
telemetry.update();
}
// Set camera controls unless we are stopping.
if (!isStopRequested())
{
ExposureControl exposureControl = visionPortal.getCameraControl(ExposureControl.class);
if (exposureControl.getMode() != ExposureControl.Mode.Manual) {
exposureControl.setMode(ExposureControl.Mode.Manual);
sleep(50);
}
exposureControl.setExposure((long)exposureMS, TimeUnit.MILLISECONDS);
sleep(20);
GainControl gainControl = visionPortal.getCameraControl(GainControl.class);
gainControl.setGain(gain);
sleep(20);
}
}
}
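The mixing in moveRobot() is worth tracing with concrete numbers: any request whose sum exceeds 1.0 on some wheel is scaled down uniformly, so the ratios between wheels (and therefore the direction of travel) are preserved. A self-contained worked example with made-up requests:

```java
/** Worked example of the wheel-power mixing and normalization in moveRobot(). */
public class MixingExample {
    public static void main(String[] args) {
        double x = 0.6, y = 0.3, yaw = 0.4;   // illustrative drive/strafe/turn requests
        double lf = x - y - yaw;              // -0.1
        double rf = x + y + yaw;              //  1.3  (out of range)
        double lb = x + y - yaw;              //  0.5
        double rb = x - y + yaw;              //  0.7
        double max = Math.max(Math.max(Math.abs(lf), Math.abs(rf)),
                              Math.max(Math.abs(lb), Math.abs(rb)));
        if (max > 1.0) { lf /= max; rf /= max; lb /= max; rb /= max; }
        // After scaling, the largest magnitude is exactly 1.0 and the ratios between
        // wheels are unchanged.
        System.out.printf("LF %.2f  RF %.2f  LB %.2f  RB %.2f%n", lf, rf, lb, rb);
    }
}
```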

View File

@ -0,0 +1,298 @@
/* Copyright (c) 2023 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.util.Range;
import org.firstinspires.ftc.robotcore.external.hardware.camera.BuiltinCameraDirection;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.firstinspires.ftc.robotcore.external.hardware.camera.controls.ExposureControl;
import org.firstinspires.ftc.robotcore.external.hardware.camera.controls.GainControl;
import org.firstinspires.ftc.vision.VisionPortal;
import org.firstinspires.ftc.vision.apriltag.AprilTagDetection;
import org.firstinspires.ftc.vision.apriltag.AprilTagProcessor;
import java.util.List;
import java.util.concurrent.TimeUnit;
/*
* This OpMode illustrates using a camera to locate and drive towards a specific AprilTag.
* The code assumes a basic two-wheel (Tank) Robot Drivetrain
*
* For an introduction to AprilTags, see the ftc-docs link below:
* https://ftc-docs.firstinspires.org/en/latest/apriltag/vision_portal/apriltag_intro/apriltag-intro.html
*
* When an AprilTag in the TagLibrary is detected, the SDK provides location and orientation of the tag, relative to the camera.
* This information is provided in the "ftcPose" member of the returned "detection", and is explained in the ftc-docs page linked below.
* https://ftc-docs.firstinspires.org/apriltag-detection-values
*
* The driving goal is to rotate to keep the tag centered in the camera, while driving towards the tag to achieve the desired distance.
* To reduce any motion blur (which will interrupt the detection process) the Camera exposure is reduced to a very low value (5mS)
* You can determine the best exposure and gain values by using the ConceptAprilTagOptimizeExposure OpMode in this Samples folder.
*
* The code assumes a Robot Configuration with motors named left_drive and right_drive.
* The motor directions must be set so a positive power goes forward on both wheels;
* This sample assumes that the default AprilTag Library (usually for the current season) is being loaded by default
* so you should choose to approach a valid tag ID (usually starting at 0)
*
* Under manual control, the left stick will move forward/back, and the right stick will rotate the robot.
* This is called POV Joystick mode, different from Tank Drive (where each joystick controls a wheel).
*
* Manually drive the robot until it displays Target data on the Driver Station.
* Press and hold the *Left Bumper* to enable the automatic "Drive to target" mode.
* Release the Left Bumper to return to manual driving mode.
*
* Under "Drive To Target" mode, the robot has two goals:
* 1) Turn the robot to always keep the Tag centered on the camera frame. (Use the Target Bearing to turn the robot.)
* 2) Drive towards the Tag to get to the desired distance. (Use Tag Range to drive the robot forward/backward)
*
* Use DESIRED_DISTANCE to set how close you want the robot to get to the target.
* Speed and Turn sensitivity can be adjusted using the SPEED_GAIN and TURN_GAIN constants.
*
* Use Android Studio to Copy this Class, and Paste it into the TeamCode/src/main/java/org/firstinspires/ftc/teamcode folder.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*
*/
@TeleOp(name="Tank Drive To AprilTag", group = "Concept")
@Disabled
public class RobotAutoDriveToAprilTagTank extends LinearOpMode
{
// Adjust these numbers to suit your robot.
final double DESIRED_DISTANCE = 12.0; // this is how close the camera should get to the target (inches)
// Set the GAIN constants to control the relationship between the measured position error, and how much power is
// applied to the drive motors to correct the error.
// Drive = Error * Gain Make these values smaller for smoother control, or larger for a more aggressive response.
final double SPEED_GAIN = 0.02 ; // Speed Control "Gain". eg: Ramp up to 50% power at a 25 inch error. (0.50 / 25.0)
final double TURN_GAIN = 0.01 ; // Turn Control "Gain". eg: Ramp up to 25% power at a 25 degree error. (0.25 / 25.0)
final double MAX_AUTO_SPEED = 0.5; // Clip the approach speed to this max value (adjust for your robot)
final double MAX_AUTO_TURN = 0.25; // Clip the turn speed to this max value (adjust for your robot)
private DcMotor leftDrive = null; // Used to control the left drive wheel
private DcMotor rightDrive = null; // Used to control the right drive wheel
private static final boolean USE_WEBCAM = true; // Set true to use a webcam, or false for a phone camera
private static final int DESIRED_TAG_ID = -1; // Choose the tag you want to approach or set to -1 for ANY tag.
private VisionPortal visionPortal; // Used to manage the video source.
private AprilTagProcessor aprilTag; // Used for managing the AprilTag detection process.
private AprilTagDetection desiredTag = null; // Used to hold the data for a detected AprilTag
@Override public void runOpMode()
{
boolean targetFound = false; // Set to true when an AprilTag target is detected
double drive = 0; // Desired forward power/speed (-1 to +1) +ve is forward
double turn = 0; // Desired turning power/speed (-1 to +1) +ve is CounterClockwise
// Initialize the Apriltag Detection process
initAprilTag();
// Initialize the hardware variables. Note that the strings used here as parameters
// to 'get' must match the names assigned during the robot configuration
// step (using the FTC Robot Controller app on the phone).
leftDrive = hardwareMap.get(DcMotor.class, "left_drive");
rightDrive = hardwareMap.get(DcMotor.class, "right_drive");
// To drive forward, most robots need the motor on one side to be reversed because the axles point in opposite directions.
// When run, this OpMode should start both motors driving forward. So adjust these two lines based on your first test drive.
// Note: The settings here assume direct drive on left and right wheels. Single Gear Reduction or 90 Deg drives may require direction flips
leftDrive.setDirection(DcMotor.Direction.REVERSE);
rightDrive.setDirection(DcMotor.Direction.FORWARD);
if (USE_WEBCAM)
setManualExposure(6, 250); // Use low exposure time to reduce motion blur
// Wait for the driver to press Start
telemetry.addData("Camera preview on/off", "3 dots, Camera Stream");
telemetry.addData(">", "Touch Play to start OpMode");
telemetry.update();
waitForStart();
while (opModeIsActive())
{
targetFound = false;
desiredTag = null;
// Step through the list of detected tags and look for a matching tag
List<AprilTagDetection> currentDetections = aprilTag.getDetections();
for (AprilTagDetection detection : currentDetections) {
// Look to see if we have size info on this tag.
if (detection.metadata != null) {
// Check to see if we want to track towards this tag.
if ((DESIRED_TAG_ID < 0) || (detection.id == DESIRED_TAG_ID)) {
// Yes, we want to use this tag.
targetFound = true;
desiredTag = detection;
break; // don't look any further.
} else {
// This tag is in the library, but we do not want to track it right now.
telemetry.addData("Skipping", "Tag ID %d is not desired", detection.id);
}
} else {
// This tag is NOT in the library, so we don't have enough information to track to it.
telemetry.addData("Unknown", "Tag ID %d is not in TagLibrary", detection.id);
}
}
// Tell the driver what we see, and what to do.
if (targetFound) {
telemetry.addData("\n>","HOLD Left-Bumper to Drive to Target\n");
telemetry.addData("Found", "ID %d (%s)", desiredTag.id, desiredTag.metadata.name);
telemetry.addData("Range", "%5.1f inches", desiredTag.ftcPose.range);
telemetry.addData("Bearing","%3.0f degrees", desiredTag.ftcPose.bearing);
} else {
telemetry.addData("\n>","Drive using joysticks to find valid target\n");
}
// If Left Bumper is being pressed, AND we have found the desired target, Drive to target Automatically .
if (gamepad1.left_bumper && targetFound) {
// Determine heading and range error so we can use them to control the robot automatically.
double rangeError = (desiredTag.ftcPose.range - DESIRED_DISTANCE);
double headingError = desiredTag.ftcPose.bearing;
// Use the speed and turn "gains" to calculate how we want the robot to move. Clip it to the maximum
drive = Range.clip(rangeError * SPEED_GAIN, -MAX_AUTO_SPEED, MAX_AUTO_SPEED);
turn = Range.clip(headingError * TURN_GAIN, -MAX_AUTO_TURN, MAX_AUTO_TURN) ;
telemetry.addData("Auto","Drive %5.2f, Turn %5.2f", drive, turn);
} else {
// drive using manual POV Joystick mode.
drive = -gamepad1.left_stick_y / 2.0; // Reduce drive rate to 50%.
turn = -gamepad1.right_stick_x / 4.0; // Reduce turn rate to 25%.
telemetry.addData("Manual","Drive %5.2f, Turn %5.2f", drive, turn);
}
telemetry.update();
// Apply desired axes motions to the drivetrain.
moveRobot(drive, turn);
sleep(10);
}
}
/**
* Move robot according to desired axes motions
* <p>
* Positive X is forward
* <p>
* Positive Yaw is counter-clockwise
*/
public void moveRobot(double x, double yaw) {
// Calculate left and right wheel powers.
double leftPower = x - yaw;
double rightPower = x + yaw;
// Normalize wheel powers to be less than 1.0
double max = Math.max(Math.abs(leftPower), Math.abs(rightPower));
if (max > 1.0) {
leftPower /= max;
rightPower /= max;
}
// Send powers to the wheels.
leftDrive.setPower(leftPower);
rightDrive.setPower(rightPower);
}
/**
* Initialize the AprilTag processor.
*/
private void initAprilTag() {
// Create the AprilTag processor by using a builder.
aprilTag = new AprilTagProcessor.Builder().build();
// Adjust Image Decimation to trade-off detection-range for detection-rate.
// eg: Some typical detection data using a Logitech C920 WebCam
// Decimation = 1 .. Detect 2" Tag from 10 feet away at 10 Frames per second
// Decimation = 2 .. Detect 2" Tag from 6 feet away at 22 Frames per second
// Decimation = 3 .. Detect 2" Tag from 4 feet away at 30 Frames Per Second
// Decimation = 3 .. Detect 5" Tag from 10 feet away at 30 Frames Per Second
// Note: Decimation can be changed on-the-fly to adapt during a match.
aprilTag.setDecimation(2);
// Create the vision portal by using a builder.
if (USE_WEBCAM) {
visionPortal = new VisionPortal.Builder()
.setCamera(hardwareMap.get(WebcamName.class, "Webcam 1"))
.addProcessor(aprilTag)
.build();
} else {
visionPortal = new VisionPortal.Builder()
.setCamera(BuiltinCameraDirection.BACK)
.addProcessor(aprilTag)
.build();
}
}
/*
Manually set the camera gain and exposure.
This can only be called AFTER calling initAprilTag(), and only works for Webcams.
*/
private void setManualExposure(int exposureMS, int gain) {
// Wait for the camera to be open, then use the controls
if (visionPortal == null) {
return;
}
// Make sure camera is streaming before we try to set the exposure controls
if (visionPortal.getCameraState() != VisionPortal.CameraState.STREAMING) {
telemetry.addData("Camera", "Waiting");
telemetry.update();
while (!isStopRequested() && (visionPortal.getCameraState() != VisionPortal.CameraState.STREAMING)) {
sleep(20);
}
telemetry.addData("Camera", "Ready");
telemetry.update();
}
// Set camera controls unless we are stopping.
if (!isStopRequested())
{
ExposureControl exposureControl = visionPortal.getCameraControl(ExposureControl.class);
if (exposureControl.getMode() != ExposureControl.Mode.Manual) {
exposureControl.setMode(ExposureControl.Mode.Manual);
sleep(50);
}
exposureControl.setExposure((long)exposureMS, TimeUnit.MILLISECONDS);
sleep(20);
GainControl gainControl = visionPortal.getCameraControl(GainControl.class);
gainControl.setGain(gain);
sleep(20);
telemetry.addData("Camera", "Ready");
telemetry.update();
}
}
}
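The "Drive To Target" branch above is two proportional loops: range error scaled by SPEED_GAIN and bearing error scaled by TURN_GAIN, each clipped to a maximum. A worked example with invented detection values; the local clip() mirrors the SDK's Range.clip so the example runs on its own.

```java
/** Worked numbers for the two P loops in "Drive To Target" mode (inputs invented). */
public class GainExample {
    public static void main(String[] args) {
        double SPEED_GAIN = 0.02, TURN_GAIN = 0.01;          // gains from the sample
        double MAX_AUTO_SPEED = 0.5, MAX_AUTO_TURN = 0.25;   // clip limits from the sample
        double rangeError   = 18.0 - 12.0;   // tag 18" away, DESIRED_DISTANCE is 12"
        double headingError = -7.0;          // tag sits 7 degrees right of center
        double drive = clip(rangeError * SPEED_GAIN, -MAX_AUTO_SPEED, MAX_AUTO_SPEED);
        double turn  = clip(headingError * TURN_GAIN, -MAX_AUTO_TURN, MAX_AUTO_TURN);
        System.out.printf("drive %.2f, turn %.2f%n", drive, turn);  // 0.12 fwd, -0.07 turn
    }
    static double clip(double v, double lo, double hi) { return Math.max(lo, Math.min(hi, v)); }
}
```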

View File

@ -32,15 +32,13 @@ package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Autonomous;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.hardware.ColorSensor;
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.hardware.LightSensor;
import com.qualcomm.robotcore.hardware.NormalizedColorSensor;
import com.qualcomm.robotcore.hardware.NormalizedRGBA;
import com.qualcomm.robotcore.hardware.SwitchableLight;
/**
* This file illustrates the concept of driving up to a line and then stopping.
/*
* This OpMode illustrates the concept of driving up to a line and then stopping.
* The code is structured as a LinearOpMode
*
* The Sensor used here can be a REV Color Sensor V2 or V3. Make sure the white LED is turned on.
@ -70,7 +68,7 @@ public class RobotAutoDriveToLine_Linear extends LinearOpMode {
private DcMotor leftDrive = null;
private DcMotor rightDrive = null;
/** The colorSensor field will contain a reference to our color sensor hardware object */
/** The variable to store a reference to our color sensor hardware object */
NormalizedColorSensor colorSensor;
static final double WHITE_THRESHOLD = 0.5; // spans between 0.0 - 1.0 from dark to light
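Putting the pieces above together, the drive-to-line loop is: creep forward, poll the normalized light level, stop at the threshold. A sketch assuming a LinearOpMode in which leftDrive, rightDrive, colorSensor, and an APPROACH_SPEED constant are already defined; NormalizedRGBA.alpha is the overall light level, pre-scaled to 0.0-1.0.

```java
// Sketch only: hardware fields and APPROACH_SPEED are assumed to exist in the OpMode.
void driveToLine() {
    leftDrive.setPower(APPROACH_SPEED);
    rightDrive.setPower(APPROACH_SPEED);
    while (opModeIsActive()
            && colorSensor.getNormalizedColors().alpha < WHITE_THRESHOLD) {
        telemetry.addData("Light Level (0 to 1)", "%4.2f",
                colorSensor.getNormalizedColors().alpha);
        telemetry.update();
    }
    leftDrive.setPower(0);    // stop the instant the threshold is crossed
    rightDrive.setPower(0);
}
```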

View File

@ -34,7 +34,7 @@ import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.hardware.Servo;
import com.qualcomm.robotcore.util.Range;
/**
/*
* This file works in conjunction with the External Hardware Class sample called: ConceptExternalHardwareClass.java
* Please read the explanations in that Sample about how to use this class definition.
*
@ -48,8 +48,8 @@ import com.qualcomm.robotcore.util.Range;
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with *exactly the same name*.
*
* Or.. In OnBot Java, add a new file named RobotHardware.java, drawing from this Sample; select Not an OpMode.
* Also add a new OpMode, drawing from the Sample ConceptExternalHardwareClass.java; select TeleOp.
* Or... In OnBot Java, add a new file named RobotHardware.java, select this sample, and select Not an OpMode.
* Also add a new OpMode, select the sample ConceptExternalHardwareClass.java, and select TeleOp.
*
*/
@ -79,7 +79,7 @@ public class RobotHardware {
/**
* Initialize all the robot's hardware.
* This method must be called ONCE when the OpMode is initialized.
*
* <p>
* All of the hardware devices are accessed via the hardware map, and initialized.
*/
public void init() {
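The pattern this file demonstrates is worth restating in miniature: the OpMode hands itself to the hardware class, which does all the hardwareMap lookups in init(). A condensed sketch; the class name and device names are assumptions, not the sample's own.

```java
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.hardware.DcMotor;

/** Hypothetical condensation of the shared-hardware-class pattern. */
public class MinimalRobotHardware {
    private final LinearOpMode myOpMode;   // grants access to hardwareMap and telemetry
    private DcMotor leftDrive;
    private DcMotor rightDrive;

    public MinimalRobotHardware(LinearOpMode opMode) { myOpMode = opMode; }

    /** Call ONCE from the OpMode's init phase. */
    public void init() {
        leftDrive  = myOpMode.hardwareMap.get(DcMotor.class, "left_drive");   // assumed names
        rightDrive = myOpMode.hardwareMap.get(DcMotor.class, "right_drive");
        leftDrive.setDirection(DcMotor.Direction.REVERSE);
        rightDrive.setDirection(DcMotor.Direction.FORWARD);
    }

    /** Drive helper every OpMode on the robot can share. */
    public void setDrivePower(double left, double right) {
        leftDrive.setPower(left);
        rightDrive.setPower(right);
    }
}
```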

View File

@ -36,8 +36,8 @@ import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.hardware.Servo;
import com.qualcomm.robotcore.util.Range;
/**
* This particular OpMode executes a POV Game style Teleop for a direct drive robot
/*
* This OpMode executes a POV Game style Teleop for a direct drive robot
* The code is structured as a LinearOpMode
*
* In this mode the left stick moves the robot FWD and back, the Right stick turns left and right.
@ -45,7 +45,7 @@ import com.qualcomm.robotcore.util.Range;
* It also opens and closes the claws slowly using the left and right Bumper buttons.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this opmode to the Driver Station OpMode list
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*/
@TeleOp(name="Robot: Teleop POV", group="Robot")

View File

@ -36,8 +36,8 @@ import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.hardware.Servo;
import com.qualcomm.robotcore.util.Range;
/**
* This particular OpMode executes a Tank Drive control TeleOp a direct drive robot
/*
* This OpMode executes a Tank Drive control TeleOp for a direct drive robot
* The code is structured as an Iterative OpMode
*
* In this mode, the left and right joysticks control the left and right motors respectively.
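Tank mode needs no mixing at all; inside an Iterative OpMode's loop() it can be as simple as the sketch below (motor names assumed):

```java
// Inside loop(); motor names assumed. The Y axis reads negative when pushed forward.
leftDrive.setPower(-gamepad1.left_stick_y);
rightDrive.setPower(-gamepad1.right_stick_y);
```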

View File

@ -33,14 +33,13 @@ import com.qualcomm.hardware.rev.RevBlinkinLedDriver;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.OpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import org.firstinspires.ftc.robotcore.external.Telemetry;
import org.firstinspires.ftc.robotcore.internal.system.Deadline;
import java.util.concurrent.TimeUnit;
/*
* Display patterns of a REV Robotics Blinkin LED Driver.
* This OpMode demonstrates use of the REV Robotics Blinkin LED Driver.
* AUTO mode cycles through all of the patterns.
* MANUAL mode allows the user to manually change patterns using the
* left and right bumpers of a gamepad.
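At its core the demo reduces to two calls, shown below as a sketch; the configuration name "blinkin" is an assumption.

```java
// Sketch: map the driver and set a pattern. The configuration name is an assumption.
RevBlinkinLedDriver blinkinLedDriver =
        hardwareMap.get(RevBlinkinLedDriver.class, "blinkin");
blinkinLedDriver.setPattern(RevBlinkinLedDriver.BlinkinPattern.RAINBOW_RAINBOW_PALETTE);
```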

View File

@ -34,7 +34,6 @@ import com.qualcomm.hardware.bosch.JustLoggingAccelerationIntegrator;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import org.firstinspires.ftc.robotcore.external.Func;
import org.firstinspires.ftc.robotcore.external.navigation.Acceleration;
import org.firstinspires.ftc.robotcore.external.navigation.AngleUnit;
@ -46,19 +45,19 @@ import org.firstinspires.ftc.robotcore.external.navigation.Velocity;
import java.util.Locale;
/**
* {@link SensorBNO055IMU} gives a short demo on how to use the BNO055 Inertial Motion Unit (IMU) from AdaFruit.
/*
* This OpMode gives a short demo on how to use the BNO055 Inertial Motion Unit (IMU) from AdaFruit.
*
* Note: this is a Legacy example that will not work with newer Control/Expansion Hubs that use a different IMU
* Please use the new SensorIMUOrthogonal or SensorIMUNonOrthogonal samples for a more universal IMU interface.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this opmode to the Driver Station OpMode list
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*
* @see <a href="http://www.adafruit.com/products/2472">Adafruit IMU</a>
*/
@TeleOp(name = "Sensor: BNO055 IMU", group = "Sensor")
@Disabled // Comment this out to add to the opmode list
@Disabled // Comment this out to add to the OpMode list
public class SensorBNO055IMU extends LinearOpMode
{
//----------------------------------------------------------------------------------------------
@ -84,7 +83,7 @@ public class SensorBNO055IMU extends LinearOpMode
BNO055IMU.Parameters parameters = new BNO055IMU.Parameters();
parameters.angleUnit = BNO055IMU.AngleUnit.DEGREES;
parameters.accelUnit = BNO055IMU.AccelUnit.METERS_PERSEC_PERSEC;
parameters.calibrationDataFile = "BNO055IMUCalibration.json"; // see the calibration sample opmode
parameters.calibrationDataFile = "BNO055IMUCalibration.json"; // see the calibration sample OpMode
parameters.loggingEnabled = true;
parameters.loggingTag = "IMU";
parameters.accelerationIntegrationAlgorithm = new JustLoggingAccelerationIntegrator();

View File

@ -29,13 +29,11 @@
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.hardware.adafruit.AdafruitBNO055IMU;
import com.qualcomm.hardware.bosch.BNO055IMU;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.util.ReadWriteFile;
import org.firstinspires.ftc.robotcore.external.Func;
import org.firstinspires.ftc.robotcore.external.navigation.AngleUnit;
import org.firstinspires.ftc.robotcore.external.navigation.AxesOrder;
@ -46,63 +44,60 @@ import org.firstinspires.ftc.robotcore.internal.system.AppUtil;
import java.io.File;
import java.util.Locale;
/**
* {@link SensorBNO055IMUCalibration} calibrates the IMU accelerometer per
/*
* This OpMode calibrates a BNO055 IMU per
* "Section 3.11 Calibration" of the BNO055 specification.
*
* Note: this is a Legacy example that will not work with newer Control/Expansion Hubs that use a different IMU
* Please use the new SensorIMUOrthogonal or SensorIMUNonOrthogonal samples for a more universal IMU interface.
*
* <p>Manual calibration of the IMU is definitely NOT necessary: except for the magnetometer
* (which is not used by the default {@link BNO055IMU.SensorMode#IMU
* SensorMode#IMU}), the BNO055 is internally self-calibrating and thus can be very successfully
* used without manual intervention. That said, performing a one-time calibration, saving the
* results persistently, then loading them again at each run can help reduce the time that automatic
* calibration requires.</p>
* Manual calibration of the IMU is definitely NOT necessary: except for the magnetometer (which is not used by the
* default "IMU" SensorMode), the BNO055 is internally self-calibrating and thus can be very successfully used without
* manual intervention. That said, performing a one-time calibration, saving the results persistently, then loading them
* again at each run can help reduce the time that automatic calibration requires.
*
* <p>This summary of the calibration process, from <a href="http://iotdk.intel.com/docs/master/upm/classupm_1_1_b_n_o055.html">
* Intel</a>, is informative:</p>
* This summary of the calibration process from Intel is informative:
* http://iotdk.intel.com/docs/master/upm/classupm_1_1_b_n_o055.html
*
* <p>"This device requires calibration in order to operate accurately. [...] Calibration data is
* "This device requires calibration in order to operate accurately. [...] Calibration data is
* lost on a power cycle. See one of the examples for a description of how to calibrate the device,
* but in essence:</p>
* but in essence:
*
* <p>There is a calibration status register available [...] that returns the calibration status
* There is a calibration status register available [...] that returns the calibration status
* of the accelerometer (ACC), magnetometer (MAG), gyroscope (GYR), and overall system (SYS).
* Each of these values range from 0 (uncalibrated) to 3 (fully calibrated). Calibration [ideally]
* involves certain motions to get all 4 values at 3. The motions are as follows (though see the
* datasheet for more information):</p>
* datasheet for more information):
*
* <li>
* <ol>GYR: Simply let the sensor sit flat for a few seconds.</ol>
* <ol>ACC: Move the sensor in various positions. Start flat, then rotate slowly by 45
* 1. GYR: Simply let the sensor sit flat for a few seconds.</ol>
* 2. ACC: Move the sensor in various positions. Start flat, then rotate slowly by 45
* degrees, hold for a few seconds, then continue rotating another 45 degrees and
* hold, etc. 6 or more movements of this type may be required. You can move through
* any axis you desire, but make sure that the device is lying at least once
* perpendicular to the x, y, and z axis.</ol>
* <ol>MAG: Move slowly in a figure 8 pattern in the air, until the calibration values reaches 3.</ol>
* <ol>SYS: This will usually reach 3 when the other items have also reached 3. If not, continue
* 3. MAG: Move slowly in a figure 8 pattern in the air, until the calibration values reaches 3.</ol>
* 4. SYS: This will usually reach 3 when the other items have also reached 3. If not, continue
* slowly moving the device though various axes until it does."</ol>
* </li>
*
* <p>To calibrate the IMU, run this sample opmode with a gamepad attached to the driver station.
* To calibrate the IMU, run this sample OpMode with a gamepad attached to the driver station.
* Once the IMU has reached sufficient calibration as reported on telemetry, press the 'A'
* button on the gamepad to write the calibration to a file. That file can then be indicated
* later when running an opmode which uses the IMU.</p>
* later when running an OpMode which uses the IMU.
*
* <p>Note: if your intended uses of the IMU do not include use of all its sensors (for exmaple,
* Note: if your intended uses of the IMU do not include use of all its sensors (for example,
* you might not use the magnetometer), then it makes little sense for you to wait for full
* calibration of the sensors you are not using before saving the calibration data. Indeed,
* it appears that in a SensorMode that doesn't use the magnetometer (for example), the
* magnetometer cannot actually be calibrated.</p>
* magnetometer cannot actually be calibrated.
*
* @see AdafruitBNO055IMU
* @see BNO055IMU.Parameters#calibrationDataFile
* @see <a href="https://www.bosch-sensortec.com/bst/products/all_products/bno055">BNO055 product page</a>
* @see <a href="https://ae-bst.resource.bosch.com/media/_tech/media/datasheets/BST_BNO055_DS000_14.pdf">BNO055 specification</a>
* References:
* The AdafruitBNO055IMU Javadoc
* The BNO055IMU.Parameters.calibrationDataFile Javadoc
* The BNO055 product page: https://www.bosch-sensortec.com/bst/products/all_products/bno055
* The BNO055 datasheet: https://www.bosch-sensortec.com/media/boschsensortec/downloads/datasheets/bst-bno055-ds000.pdf
*/
@TeleOp(name = "Sensor: BNO055 IMU Calibration", group = "Sensor")
@Disabled // Comment this out to add to the opmode list
@Disabled // Comment this out to add to the OpMode list
public class SensorBNO055IMUCalibration extends LinearOpMode
{
//----------------------------------------------------------------------------------------------
@ -125,7 +120,7 @@ public class SensorBNO055IMUCalibration extends LinearOpMode
telemetry.log().add("");
telemetry.log().add("Please refer to the calibration instructions");
telemetry.log().add("contained in the Adafruit IMU calibration");
telemetry.log().add("sample opmode.");
telemetry.log().add("sample OpMode.");
telemetry.log().add("");
telemetry.log().add("When sufficient calibration has been reached,");
telemetry.log().add("press the 'A' button to write the current");
@ -159,7 +154,7 @@ public class SensorBNO055IMUCalibration extends LinearOpMode
// Save the calibration data to a file. You can choose whatever file
// name you wish here, but you'll want to indicate the same file name
// when you initialize the IMU in an opmode in which it is used. If you
// when you initialize the IMU in an OpMode in which it is used. If you
// have more than one IMU on your robot, you'll of course want to use
// different configuration file names for each.
String filename = "AdafruitIMUCalibration.json";
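The save-then-reuse flow the comments describe, condensed below as a sketch; these calls match the sample's own imports, and `imu` and `filename` are the fields defined above.

```java
// Write the current calibration out once...
BNO055IMU.CalibrationData calibrationData = imu.readCalibrationData();
File file = AppUtil.getInstance().getSettingsFile(filename);
ReadWriteFile.writeFile(file, calibrationData.serialize());

// ...then, in any OpMode that uses this IMU, point the parameters at the same file
// so initialize() loads it instead of starting from scratch.
BNO055IMU.Parameters parameters = new BNO055IMU.Parameters();
parameters.calibrationDataFile = filename;
```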

View File

@ -32,7 +32,6 @@ package org.firstinspires.ftc.robotcontroller.external.samples;
import android.app.Activity;
import android.graphics.Color;
import android.view.View;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
@ -40,11 +39,10 @@ import com.qualcomm.robotcore.hardware.DistanceSensor;
import com.qualcomm.robotcore.hardware.NormalizedColorSensor;
import com.qualcomm.robotcore.hardware.NormalizedRGBA;
import com.qualcomm.robotcore.hardware.SwitchableLight;
import org.firstinspires.ftc.robotcore.external.navigation.DistanceUnit;
/**
* This is an example LinearOpMode that shows how to use a color sensor in a generic
/*
* This OpMode shows how to use a color sensor in a generic
* way, regardless of which particular make or model of color sensor is used. The OpMode
* assumes that the color sensor is configured with a name of "sensor_color".
*
@ -78,7 +76,7 @@ public class SensorColor extends LinearOpMode {
* robot. Note that you won't see anything change on the Driver Station, only on the Robot Controller. */
View relativeLayout;
/**
/*
* The runOpMode() method is the root of this OpMode, as it is in all LinearOpModes.
* Our implementation here, though is a bit unusual: we've decided to put all the actual work
* in the runSample() method rather than directly in runOpMode() itself. The reason we do that is
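The make-agnostic reading this sample is built around is itself short. A sketch assuming the "sensor_color" configuration name the comment mentions, inside a LinearOpMode; NormalizedRGBA values are pre-scaled to 0.0-1.0.

```java
// Sketch: works for any supported color sensor configured as "sensor_color".
NormalizedColorSensor colorSensor =
        hardwareMap.get(NormalizedColorSensor.class, "sensor_color");
if (colorSensor instanceof SwitchableLight) {
    ((SwitchableLight) colorSensor).enableLight(true);   // turn the LED on when supported
}
NormalizedRGBA colors = colorSensor.getNormalizedColors();
float[] hsv = new float[3];
Color.colorToHSV(colors.toColor(), hsv);                 // android.graphics.Color
telemetry.addData("RGB", "%.3f %.3f %.3f", colors.red, colors.green, colors.blue);
telemetry.addData("Hue", "%.0f", hsv[0]);
```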

View File

@ -1,4 +1,4 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
/* Copyright (c) 2024 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
@ -35,51 +35,41 @@ import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.DigitalChannel;
/*
* This is an example LinearOpMode that shows how to use
* a REV Robotics Touch Sensor.
* This OpMode demonstrates how to use a digital channel.
*
* It assumes that the touch sensor is configured with a name of "sensor_digital".
* The OpMode assumes that the digital channel is configured with a name of "digitalTouch".
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this opmode to the Driver Station OpMode list.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*/
@TeleOp(name = "Sensor: Digital touch", group = "Sensor")
@TeleOp(name = "Sensor: digital channel", group = "Sensor")
@Disabled
public class SensorDigitalTouch extends LinearOpMode {
/**
* The REV Robotics Touch Sensor
* is treated as a digital channel. It is HIGH if the button is unpressed.
* It pulls LOW if the button is pressed.
*
* Also, when you connect a REV Robotics Touch Sensor to the digital I/O port on the
* Expansion Hub using a 4-wire JST cable, the second pin gets connected to the Touch Sensor.
* The lower (first) pin stays unconnected.
*/
DigitalChannel digitalTouch; // Hardware Device Object
DigitalChannel digitalTouch; // Digital channel Object
@Override
public void runOpMode() {
// get a reference to our digitalTouch object.
digitalTouch = hardwareMap.get(DigitalChannel.class, "sensor_digital");
// get a reference to our touchSensor object.
digitalTouch = hardwareMap.get(DigitalChannel.class, "digitalTouch");
// set the digital channel to input.
digitalTouch.setMode(DigitalChannel.Mode.INPUT);
telemetry.addData("DigitalTouchSensorExample", "Press start to continue...");
telemetry.update();
// wait for the start button to be pressed.
waitForStart();
// while the op mode is active, loop and read the light levels.
// while the OpMode is active, loop and read the digital channel.
// Note we use opModeIsActive() as our loop condition because it is an interruptible method.
while (opModeIsActive()) {
// button is pressed if value returned is LOW or false.
// send the info back to driver station using telemetry function.
// if the digital channel returns true it's HIGH and the button is unpressed.
if (digitalTouch.getState() == true) {
telemetry.addData("Digital Touch", "Is Not Pressed");
if (digitalTouch.getState() == false) {
telemetry.addData("Button", "PRESSED");
} else {
telemetry.addData("Digital Touch", "Is Pressed");
telemetry.addData("Button", "NOT PRESSED");
}
telemetry.update();
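Putting the new side of this diff together, the updated sample reduces to roughly the following sketch (inside runOpMode(), assuming a channel configured as "digitalTouch"):

    DigitalChannel digitalTouch = hardwareMap.get(DigitalChannel.class, "digitalTouch");
    digitalTouch.setMode(DigitalChannel.Mode.INPUT);  // read the pin rather than drive it
    waitForStart();
    while (opModeIsActive()) {
        // The REV Touch Sensor is active-low: getState() == false means pressed.
        telemetry.addData("Button", digitalTouch.getState() ? "NOT PRESSED" : "PRESSED");
        telemetry.update();
    }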

View File

@ -0,0 +1,149 @@
/*
Copyright (c) 2023 FIRST
All rights reserved.
Redistribution and use in source and binary forms, with or without modification,
are permitted (subject to the limitations in the disclaimer below) provided that
the following conditions are met:
Redistributions of source code must retain the above copyright notice, this list
of conditions and the following disclaimer.
Redistributions in binary form must reproduce the above copyright notice, this
list of conditions and the following disclaimer in the documentation and/or
other materials provided with the distribution.
Neither the name of FIRST nor the names of its contributors may be used to
endorse or promote products derived from this software without specific prior
written permission.
NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR
TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF
THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.hardware.dfrobot.HuskyLens;
import com.qualcomm.hardware.rev.Rev2mDistanceSensor;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.DistanceSensor;
import org.firstinspires.ftc.robotcore.external.navigation.DistanceUnit;
import org.firstinspires.ftc.robotcore.internal.system.Deadline;
import java.util.concurrent.TimeUnit;
/*
* This OpMode illustrates how to use the DFRobot HuskyLens.
*
* The HuskyLens is a Vision Sensor with a built-in object detection model. It can
* detect a number of predefined objects and AprilTags in the 36h11 family, can
* recognize colors, and can be trained to detect custom objects. See this website for
* documentation: https://wiki.dfrobot.com/HUSKYLENS_V1.0_SKU_SEN0305_SEN0336
*
* This sample illustrates how to detect AprilTags, but can be used to detect other types
* of objects by changing the algorithm. It assumes that the HuskyLens is configured with
* a name of "huskylens".
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*/
@TeleOp(name = "Sensor: HuskyLens", group = "Sensor")
@Disabled
public class SensorHuskyLens extends LinearOpMode {
private final int READ_PERIOD = 1;
private HuskyLens huskyLens;
@Override
public void runOpMode()
{
huskyLens = hardwareMap.get(HuskyLens.class, "huskylens");
/*
* This sample rate limits the reads solely to allow a user time to observe
* what is happening on the Driver Station telemetry. Typical applications
* would not likely rate limit.
*/
Deadline rateLimit = new Deadline(READ_PERIOD, TimeUnit.SECONDS);
/*
* Immediately expire so that the first time through we'll do the read.
*/
rateLimit.expire();
/*
* Basic check to see if the device is alive and communicating. This is not
* technically necessary here as the HuskyLens class does this in its
* doInitialization() method which is called when the device is pulled out of
* the hardware map. However, sometimes it's unclear why a device reports as
* failing on initialization. In the case of this device, it's because the
* call to knock() failed.
*/
if (!huskyLens.knock()) {
telemetry.addData(">>", "Problem communicating with " + huskyLens.getDeviceName());
} else {
telemetry.addData(">>", "Press start to continue");
}
/*
* The device uses the concept of an algorithm to determine what types of
* objects it will look for and/or what mode it is in. The algorithm may be
* selected using the scroll wheel on the device, or via software as shown in
* the call to selectAlgorithm().
*
* The SDK itself does not assume that the user wants a particular algorithm on
* startup, and hence does not set an algorithm.
*
* Users should, in general, explicitly choose the algorithm they want to use
* within the OpMode by calling selectAlgorithm() and passing it one of the values
* found in the enumeration HuskyLens.Algorithm.
*/
huskyLens.selectAlgorithm(HuskyLens.Algorithm.TAG_RECOGNITION);
telemetry.update();
waitForStart();
/*
* Looking for AprilTags per the call to selectAlgorithm() above. A handy grid
* for testing may be found at https://wiki.dfrobot.com/HUSKYLENS_V1.0_SKU_SEN0305_SEN0336#target_20.
*
* Note again that the device only recognizes the 36h11 family of tags out of the box.
*/
while(opModeIsActive()) {
if (!rateLimit.hasExpired()) {
continue;
}
rateLimit.reset();
/*
* All algorithms, except for LINE_TRACKING, return a list of Blocks where a
* Block represents the outline of a recognized object along with its ID number.
* ID numbers allow you to identify what the device saw. See the HuskyLens documentation
* referenced in the header comment above for more information on IDs and how to
* assign them to objects.
*
* Returns an empty array if no objects are seen.
*/
HuskyLens.Block[] blocks = huskyLens.blocks();
telemetry.addData("Block count", blocks.length);
for (int i = 0; i < blocks.length; i++) {
telemetry.addData("Block", blocks[i].toString());
}
telemetry.update();
}
}
}

View File

@ -29,46 +29,44 @@
package org.firstinspires.ftc.robotcontroller.external.samples;
import static com.qualcomm.hardware.rev.RevHubOrientationOnRobot.xyzOrientation;
import com.qualcomm.hardware.rev.RevHubOrientationOnRobot;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.IMU;
import org.firstinspires.ftc.robotcore.external.navigation.AngleUnit;
import org.firstinspires.ftc.robotcore.external.navigation.AngularVelocity;
import org.firstinspires.ftc.robotcore.external.navigation.Orientation;
import org.firstinspires.ftc.robotcore.external.navigation.YawPitchRollAngles;
/**
* {@link SensorIMUNonOrthogonal} shows how to use the new universal {@link IMU} interface. This
import static com.qualcomm.hardware.rev.RevHubOrientationOnRobot.xyzOrientation;
/*
* This OpMode shows how to use the new universal IMU interface. This
* interface may be used with the BNO055 IMU or the BHI260 IMU. It assumes that an IMU is configured
* on the robot with the name "imu".
* <p>
*
* The sample will display the current Yaw, Pitch and Roll of the robot.<br>
* With the correct orientation parameters selected, pitch/roll/yaw should act as follows:
* <p>
* Pitch value should INCREASE as the robot is tipped UP at the front. (Rotation about X) <br>
* Roll value should INCREASE as the robot is tipped UP at the left side. (Rotation about Y) <br>
* Yaw value should INCREASE as the robot is rotated Counter Clockwise. (Rotation about Z) <br>
* <p>
* Pitch value should INCREASE as the robot is tipped UP at the front. (Rotation about X)
* Roll value should INCREASE as the robot is tipped UP at the left side. (Rotation about Y)
* Yaw value should INCREASE as the robot is rotated Counter Clockwise. (Rotation about Z)
*
* The yaw can be reset (to zero) by pressing the Y button on the gamepad (Triangle on a PS4 controller)
* <p>
*
* This specific sample DOES NOT assume that the Hub is mounted on one of the three orthogonal
* planes (X/Y, X/Z or Y/Z) OR that the Hub has only been rotated in a range of 90 degree increments.
* <p>
*
* Note: if your Hub is mounted Orthogonally (on an orthogonal surface, angled at some multiple of
* 90 Degrees) then you should use the simpler SensorImuOrthogonal sample in this folder.
* <p>
*
* But... If your Hub is mounted Non-Orthogonally, you must specify one or more rotational angles
* that transform a "Default" Hub orientation into your desired orientation. That is what is
* illustrated here.
* <p>
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
* <p>
*
* Finally, edit this OpMode to use at least one angle around an axis to orient your Hub.
*/
@TeleOp(name = "Sensor: IMU Non-Orthogonal", group = "Sensor")

View File

@ -34,46 +34,44 @@ import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.IMU;
import org.firstinspires.ftc.robotcore.external.navigation.AngleUnit;
import org.firstinspires.ftc.robotcore.external.navigation.AngularVelocity;
import org.firstinspires.ftc.robotcore.external.navigation.YawPitchRollAngles;
/**
* {@link SensorIMUOrthogonal} shows how to use the new universal {@link IMU} interface. This
/*
* This OpMode shows how to use the new universal IMU interface. This
* interface may be used with the BNO055 IMU or the BHI260 IMU. It assumes that an IMU is configured
* on the robot with the name "imu".
* <p>
*
* The sample will display the current Yaw, Pitch and Roll of the robot.<br>
* With the correct orientation parameters selected, pitch/roll/yaw should act as follows:
* <p>
* Pitch value should INCREASE as the robot is tipped UP at the front. (Rotation about X) <br>
* Roll value should INCREASE as the robot is tipped UP at the left side. (Rotation about Y) <br>
* Yaw value should INCREASE as the robot is rotated Counter Clockwise. (Rotation about Z) <br>
* <p>
*
* The yaw can be reset (to zero) by pressing the Y button on the gamepad (Triangle on a PS4 controller)
* <p>
*
* This specific sample assumes that the Hub is mounted on one of the three orthogonal planes
* (X/Y, X/Z or Y/Z) and that the Hub has only been rotated in a range of 90 degree increments.
* <p>
*
* Note: if your Hub is mounted on a surface angled at some non-90 Degree multiple (like 30) look at
* the alternative SensorImuNonOrthogonal sample in this folder.
* <p>
*
* This "Orthogonal" requirement means that:
* <p>
*
* 1) The Logo printed on the top of the Hub can ONLY be pointing in one of six directions:
* FORWARD, BACKWARD, UP, DOWN, LEFT and RIGHT.
* <p>
*
* 2) The USB ports can only be pointing in one of the same six directions:<br>
* FORWARD, BACKWARD, UP, DOWN, LEFT and RIGHT.
* <p>
*
* So, To fully define how your Hub is mounted to the robot, you must simply specify:<br>
* logoFacingDirection<br>
* usbFacingDirection
* <p>
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
* <p>
*
* Finally, choose the two correct parameters to define how your Hub is mounted and edit this OpMode
* to use those parameters.
*/
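As a concrete sketch of the orthogonal case (the two direction values below are an illustrative choice, not a recommendation):

    // Example mounting: logo facing UP, USB ports facing FORWARD.
    RevHubOrientationOnRobot.LogoFacingDirection logoDirection =
            RevHubOrientationOnRobot.LogoFacingDirection.UP;
    RevHubOrientationOnRobot.UsbFacingDirection usbDirection =
            RevHubOrientationOnRobot.UsbFacingDirection.FORWARD;
    IMU imu = hardwareMap.get(IMU.class, "imu");
    imu.initialize(new IMU.Parameters(new RevHubOrientationOnRobot(logoDirection, usbDirection)));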

View File

@ -36,7 +36,6 @@ import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.Gyroscope;
import com.qualcomm.robotcore.hardware.IntegratingGyroscope;
import com.qualcomm.robotcore.util.ElapsedTime;
import org.firstinspires.ftc.robotcore.external.navigation.AngleUnit;
import org.firstinspires.ftc.robotcore.external.navigation.AngularVelocity;
import org.firstinspires.ftc.robotcore.external.navigation.AxesOrder;
@ -44,11 +43,11 @@ import org.firstinspires.ftc.robotcore.external.navigation.AxesReference;
import org.firstinspires.ftc.robotcore.external.navigation.Orientation;
/*
* This is an example LinearOpMode that shows how to use Kauai Labs navX Micro Robotics Navigation
* This OpMode shows how to use the Kauai Labs navX Micro Robotics Navigation
* Sensor. It assumes that the sensor is configured with a name of "navx".
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this opmode to the Driver Station OpMode list
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*/
@TeleOp(name = "Sensor: KL navX Micro", group = "Sensor")
@Disabled

View File

@ -32,7 +32,6 @@ package org.firstinspires.ftc.robotcontroller.external.samples;
import android.app.Activity;
import android.graphics.Color;
import android.view.View;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
@ -40,10 +39,10 @@ import com.qualcomm.robotcore.hardware.ColorSensor;
/*
*
* This is an example LinearOpMode that shows how to use
* This OpMode shows how to use
* a Modern Robotics Color Sensor.
*
* The op mode assumes that the color sensor
* The OpMode assumes that the color sensor
* is configured with a name of "sensor_color".
*
* You can use the X button on gamepad1 to toggle the LED on and off.
@ -88,7 +87,7 @@ public class SensorMRColor extends LinearOpMode {
// wait for the start button to be pressed.
waitForStart();
// while the op mode is active, loop and read the RGB data.
// while the OpMode is active, loop and read the RGB data.
// Note we use opModeIsActive() as our loop condition because it is an interruptible method.
while (opModeIsActive()) {

View File

@ -1,148 +0,0 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.hardware.modernrobotics.ModernRoboticsI2cCompassSensor;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.CompassSensor;
import com.qualcomm.robotcore.util.ElapsedTime;
import org.firstinspires.ftc.robotcore.external.navigation.Acceleration;
/**
* The {@link SensorMRCompass} op mode provides a demonstration of the
* functionality provided by the Modern Robotics compass sensor.
*
* The op mode assumes that the MR compass is configured with a name of "compass".
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this opmode to the Driver Station OpMode list
*
* @see <a href="http://www.modernroboticsinc.com/compass">MR Compass Sensor</a>
*/
@TeleOp(name = "Sensor: MR compass", group = "Sensor")
@Disabled // comment out or remove this line to enable this opmode
public class SensorMRCompass extends LinearOpMode {
ModernRoboticsI2cCompassSensor compass;
ElapsedTime timer = new ElapsedTime();
@Override public void runOpMode() {
// get a reference to our compass
compass = hardwareMap.get(ModernRoboticsI2cCompassSensor.class, "compass");
telemetry.log().setCapacity(20);
telemetry.log().add("The compass sensor operates quite well out-of-the");
telemetry.log().add("box, as shipped by the manufacturer. Precision can");
telemetry.log().add("however be somewhat improved with calibration.");
telemetry.log().add("");
telemetry.log().add("To calibrate the compass once the opmode is");
telemetry.log().add("started, make sure the compass is level, then");
telemetry.log().add("press 'A' on the gamepad. Next, slowly rotate the ");
telemetry.log().add("compass in a full 360 degree circle while keeping");
telemetry.log().add("it level. When complete, press 'B'.");
// wait for the start button to be pressed
waitForStart();
telemetry.log().clear();
while (opModeIsActive()) {
// If the A button is pressed, start calibration and wait for the A button to rise
if (gamepad1.a && !compass.isCalibrating()) {
telemetry.log().clear();
telemetry.log().add("Calibration started");
telemetry.log().add("Slowly rotate compass 360deg");
telemetry.log().add("Press 'B' when complete");
compass.setMode(CompassSensor.CompassMode.CALIBRATION_MODE);
timer.reset();
while (gamepad1.a && opModeIsActive()) {
doTelemetry();
idle();
}
}
// If the B button is pressed, stop calibration and wait for the B button to rise
if (gamepad1.b && compass.isCalibrating()) {
telemetry.log().clear();
telemetry.log().add("Calibration complete");
compass.setMode(CompassSensor.CompassMode.MEASUREMENT_MODE);
if (compass.calibrationFailed()) {
telemetry.log().add("Calibration failed");
compass.writeCommand(ModernRoboticsI2cCompassSensor.Command.NORMAL);
}
while (gamepad1.a && opModeIsActive()) {
doTelemetry();
idle();
}
}
doTelemetry();
}
}
protected void doTelemetry() {
if (compass.isCalibrating()) {
telemetry.addData("compass", "calibrating %s", Math.round(timer.seconds())%2==0 ? "|.." : "..|");
} else {
// getDirection() returns a traditional compass heading in the range [0,360),
// with values increasing in a CW direction
telemetry.addData("heading", "%.1f", compass.getDirection());
// getAcceleration() returns the current 3D acceleration experienced by
// the sensor. This is used internally to the sensor to compute its tilt and thence
// to correct the magnetometer reading to produce tilt-corrected values in getDirection()
Acceleration accel = compass.getAcceleration();
double accelMagnitude = Math.sqrt(accel.xAccel*accel.xAccel + accel.yAccel*accel.yAccel + accel.zAccel*accel.zAccel);
telemetry.addData("accel", accel);
telemetry.addData("accel magnitude", "%.3f", accelMagnitude);
// getMagneticFlux returns the 3D magnetic field flux experienced by the sensor
telemetry.addData("mag flux", compass.getMagneticFlux());
}
// the command register provides status data
telemetry.addData("command", "%s", compass.readCommand());
telemetry.update();
}
}

View File

@ -33,19 +33,17 @@ import com.qualcomm.hardware.modernrobotics.ModernRoboticsI2cGyro;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.Gyroscope;
import com.qualcomm.robotcore.hardware.IntegratingGyroscope;
import com.qualcomm.robotcore.util.ElapsedTime;
import org.firstinspires.ftc.robotcore.external.navigation.AngleUnit;
import org.firstinspires.ftc.robotcore.external.navigation.AngularVelocity;
import org.firstinspires.ftc.robotcore.external.navigation.AxesOrder;
import org.firstinspires.ftc.robotcore.external.navigation.AxesReference;
/*
* This is an example LinearOpMode that shows how to use the Modern Robotics Gyro.
* This OpMode shows how to use the Modern Robotics Gyro.
*
* The op mode assumes that the gyro sensor is attached to a Device Interface Module
* The OpMode assumes that the gyro sensor is attached to a Device Interface Module
* I2C channel and is configured with a name of "gyro".
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
@ -55,7 +53,7 @@ import org.firstinspires.ftc.robotcore.external.navigation.AxesReference;
@Disabled
public class SensorMRGyro extends LinearOpMode {
/** In this sample, for illustration purposes we use two interfaces on the one gyro object.
/* In this sample, for illustration purposes we use two interfaces on the one gyro object.
* That's likely atypical: you'll probably use one or the other in any given situation,
* depending on what you're trying to do. {@link IntegratingGyroscope} (and its base interface,
* {@link Gyroscope}) are common interfaces supported by possibly several different gyro
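The dual-interface idea reduces to roughly this sketch (one physical gyro, viewed two ways):

    ModernRoboticsI2cGyro modernRoboticsI2cGyro = hardwareMap.get(ModernRoboticsI2cGyro.class, "gyro");
    IntegratingGyroscope gyro = (IntegratingGyroscope) modernRoboticsI2cGyro;  // generic view
    int heading = modernRoboticsI2cGyro.getIntegratedZValue();                 // device-specific view
    AngularVelocity rates = gyro.getAngularVelocity(AngleUnit.DEGREES);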

View File

@ -35,12 +35,11 @@ import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.OpticalDistanceSensor;
/*
* This is an example LinearOpMode that shows how to use
* a Modern Robotics Optical Distance Sensor
* This OpMode shows how to use a Modern Robotics Optical Distance Sensor.
* It assumes that the ODS sensor is configured with a name of "sensor_ods".
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this opmode to the Driver Station OpMode list
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*/
@TeleOp(name = "Sensor: MR ODS", group = "Sensor")
@Disabled
@ -57,7 +56,7 @@ public class SensorMROpticalDistance extends LinearOpMode {
// wait for the start button to be pressed.
waitForStart();
// while the op mode is active, loop and read the light levels.
// while the OpMode is active, loop and read the light levels.
// Note we use opModeIsActive() as our loop condition because it is an interruptible method.
while (opModeIsActive()) {
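The read loop itself is little more than one call per reading (a sketch, assuming the config name "sensor_ods"):

    OpticalDistanceSensor odsSensor = hardwareMap.get(OpticalDistanceSensor.class, "sensor_ods");
    while (opModeIsActive()) {
        telemetry.addData("Raw", odsSensor.getRawLightDetected());
        telemetry.addData("Normal", odsSensor.getLightDetected());  // normalized 0.0 to 1.0
        telemetry.update();
    }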

View File

@ -33,22 +33,20 @@ import com.qualcomm.hardware.modernrobotics.ModernRoboticsI2cRangeSensor;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import org.firstinspires.ftc.robotcore.external.navigation.DistanceUnit;
/**
* {@link SensorMRRangeSensor} illustrates how to use the Modern Robotics
* Range Sensor.
/*
* This OpMode illustrates how to use the Modern Robotics Range Sensor.
*
* The op mode assumes that the range sensor is configured with a name of "sensor_range".
* The OpMode assumes that the range sensor is configured with a name of "sensor_range".
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this opmode to the Driver Station OpMode list
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*
* @see <a href="http://modernroboticsinc.com/range-sensor">MR Range Sensor</a>
*/
@TeleOp(name = "Sensor: MR range sensor", group = "Sensor")
@Disabled // comment out or remove this line to enable this opmode
@Disabled // comment out or remove this line to enable this OpMode
public class SensorMRRangeSensor extends LinearOpMode {
ModernRoboticsI2cRangeSensor rangeSensor;
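Reading the sensor is then straightforward (a sketch, assuming the config name "sensor_range"):

    rangeSensor = hardwareMap.get(ModernRoboticsI2cRangeSensor.class, "sensor_range");
    waitForStart();
    while (opModeIsActive()) {
        telemetry.addData("raw ultrasonic", rangeSensor.rawUltrasonic());
        telemetry.addData("cm", "%.2f cm", rangeSensor.getDistance(DistanceUnit.CM));
        telemetry.update();
    }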

View File

@ -33,38 +33,36 @@ THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.hardware.rev.Rev2mDistanceSensor;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.DistanceSensor;
import org.firstinspires.ftc.robotcore.external.navigation.DistanceUnit;
/**
* {@link SensorREV2mDistance} illustrates how to use the REV Robotics
* Time-of-Flight Range Sensor.
/*
* This OpMode illustrates how to use the REV Robotics 2M Distance Sensor.
*
* The op mode assumes that the range sensor is configured with a name of "sensor_range".
* The OpMode assumes that the sensor is configured with a name of "sensor_distance".
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this opmode to the Driver Station OpMode list
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*
* @see <a href="http://revrobotics.com">REV Robotics Web Page</a>
* See the sensor's product page: https://www.revrobotics.com/rev-31-1505/
*/
@TeleOp(name = "Sensor: REV2mDistance", group = "Sensor")
@Disabled
public class SensorREV2mDistance extends LinearOpMode {
private DistanceSensor sensorRange;
private DistanceSensor sensorDistance;
@Override
public void runOpMode() {
// you can use this as a regular DistanceSensor.
sensorRange = hardwareMap.get(DistanceSensor.class, "sensor_range");
sensorDistance = hardwareMap.get(DistanceSensor.class, "sensor_distance");
// you can also cast this to a Rev2mDistanceSensor if you want to use added
// methods associated with the Rev2mDistanceSensor class.
Rev2mDistanceSensor sensorTimeOfFlight = (Rev2mDistanceSensor)sensorRange;
Rev2mDistanceSensor sensorTimeOfFlight = (Rev2mDistanceSensor) sensorDistance;
telemetry.addData(">>", "Press start to continue");
telemetry.update();
@ -72,11 +70,11 @@ public class SensorREV2mDistance extends LinearOpMode {
waitForStart();
while(opModeIsActive()) {
// generic DistanceSensor methods.
telemetry.addData("deviceName",sensorRange.getDeviceName() );
telemetry.addData("range", String.format("%.01f mm", sensorRange.getDistance(DistanceUnit.MM)));
telemetry.addData("range", String.format("%.01f cm", sensorRange.getDistance(DistanceUnit.CM)));
telemetry.addData("range", String.format("%.01f m", sensorRange.getDistance(DistanceUnit.METER)));
telemetry.addData("range", String.format("%.01f in", sensorRange.getDistance(DistanceUnit.INCH)));
telemetry.addData("deviceName", sensorDistance.getDeviceName() );
telemetry.addData("range", String.format("%.01f mm", sensorDistance.getDistance(DistanceUnit.MM)));
telemetry.addData("range", String.format("%.01f cm", sensorDistance.getDistance(DistanceUnit.CM)));
telemetry.addData("range", String.format("%.01f m", sensorDistance.getDistance(DistanceUnit.METER)));
telemetry.addData("range", String.format("%.01f in", sensorDistance.getDistance(DistanceUnit.INCH)));
// Rev2mDistanceSensor specific methods.
telemetry.addData("ID", String.format("%x", sensorTimeOfFlight.getModelID()));

View File

@ -32,50 +32,44 @@ package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.IrSeekerSensor;
import com.qualcomm.robotcore.hardware.TouchSensor;
/*
* This is an example LinearOpMode that shows how to use
* the Modern Robotics ITR Seeker
* This OpMode demonstrates how to use a REV Robotics Touch Sensor, REV Robotics Magnetic Limit Switch, or other device
* that implements the TouchSensor interface. Any touch sensor that connects its output to ground when pressed
* (known as "active low") can be configured as a "REV Touch Sensor". This includes REV's Magnetic Limit Switch.
*
* The op mode assumes that the IR Seeker
* is configured with a name of "sensor_ir".
* The OpMode assumes that the touch sensor is configured with a name of "sensor_touch".
*
* Set the switch on the Modern Robotics IR beacon to 1200 at 180. <br>
* Turn on the IR beacon.
* Make sure the side of the beacon with the LED on is facing the robot. <br>
* A REV Robotics Touch Sensor must be configured on digital port number 1, 3, 5, or 7.
* A Magnetic Limit Switch can be configured on any digital port.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this opmode to the Driver Station OpMode list
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*/
@TeleOp(name = "Sensor: MR IR Seeker", group = "Sensor")
@TeleOp(name = "Sensor: REV touch sensor", group = "Sensor")
@Disabled
public class SensorMRIrSeeker extends LinearOpMode {
public class SensorTouch extends LinearOpMode {
TouchSensor touchSensor; // Touch sensor Object
@Override
public void runOpMode() {
IrSeekerSensor irSeeker; // Hardware Device Object
// get a reference to our GyroSensor object.
irSeeker = hardwareMap.get(IrSeekerSensor.class, "sensor_ir");
// get a reference to our touchSensor object.
touchSensor = hardwareMap.get(TouchSensor.class, "sensor_touch");
// wait for the start button to be pressed.
waitForStart();
// while the OpMode is active, loop and read whether the sensor is being pressed.
// Note we use opModeIsActive() as our loop condition because it is an interruptible method.
while (opModeIsActive()) {
// Ensure we have a IR signal
if (irSeeker.signalDetected())
{
// Display angle and strength
telemetry.addData("Angle", irSeeker.getAngle());
telemetry.addData("Strength", irSeeker.getStrength());
}
else
{
// Display loss of signal
telemetry.addData("Seeker", "Signal Lost");
// send the info back to driver station using telemetry function.
if (touchSensor.isPressed()) {
telemetry.addData("Touch Sensor", "Is Pressed");
} else {
telemetry.addData("Touch Sensor", "Is Not Pressed");
}
telemetry.update();
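Stripped of the diff interleaving, the new TouchSensor sample is essentially this sketch:

    TouchSensor touchSensor = hardwareMap.get(TouchSensor.class, "sensor_touch");
    waitForStart();
    while (opModeIsActive()) {
        // isPressed() hides the active-low wiring detail behind the TouchSensor interface.
        telemetry.addData("Touch Sensor", touchSensor.isPressed() ? "Is Pressed" : "Is Not Pressed");
        telemetry.update();
    }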

View File

@ -0,0 +1,127 @@
/*
* Copyright (c) 2023 FIRST
*
* All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to
* endorse or promote products derived from this software without specific prior
* written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR
* TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF
* THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import android.util.Size;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import org.firstinspires.ftc.robotcore.external.hardware.camera.BuiltinCameraDirection;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.firstinspires.ftc.vision.VisionPortal;
import java.util.Locale;
/*
* This OpMode helps calibrate a webcam or RC phone camera, useful for AprilTag pose estimation
* with the FTC VisionPortal. It captures a camera frame (image) and stores it on the Robot Controller
* (Control Hub or RC phone), with each press of the gamepad button X (or Square).
* Full calibration instructions are here:
*
* https://ftc-docs.firstinspires.org/camera-calibration
*
* In Android Studio, copy this class into your "teamcode" folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*
* In OnBot Java, use "Add File" to add this OpMode from the list of Samples.
*/
@TeleOp(name = "Utility: Camera Frame Capture", group = "Utility")
@Disabled
public class UtilityCameraFrameCapture extends LinearOpMode
{
/*
* EDIT THESE PARAMETERS AS NEEDED
*/
final boolean USING_WEBCAM = false;
final BuiltinCameraDirection INTERNAL_CAM_DIR = BuiltinCameraDirection.BACK;
final int RESOLUTION_WIDTH = 640;
final int RESOLUTION_HEIGHT = 480;
// Internal state
boolean lastX;
int frameCount;
long capReqTime;
@Override
public void runOpMode()
{
VisionPortal portal;
if (USING_WEBCAM)
{
portal = new VisionPortal.Builder()
.setCamera(hardwareMap.get(WebcamName.class, "Webcam 1"))
.setCameraResolution(new Size(RESOLUTION_WIDTH, RESOLUTION_HEIGHT))
.build();
}
else
{
portal = new VisionPortal.Builder()
.setCamera(INTERNAL_CAM_DIR)
.setCameraResolution(new Size(RESOLUTION_WIDTH, RESOLUTION_HEIGHT))
.build();
}
while (!isStopRequested())
{
boolean x = gamepad1.x;
if (x && !lastX)
{
portal.saveNextFrameRaw(String.format(Locale.US, "CameraFrameCapture-%06d", frameCount++));
capReqTime = System.currentTimeMillis();
}
lastX = x;
telemetry.addLine("######## Camera Capture Utility ########");
telemetry.addLine(String.format(Locale.US, " > Resolution: %dx%d", RESOLUTION_WIDTH, RESOLUTION_HEIGHT));
telemetry.addLine(" > Press X (or Square) to capture a frame");
telemetry.addData(" > Camera Status", portal.getCameraState());
if (capReqTime != 0)
{
telemetry.addLine("\nCaptured Frame!");
}
if (capReqTime != 0 && System.currentTimeMillis() - capReqTime > 1000)
{
capReqTime = 0;
}
telemetry.update();
}
}
}

View File

@ -37,7 +37,7 @@ import com.qualcomm.robotcore.eventloop.opmode.OpModeRegister;
import org.firstinspires.ftc.robotcontroller.external.samples.ConceptNullOp;
/**
* {@link FtcOpModeRegister} is responsible for registering opmodes for use in an FTC game.
* {@link FtcOpModeRegister} is responsible for registering OpModes for use in an FTC game.
* @see #register(OpModeManager)
*/
public class FtcOpModeRegister implements OpModeRegister {
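Registration itself is one call per OpMode inside register(); a sketch (manual registration is optional, since annotated OpModes register automatically):

    @Override
    public void register(OpModeManager manager) {
        // Un-annotated OpModes can be registered by hand, e.g.:
        manager.register("ConceptNullOp", ConceptNullOp.class);
    }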

View File

@ -625,8 +625,8 @@ public class FtcRobotControllerActivity extends Activity
}
/**
* Updates the orientation of monitorContainer (which contains cameraMonitorView and
* tfodMonitorView) based on the given configuration. Makes the children split the space.
* Updates the orientation of monitorContainer (which contains cameraMonitorView)
* based on the given configuration. Makes the children split the space.
*/
private void updateMonitorLayout(Configuration configuration) {
LinearLayout monitorContainer = (LinearLayout) findViewById(R.id.monitorContainer);

View File

@ -140,13 +140,6 @@ OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
android:layout_weight="1"
android:orientation="vertical"
/>
<FrameLayout
android:id="@+id/tfodMonitorViewId"
android:visibility="gone"
android:layout_width="match_parent"
android:layout_height="0dp"
android:layout_weight="1"
/>
</LinearLayout>

IMU.md
View File

@ -0,0 +1,10 @@
# IMU
## Current list of IMUs
| Robot Name | Control Hub Firmware | Control Hub IMU | Expansion Hub? | Expansion Hub Firmware | Expansion Hub IMU |
|--------------------|----------------------|-----------------|-----------------|------------------------|-------------------|
| 14493-RC | 1.8.2 | None | Yes | 1.8.2 | None |
| 14493-b-RC | 1.8.2 | None | Yes | 1.8.2 | None |
| 14493-RC-Chassis | 1.8.2 | BNO055 | No | N/A | N/A |

README.md
View File

@ -1,10 +1,15 @@
## NOTICE
This repository contains the public FTC SDK for the POWERPLAY (2022-2023) competition season.
This repository contains the public FTC SDK for the CENTERSTAGE (2023-2024) competition season.
## Welcome!
This GitHub repository contains the source code that is used to build an Android app to control a *FIRST* Tech Challenge competition robot. To use this SDK, download/clone the entire project to your local computer.
## Requirements
To use this Android Studio project, you will need Android Studio 2021.2 (codename Chipmunk) or later.
To program your robot in Blocks or OnBot Java, you do not need Android Studio.
## Getting Started
If you are new to robotics or new to the *FIRST* Tech Challenge, then you should consider reviewing the [FTC Blocks Tutorial](https://ftc-docs.firstinspires.org/programming_resources/blocks/Blocks-Tutorial.html) to get familiar with how to use the control system:
@ -54,6 +59,171 @@ The readme.md file located in the [/TeamCode/src/main/java/org/firstinspires/ftc
# Release Information
## Version 9.1 (20240215-115542)
### Enhancements
* Fixes a problem with Blocks: if the user closes a Block's warning balloon, it will still be closed next time the project is opened in the Blocks editor.
* In the Blocks editor, an alert concerning missing hardware devices is not shown if all the Blocks that use the missing hardware devices are disabled.
* Adds Blocks to support comparing property values CRServo.Direction, DCMotor.Direction, DCMotor.Mode, DCMotor.ZeroPowerBehavior, DigitalChannel.Mode, GyroSensor.HeadingMode, IrSeekerSensor.Mode, and Servo.Direction, to the corresponding enum Block.
* Improves OnBotJava auto-import to correctly import classes when used in certain situations.
* Improves OnBotJava autocomplete to provide better completion options in most cases.
* This fixes an issue where autocomplete would fail if a method with two or more formal parameters was defined.
* In OnBotJava, code folding support was added to expand and collapse code sections
* In OnBotJava, the copyright header is now automatically collapsed when loading new files
* For all Blocks OpMode samples, intro comments have been moved to the RunOpMode comment balloon.
* The Clean up Blocks command in the Blocks editor now positions function Blocks so their comment balloons don't overlap other function Blocks.
* Added Blocks OpMode sample SensorTouch.
* Added Java OpMode sample SensorDigitalTouch.
* Several improvements to VisionPortal
* Adds option to control whether the stream is automatically started following a `.build()` call on a VisionPortal Builder
* Adds option to control whether the vision processing statistics overlay is rendered or not
* VisionPortals now implement the `CameraStreamSource` interface, allowing multiportal users to select which portal is routed to the DS in INIT by calling CameraStreamServer.getInstance().setSource(visionPortal). Can be selected via gamepad, between Camera Stream sessions.
* Add option to `AprilTagProcessor` to suppress calibration warnings
* Improves camera calibration warnings
* If a calibration is scaled, the resolution it was scaled from will be listed
* If calibrations exist with the wrong aspect ratio, the calibrated resolutions will be listed
* Fixes race condition which caused app crash when calling `stopStreaming()` immediately followed by `close()` on a VisionPortal
* Fixes IllegalStateException when calling `stopStreaming()` immediately after building a VisionPortal
* Added FTC Blocks counterparts to new Java methods:
* VisionPortal.Builder.setAutoStartStreamOnBuild
* VisionPortal.Builder.setShowStatsOverlay
* AprilTagProcessor.Builder.setSuppressCalibrationWarnings
* CameraStreamServer.setSource
### Bug Fixes
* Fixes a problem where OnBotJava does not apply font size settings to the editor.
* Updates EasyOpenCV dependency to v1.7.1
* Fixes inability to use EasyOpenCV CameraFactory in OnBotJava
* Fixes entire RC app crash when user pipeline throws an exception
* Fixes entire RC app crash when a user canvas annotator throws an exception
* Use the modern stacktrace display when handling user exceptions instead of the legacy ESTOP telemetry message
## Version 9.0.1 (20230929-083754)
### Enhancements
* Updates AprilTag samples to include Decimation and additional Comments. Also corrects misleading tag ID warnings
* Increases maximum size of Blocks inline comments to 140 characters
* Adds Blocks sample BasicOmniOpMode.
* Updated CENTERSTAGE library AprilTag orientation quaternions
* Thanks [@FromenActual](https://github.com/FromenActual)
* Updated Java Sample ConceptTensorFlowObjectDetection.java to include missing elements needed for custom model support.
### Bug Fixes
* Fixes a problem where after October 1 the Driver Station will report as obsolete on v9.0 and prompt the user to update.
## Version 9.0 (20230830-154348)
### Breaking Changes
* Removes Vuforia
* Fields in `AprilTagDetection` and `AprilTagPose(ftc/raw)` objects are now `final`
* VisionPortal builder method `setCameraMonitorViewId()` has been renamed to `setLiveViewContainerId()` and `enableCameraMonitoring()` has been renamed to `enableLiveView()`
### Enhancements
* Adds support for the DFRobot HuskyLens Vision Sensor.
* Blocks teams can now perform webcam calibration.
* Added a Block for System.currentTimeMillis (under Utilities/Time)
* Added a Block for VisionPortal.saveNextFrameRaw (under Vision/VisionPortal)
* Added a new sample Blocks OpMode called UtilityCameraFrameCapture.
* The RobotDriveByGyro sample has been updated to use the new universal IMU interface. It now supports both IMU types.
* Removed some error-prone ElapsedTime Blocks from the Blocks editor's toolbox. This is not a
breaking change: old Blocks OpModes that use these Blocks will still function, both in the
Blocks editor and at runtime.
* Standardizes on the form "OpMode" for the term OpMode.
* The preferred way to refer to OpModes that specifically extend `LinearOpMode` (including Blocks OpModes) is "linear OpMode".
* The preferred way to refer to OpModes that specifically extend `OpMode` directly is "iterative OpMode".
* Overhauls `OpMode` and `LinearOpMode` Javadoc comments to be easier to read and include more detail.
* Makes minor enhancements to Java samples
* Javadoc comments in samples that could be rendered badly in Android Studio have been converted to standard multi-line comments
* Consistency between samples has been improved
* The SensorDigitalTouch sample has been replaced with a new SensorTouch sample that uses the `TouchSensor` interface instead of `DigitalChannel`.
* The ConceptCompassCalibration, SensorMRCompass, and SensorMRIRSeeker samples have been deleted, as they are not useful for modern FTC competitions.
### Bug Fixes
* Fixes a bug which prevented PlayStation gamepads from being used in bluetooth mode. Bluetooth is NOT legal for competition but may be useful to allow a DS device to be used while charging, or at an outreach event.
* Fixes a bug where a Blocks OpMode's Date Modified value can change to December 31, 1969, if the Control Hub is rebooted while the Blocks OpMode is being edited.
* Fixes the automatic TeleOp preselection feature (was broken in 8.2)
* Fixes a bug where passing an integer number such as 123 to the Telemetry.addData block that takes a number shows up as 123.0 in the telemetry.
* Fixes OnBotJava autocomplete issues:
* Autocomplete would incorrectly provide values for the current class when autocompleting a local variable
* `hardwareMap` autocomplete would incorrectly include lambda class entries
* Fixes OnBotJava not automatically importing classes.
* Fixes OnBotJava tabs failing to close when their file is deleted.
* Fixes a project view refresh not happening when a file is renamed in OnBotJava.
* Fixes the "Download" context menu item for external libraries in the OnBotJava interface.
* Fixes issue where Driver Station telemetry would intermittently freeze when set to Monospace mode.
* Fixes performance regression for certain REV Hub operations that was introduced in version 8.2.
* Fixes TagID comparison logic in DriveToTag samples.
## Version 8.2 (20230707-131020)
### Breaking Changes
* Non-linear (iterative) OpModes are no longer allowed to manipulate actuators in their `stop()` method. Attempts to do so will be ignored and logged.
* When an OpMode attempts to illegally manipulate an actuator, the Robot Controller will print a log message
including the text `CANCELLED_FOR_SAFETY`.
* Additionally, LinearOpModes are no longer able to regain the ability to manipulate actuators by removing their
thread's interrupt or using another thread.
* Removes support for Android version 6.0 (Marshmallow). The minSdkVersion is now 24.
* Increases the Robocol version.
* This means an 8.2 or later Robot Controller or Driver Station will not be able to communicate with an 8.1 or earlier Driver Station or Robot Controller.
* If you forget to update both apps at the same time, an error message will be shown explaining which app is older and should be updated.
* FTC_FieldCoordinateSystemDefinition.pdf has been moved. It is still in the git history, but has been removed from the git snapshot corresponding with the 8.2 tag. The official version now lives at [Field Coordinate System](https://ftc-docs.firstinspires.org/field-coordinate-system).
* `LynxUsbDevice.addConfiguredModule()` and `LynxUsbDevice.getConfiguredModule()` have been replaced with `LynxUsbDevice.getOrAddModule()`.
* Old Blocks for Vuforia and TensorFlow Object Detection are obsolete and have been removed from the
Blocks editor's toolbox. Existing Blocks OpModes that contain the old Blocks for Vuforia or
TensorFlow Object Detection can be opened in the Blocks editor, but running them will not work.
### New features
* Adds new `VisionPortal` API for computer vision (see the usage sketch after this feature list)
* **This API may be subject to change for final kickoff release!**
* Several new samples added.
* Adds support for detecting AprilTags.
* `VisionPortal` is the new entry point for both AprilTag and TFOD processing.
* Vuforia will be removed in a future release.
* Updated TensorFlow dependencies.
* Added support for webcam camera controls to blocks.
* The Blocks editor's toolbox now has a Vision category, directly above the Utilities category.
* Related documentation for associated technologies can be found at
* [AprilTag Introduction](https://ftc-docs.firstinspires.org/apriltag-intro)
* [AprilTag SDK Guide](https://ftc-docs.firstinspires.org/apriltag-sdk)
* [AprilTag Detection Values](https://ftc-docs.firstinspires.org/apriltag-detection-values)
* [AprilTag Test Images](https://ftc-docs.firstinspires.org/apriltag-test-images)
* [Camera Calibration](https://ftc-docs.firstinspires.org/camera-calibration)
* Adds Driver Station support for Logitech Dual Action and Sony PS5 DualSense gamepads.
* This **does not** include support for the Sony PS5 DualSense Edge gamepad.
* Always refer to Game Manual 1 to determine gamepad legality in competition.
* Adds support for MJPEG payload streaming to UVC driver (external JPEG decompression routine required for use).
* Shows a hint on the Driver Station UI about how to bind a gamepad when buttons are pressed or the sticks are moved on an unbound gamepad.
* Adds option for fullscreening "Camera Stream" on Driver Station.
* OnBotJava source code is automatically saved as a ZIP file on every build with a rolling window of the last 30 builds kept; allows recovering source code from previous builds if code is accidentally deleted or corrupted.
* Adds support for changing the addresses of Expansion Hubs that are not connected directly via USB.
* The Expansion Hub Address Change screen now has an Apply button that changes the addresses without leaving the screen.
* Addresses that are assigned to other hubs connected to the same USB connection or Control Hub are no longer able to be selected.
* Increases maximum size of Blocks inline comments to 100 characters
* Saves position of open Blocks comment balloons
* Adds new AprilTag Driving samples: RobotDriveToAprilTagTank & RobotDriveToAprilTagOmni
* Adds Sample to illustrate optimizing camera exposure for AprilTags: ConceptAprilTagOptimizeExposure
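In code, the new API composes a camera and one or more processors into a portal, roughly like this (a sketch; the webcam name "Webcam 1" is illustrative):

    AprilTagProcessor aprilTag = new AprilTagProcessor.Builder().build();
    VisionPortal portal = new VisionPortal.Builder()
            .setCamera(hardwareMap.get(WebcamName.class, "Webcam 1"))
            .addProcessor(aprilTag)
            .build();
    while (opModeIsActive()) {
        for (AprilTagDetection detection : aprilTag.getDetections()) {
            telemetry.addData("Tag", detection.id);
        }
        telemetry.update();
    }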
### Bug Fixes
* Corrects inspection screen to report app version using the SDK version defined in the libraries instead of the version specified in `AndroidManifest.xml`. This corrects the case where the app could show matching versions numbers to the user but still state that the versions did not match.
* If the version specified in `AndroidManifest.xml` does not match the SDK version, an SDK version entry will be displayed on the Manage webpage.
* Fixes no error being displayed when saving a configuration file with duplicate names from the Driver Station.
* Fixes a deadlock in the UVC driver which manifested in https://github.com/OpenFTC/EasyOpenCV/issues/57.
* Fixes a deadlock in the UVC driver that could occur when hot-plugging cameras.
* Fixes UVC driver compatibility with Arducam OV9281 global shutter camera.
* Fixes Emergency Stop condition when an OnBotJava build with duplicate OpMode names occurs.
* Fixes known causes of "Attempted use of a closed LynxModule instance" logspam.
* Fixes the visual identification LED pattern when configuring Expansion Hubs connected via RS-485.
## Version 8.1.1 (20221201-150726)
This is a bug fix only release to address the following four issues.
* [Issue #492](https://github.com/FIRST-Tech-Challenge/FtcRobotController/issues/492) - Can't create new blocks opmodes.
* [Issue #495](https://github.com/FIRST-Tech-Challenge/FtcRobotController/issues/495) - Remove the final modifier from the OpMode's Telemetry object.
* [Issue #500](https://github.com/FIRST-Tech-Challenge/FtcRobotController/issues/500) - Some devices cannot be configured when the Driver Station app has been updated to 8.1
* Updating either the Robot Controller app or the Driver Station app to 8.1.1 or later will fix this issue.
* The Modern Robotics touch sensor was configurable as a Digital Device. It can only be used as an Analog Device.
## Version 8.1 (20221121-115119)
### Breaking Changes
@ -243,8 +413,8 @@ The readme.md file located in the [/TeamCode/src/main/java/org/firstinspires/ftc
* External libraries can provide support for hardware devices by using the annotation in the
com.qualcomm.robotcore.hardware.configuration.annotations package.
* External libraries can include .so files for native code.
* External libraries can be used from OnBotJava op modes.
* External libraries that use the following annotations can be used from Blocks op modes.
* External libraries can be used from OnBotJava OpModes.
* External libraries that use the following annotations can be used from Blocks OpModes.
* org.firstinspires.ftc.robotcore.external.ExportClassToBlocks
* org.firstinspires.ftc.robotcore.external.ExportToBlocks
* External libraries that use the following annotations can add new hardware devices:
@ -374,8 +544,8 @@ The readme.md file located in the [/TeamCode/src/main/java/org/firstinspires/ftc
Position PIDF values: `P = 5.0`
### New features
* Includes TensorFlow inference model and sample op modes to detect Ultimate Goal Starter Stacks (four rings vs single ring stack).
* Includes Vuforia Ultimate Goal vision targets and sample op modes.
* Includes TensorFlow inference model and sample OpModes to detect Ultimate Goal Starter Stacks (four rings vs single ring stack).
* Includes Vuforia Ultimate Goal vision targets and sample OpModes.
* Introduces a digital zoom feature for TensorFlow object detection (to detect objects more accurately at greater distances).
* Adds configuration entry for the REV UltraPlanetary HD Hex motor
@ -591,7 +761,7 @@ Version 5.5 requires Android Studio 4.0 or later.
* Large dropdown lists display properly on lower res screens
* Disabled buttons are now visually identifiable as disabled
* A warning is shown if a user selects a TFOD sample, but their device is not compatible
* Warning messages in a Blocks op mode are now visible by default.
* Warning messages in a Blocks OpMode are now visible by default.
* Adds goBILDA 5201 and 5202 motors to Robot Configurator
* Adds PIDF Annotation values to AndyMark, goBILDA and TETRIX motor configurations.
This has the effect of causing the RUN_USING_ENCODERS and RUN_TO_POSITION modes to use
@ -609,12 +779,12 @@ Version 5.5 requires Android Studio 4.0 or later.
## Version 5.2 (20190905-083277)
* Fixes extra-wide margins on settings activities, and placement of the new configuration button
* Adds Skystone Vuforia image target data.
* Includes sample Skystone Vuforia Navigation op modes (Java).
* Includes sample Skystone Vuforia Navigation op modes (Blocks).
* Includes sample Skystone Vuforia Navigation OpModes (Java).
* Includes sample Skystone Vuforia Navigation OpModes (Blocks).
* Adds TensorFlow inference model (.tflite) for Skystone game elements.
* Includes sample Skystone TensorFlow op modes (Java).
* Includes sample Skystone TensorFlow op modes (Blocks).
* Removes older (season-specific) sample op modes.
* Includes sample Skystone TensorFlow OpModes (Java).
* Includes sample Skystone TensorFlow OpModes (Blocks).
* Removes older (season-specific) sample OpModes.
* Includes 64-bit support (to comply with [Google Play requirements](https://android-developers.googleblog.com/2019/01/get-your-apps-ready-for-64-bit.html)).
* Protects against Stuck OpModes when a Restart Robot is requested. (Thanks to FROGbots-4634) ([ftc_app issue #709](https://github.com/ftctechnh/ftc_app/issues/709))
* Blocks related changes:
@ -687,10 +857,10 @@ Known issues:
* Block Development Tool Changes
- Includes a fix for a problem with the Velocity blocks that were reported in the FTC Technology forum (Blocks Programming subforum).
- Change the "Save completed successfully." message to a white color so it will contrast with a green background.
- Fixed the "Download image" feature so it will work if there are text blocks in the op mode.
- Fixed the "Download image" feature so it will work if there are text blocks in the OpMode.
* Introduces support for Google's TensorFlow Lite technology for object detection for the 2018-2019 game.
- TensorFlow lite can recognize Gold Mineral and Silver Mineral from 2018-2019 game.
- Example Java and Block op modes are included to show how to determine the relative position of the gold block (left, center, right).
- Example Java and Block OpModes are included to show how to determine the relative position of the gold block (left, center, right).
## Version 4.1 (released on 18.09.24)
@ -699,7 +869,7 @@ Changes include:
* Change to allow FTC Robot Controller APK to be auto-updated using FIRST Global Control Hub update scripts.
* Removed samples for non supported / non legal hardware.
* Improvements to Telemetry.addData block with "text" socket.
* Updated Blocks sample op mode list to include Rover Ruckus Vuforia example.
* Updated Blocks sample OpMode list to include Rover Ruckus Vuforia example.
* Update SDK library version number.
## Version 4.0 (released on 18.09.12)
@ -769,8 +939,8 @@ Changes include:
+ If enabled, user provides a "Match Number" through the Driver Station user interface (top of the screen).
* The Match Number is used to create a log file specifically with log statements from that particular OpMode run.
* Match log files are stored in /sdcard/FIRST/matlogs on the Robot Controller.
* Once an OpMode run is complete, the Match Number is cleared.
* This is a convenient way to create a separate match log with statements only related to a specific OpMode run.
* New Devices
- Support for REV Robotics Blinkin LED Controller.
@ -810,9 +980,9 @@ Known issues:
Changes include:
* Blocks Changes
- Uses updated Google Blockly software to allow users to edit their OpModes on Apple iOS devices (including iPad and iPhone).
- Improvement in Blocks tool to handle corrupt OpMode files.
- Autonomous OpModes should no longer get switched back to tele-op after re-opening them to be edited.
- The system can now detect type mismatches during runtime and alert the user with a message on the Driver Station.
* Updated javadoc documentation for setPower() method to reflect correct range of values (-1 to +1).
* Modified VuforiaLocalizerImpl to allow for user rendering of frames
@ -821,11 +991,11 @@ Changes include:
## Version 3.5 (built on 17.10.30)
Changes with version 3.5 include:
* Introduced a fix to prevent random OpMode stops, which can occur after the Robot Controller app has been paused and then resumed (for example, when a user temporarily turns off the display of the Robot Controller phone, and then turns the screen back on).
* Introduced a fix to prevent random OpMode stops, which were previously caused by random peer disconnect events on the Driver Station.
* Fixes issue where log files would be closed on pause of the RC or DS, but not re-opened upon resume.
* Fixes issue with battery handler (voltage) start/stop race.
* Fixes issue where Android Studio generated OpModes would disappear from available list in certain situations.
* Fixes problem where OnBot Java would not build on REV Robotics Control Hub.
* Fixes problem where OnBot Java would not build if the date and time on the Robot Controller device was "rewound" (set to an earlier date/time).
* Improved error message on OnBot Java that occurs when renaming a file fails.
@ -836,8 +1006,8 @@ Changes with version 3.5 include:
Changes with version 3.4 include:
* Added telemetry.update() statement for BlankLinearOpMode template.
* Renamed sample Block OpModes to be more consistent with Java samples.
* Added some additional sample Block OpModes.
* Reworded OnBot Java readme slightly.
## Version 3.3 (built on 17.09.04)
@ -855,14 +1025,14 @@ Changes with version 3.3 include:
- Added support for VuMarks that will be used for the 2017-2018 season game.
* Blocks
- Update to latest Google Blockly release.
- Sample OpModes can be selected as a template when creating new OpMode.
- Fixed bug where the blocks would disappear temporarily when mouse button is held down.
- Added blocks for Range.clip and Range.scale.
- User can now disable/enable Block OpModes.
- Fix to prevent occasional Blocks deadlock.
* OnBot Java
- Significant improvements with autocomplete function for OnBot Java editor.
- Sample OpModes can be selected as a template when creating new OpMode.
- Fixes and changes to complete hardware setup feature.
- Updated (and more useful) onBot welcome message.
@ -870,13 +1040,13 @@ Known issues:
* Android Studio
- After updating to the new v3.3 Android Studio project folder, if you get error messages indicating "InvalidVirtualFileAccessException" then you might need to do a File->Invalidate Caches / Restart to clear the error.
* OnBot Java
- Sometimes when you push the build button to build all OpModes, the RC returns an error message that the build failed. If you press the build button a second time, the build typically succeeds.
## Version 3.2 (built on 17.08.02)
This version of the software introduces the "OnBot Java" Development Tool. Similar to the FTC Blocks Development Tool, the FTC OnBot Java Development Tool allows a user to create, edit and build op modes dynamically using only a Javascript-enabled web browser.
This version of the software introduces the "OnBot Java" Development Tool. Similar to the FTC Blocks Development Tool, the FTC OnBot Java Development Tool allows a user to create, edit and build OpModes dynamically using only a Javascript-enabled web browser.
The OnBot Java Development Tool is an integrated development environment (IDE) that is served up by the Robot Controller. OpModes are created and edited using a Javascript-enabled browser (Google Chrome is recommended). OpModes are saved on the Robot Controller Android device directly.
The OnBot Java Development Tool provides a Java programming environment that does NOT need Android Studio.
@ -894,7 +1064,7 @@ Changes with version 3.2 include:
- Fixed tooltip for ColorSensor.isLightOn block.
- Added blocks for ColorSensor.getNormalizedColors and LynxI2cColorRangeSensor.getNormalizedColors.
* Added example OpModes for digital touch sensor and REV Robotics Color Distance sensor.
* User selectable color themes.
* Includes many minor enhancements and fixes (too numerous to list).
@ -919,7 +1089,7 @@ Changes include:
- Added VuforiaTrackableDefaultListener.getPose and Vuforia.trackPose blocks.
- Added optimized blocks support for Vuforia extended tracking.
- Added atan2 block to the math category.
- Added useCompetitionFieldTargetLocations parameter to Vuforia.initialize block. If set to false, the target locations are placed at (0,0,0) with target orientation as specified in https://github.com/gearsincorg/FTCVuforiaDemo/blob/master/Robot_Navigation.java tutorial OpMode.
* Incorporates additional improvements to USB comm layer to improve system resiliency (to recover from a greater number of communication disruptions).
**************************************************************************************
@ -929,7 +1099,7 @@ Additional Notes Regarding Version 3.00 (built on 17.04.13)
In addition to the release changes listed below (see section labeled "Version 3.00 (built on 17.04.013)"), version 3.00 has the following important changes:
1. Version 3.00 software uses a new version of the FTC Robocol (robot protocol). If you upgrade to v3.0 on the Robot Controller and/or Android Studio side, you must also upgrade the Driver Station software to match the new Robocol.
2. Version 3.00 software removes the setMaxSpeed and getMaxSpeed methods from the DcMotor class. If you have an OpMode that formerly used these methods, you will need to remove the references/calls to these methods. Instead, v3.0 provides the max speed information through the use of motor profiles that are selected by the user during robot configuration (see the sketch after this list).
3. Version 3.00 software currently does not have a mechanism to disable extra i2c sensors. We hope to re-introduce this function with a release in the near future.
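A possible shape of that migration, as a sketch (the motor name "left_drive" and the surrounding OpMode are illustrative, not from the release):

```java
package org.firstinspires.ftc.teamcode;

import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.hardware.DcMotor;

// Sketch of the v3.00 migration: setMaxSpeed()/getMaxSpeed() are gone; the
// maximum speed now comes from the motor profile chosen during robot
// configuration, and RUN_USING_ENCODER runs closed-loop against that profile.
public class MaxSpeedMigrationSketch extends LinearOpMode {
    @Override
    public void runOpMode() {
        DcMotor motor = hardwareMap.get(DcMotor.class, "left_drive"); // illustrative name
        motor.setMode(DcMotor.RunMode.RUN_USING_ENCODER);
        waitForStart();
        motor.setPower(0.5); // a fraction of the profile's maximum speed
        while (opModeIsActive()) {
            idle();
        }
    }
}
```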
**************************************************************************************
@ -957,7 +1127,7 @@ Changes include:
- Addition of optimized Vuforia blocks.
- Auto scrollbar to projects and sounds pages.
- Fixed blocks paste bug.
- Blocks execute after while-opModeIsActive loop (to allow for cleanup before exiting OpMode).
- Added gyro integratedZValue block.
- Fixes bug with projects page for Firefox browser.
- Added IsSpeaking block to AndroidTextToSpeech.
@ -1051,8 +1221,8 @@ Changes include:
- Updated to latest blockly.
- Added default variable blocks to navigation and matrix blocks.
- Fixed toolbox entry for openGLMatrix_rotation_withAxesArgs.
- When user downloads Blocks-generated OpMode, only the .blk file is downloaded.
- When user uploads Blocks-generated OpMode (.blk file), Javascript code is auto generated.
- Added DbgLog support.
- Added logging when a blocks file is read/written.
- Fixed bug to properly render blocks even if missing devices from configuration file.
@ -1072,7 +1242,7 @@ Changes include:
## Version 2.30 (released on 16.10.05)
* Blockly programming mode:
- Mechanism added to save Blockly OpModes from Programming Mode Server onto local device
- To avoid clutter, blocks are displayed in categorized folders
- Added support for DigitalChannel
- Added support for ModernRoboticsI2cCompassSensor
@ -1085,12 +1255,12 @@ Changes include:
- Fix deadlock / make camera data available while Vuforia is running.
- Update to Vuforia 6.0.117 (recommended by Vuforia and Google to close security loophole).
* Fix for autonomous 30 second timer bug (where timer was in effect, even though it appeared to have timed out).
* opModeIsActive changes to allow cleanup after OpMode is stopped (with enforced 2 second safety timeout).
* Fix to avoid reading i2c twice.
* Updated sample OpModes.
* Improved logging and fixed intermittent freezing.
* Added digital I/O sample.
* Cleaned up device names in sample OpModes to be consistent with Pushbot guide.
* Fix to allow use of IrSeekerSensorV3.
## Version 2.20 (released on 16.09.08)
@ -1152,7 +1322,7 @@ Changes include:
* Improvements made to fix resiliency and responsiveness of the system.
* For LinearOpMode the user now must call telemetry.update() to update the telemetry data on the driver station. This update() mechanism ensures that the driver station gets the updated data properly and at the same time (see the sketch after this list).
* The Auto Configure function of the Robot Controller is now template based. If there is a commonly used robot configuration, a template can be created so that the Auto Configure mechanism can be used to quickly configure a robot of this type.
* The logic to detect a runaway OpMode (both in the LinearOpMode and OpMode types) and to abort the run, then auto recover has been improved/implemented.
* Fix has been incorporated so that Logitech F310 gamepad mappings will be correct for Marshmallow users.
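For reference, a minimal sketch of that telemetry.update() requirement (class name is illustrative; registration annotations are omitted):

```java
package org.firstinspires.ftc.teamcode;

import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;

// Sketch: in a LinearOpMode, telemetry only reaches the Driver Station once
// telemetry.update() is called, so call it once per loop pass.
public class TelemetryUpdateSketch extends LinearOpMode {
    @Override
    public void runOpMode() {
        waitForStart();
        while (opModeIsActive()) {
            telemetry.addData("Status", "running");
            telemetry.update(); // without this, the addData() lines never appear
        }
    }
}
```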
## Release 16.07.08
@ -1181,9 +1351,9 @@ Changes include:
* Firmware version information for Modern Robotics modules are now logged.
* Changes made to improve USB comm reliability and robustness.
* Added support for voltage indicator for legacy (NXT-compatible) motor controllers.
* Changes made to provide auto stop capabilities for OpModes.
- A LinearOpMode class will stop when the statements in runOpMode() are complete. User does not have to push the stop button on the driver station.
- If an OpMode is stopped by the driver station, but there is a run away/uninterruptible thread persisting, the app will log an error message then force itself to crash to stop the runaway thread.
* Driver Station UI modified to display lowest measured voltage below current voltage (12V battery).
* Driver Station UI modified to have color background for current voltage (green=good, yellow=caution, red=danger, extremely low voltage).
* javadoc improved (edits and additional classes).
@ -1203,11 +1373,11 @@ Changes include:
* Improved battery checker feature so that voltage values get refreshed regularly (every 250 msec) on Driver Station (DS) user interface.
* Improved software so that Robot Controller (RC) is much more resilient and “self-healing” to USB disconnects:
- If user attempts to start/restart RC with one or more module missing, it will display a warning but still start up.
- When running an OpMode, if one or more modules gets disconnected, the RC & DS will display warnings, and the robot will keep on working in spite of the missing module(s).
- If a disconnected module gets physically reconnected the RC will auto detect the module and the user will regain control of the recently connected module.
- Warning messages are more helpful (identifies the type of module that's missing plus its USB serial number).
* Code changes to fix the null gamepad reference when users try to reference the gamepads in the init() portion of their OpMode.
* NXT light sensor output is now properly scaled. Note that teams might have to readjust their light threshold values in their OpModes.
* On DS user interface, gamepad icon for a driver will disappear if the matching gamepad is disconnected or if that gamepad gets designated as a different driver.
* Robot Protocol (ROBOCOL) version number info is displayed in About screen on RC and DS apps.
* Incorporated a display filter on pairing screen to filter out devices that don't use the "<TEAM NUMBER>-" format. This filter can be turned off to show all Wi-Fi Direct devices.
@ -1229,7 +1399,7 @@ Changes include:
* Fix for Telemetry also has off-by-one errors in its data string sizing / short size limitations error
* User telemetry output is sorted.
* added formatting variants to DbgLog and RobotLog APIs
* code modified to allow for a long list of OpMode names.
* changes to improve thread safety of RobocolDatagramSocket
* Fix for "missing hardware leaves robot controller disconnected from driver station" error
* fix for "fast tapping of Init/Start causes problems" (toast is now only instantiated on UI thread).
@ -1256,12 +1426,12 @@ Changes include:
- The GyroSensor class now supports the MR Gyro Sensor.
- Users can access heading data (about Z axis)
- Users can also access raw gyro data (X, Y, & Z axes).
- Example MRGyroTest.java OpMode included.
* Improved error messages
- More descriptive error messages for exceptions in user code.
* Updated DcMotor API
* Enable read mode on new address in setI2cAddress
* Fix so that driver station app resets the gamepads when switching OpModes.
* USB-related code changes to make USB comm more responsive and to display more explicit error messages.
- Fix so that USB will recover properly if the USB bus returns garbage data.
- Fix USB initialization race condition.
@ -1276,13 +1446,13 @@ Changes include:
* Updated HT color sensor driver to use constants from ModernRoboticsUsbLegacyModule class.
* Updated MR color sensor driver to use constants from ModernRoboticsUsbDeviceInterfaceModule class.
* Correctly handle I2C Address change in all color sensors
* Updated/cleaned up OpModes.
- Updated comments in LinearI2cAddressChange.java example OpMode.
- Replaced the calls to "setChannelMode" with "setMode" (to match the new of the DcMotor method).
- Removed K9AutoTime.java OpMode.
- Added MRGyroTest.java OpMode (demonstrates how to use MR Gyro Sensor).
- Added MRRGBExample.java OpMode (demonstrates how to use MR Color Sensor).
- Added HTRGBExample.java OpMode (demonstrates how to use HT legacy color sensor).
- Added MatrixControllerDemo.java (demonstrates how to use legacy Matrix controller).
* Updated javadoc documentation.
* Updated release .apk files for Robot Controller and Driver Station apps.
@ -1301,11 +1471,11 @@ Changes include:
* An init() method is added to the OpMode class.
- For this release, init() is triggered right before the start() method.
- Eventually, the init() method will be triggered when the user presses an "INIT" button on driver station.
- The init() and loop() methods are now required (i.e., need to be overridden in the user's OpMode); see the sketch after this list.
- The start() and stop() methods are optional.
* A new LinearOpMode class is introduced.
- Teams can use the LinearOpMode mode to create a linear (not event driven) program model.
- Teams can use blocking statements like Thread.sleep() within a linear OpMode.
* The API for the Legacy Module and Core Device Interface Module have been updated.
- Support for encoders with the Legacy Module is now working.
* The hardware loop has been updated for better performance.
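A minimal sketch of the required overrides in the event-driven model (class name is illustrative; this is not a sample from the release):

```java
package org.firstinspires.ftc.teamcode;

import com.qualcomm.robotcore.eventloop.opmode.OpMode;

// Sketch: init() and loop() must be overridden; start() and stop() stay optional.
public class IterativeSketch extends OpMode {
    @Override
    public void init() {
        telemetry.addData("Status", "initialized");
    }

    @Override
    public void loop() {
        telemetry.addData("Status", "looping");
    }
}
```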

View File

@ -23,7 +23,17 @@ android {
}
}
repositories {
maven {
url = 'https://maven.brott.dev/'
}
}
dependencies {
implementation project(':FtcRobotController')
annotationProcessor files('lib/OpModeAnnotationProcessor.jar')
implementation "com.acmerobotics.roadrunner:ftc:0.1.12"
implementation "com.acmerobotics.roadrunner:core:1.0.0-beta8"
implementation "com.acmerobotics.roadrunner:actions:1.0.0-beta8"
implementation "com.acmerobotics.dashboard:dashboard:0.4.14"
}

View File

@ -0,0 +1,78 @@
package org.firstinspires.ftc.teamcode;
import android.util.Size;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import org.firstinspires.ftc.robotcore.external.hardware.camera.BuiltinCameraDirection;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.firstinspires.ftc.vision.VisionPortal;
import java.util.Locale;
@TeleOp(name = "CometBots Camera Demo (HDMI)", group = "[14493] CometBots")
public class CometBotsCameraDemo extends LinearOpMode {
final boolean USING_WEBCAM = true;
final BuiltinCameraDirection INTERNAL_CAM_DIR = BuiltinCameraDirection.FRONT;
final int RESOLUTION_WIDTH = 640;
final int RESOLUTION_HEIGHT = 480;
// Internal state
boolean lastX;
int frameCount;
long capReqTime;
@Override
public void runOpMode() throws InterruptedException {
telemetry.speak("Development - Camera Test");
VisionPortal portal;
if (USING_WEBCAM)
{
portal = new VisionPortal.Builder()
.setCamera(hardwareMap.get(WebcamName.class, "watch out"))
.setCameraResolution(new Size(RESOLUTION_WIDTH, RESOLUTION_HEIGHT))
.build();
}
else
{
portal = new VisionPortal.Builder()
.setCamera(INTERNAL_CAM_DIR)
.setCameraResolution(new Size(RESOLUTION_WIDTH, RESOLUTION_HEIGHT))
.build();
}
while (!isStopRequested())
{
boolean x = gamepad1.x;
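// Rising-edge check: capture a frame once per X press, not every loop iteration.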
if (x && !lastX)
{
portal.saveNextFrameRaw(String.format(Locale.US, "CameraFrameCapture-%06d", frameCount++));
capReqTime = System.currentTimeMillis();
}
lastX = x;
telemetry.addLine("######## Camera Capture Utility ########");
telemetry.addLine(String.format(Locale.US, " > Resolution: %dx%d", RESOLUTION_WIDTH, RESOLUTION_HEIGHT));
telemetry.addLine(" > Press X (or Square) to capture a frame");
telemetry.addData(" > Camera Status", portal.getCameraState());
if (capReqTime != 0)
{
telemetry.addLine("\nCaptured Frame!");
}
if (capReqTime != 0 && System.currentTimeMillis() - capReqTime > 1000)
{
capReqTime = 0;
}
telemetry.update();
}
}
}

View File

@ -0,0 +1,72 @@
package org.firstinspires.ftc.teamcode;
import com.qualcomm.hardware.rev.RevHubOrientationOnRobot;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.IMU;
import org.firstinspires.ftc.robotcore.external.navigation.AngleUnit;
import org.firstinspires.ftc.robotcore.external.navigation.AngularVelocity;
import org.firstinspires.ftc.robotcore.external.navigation.YawPitchRollAngles;
@TeleOp(name = "CometBots IMU Demo", group = "[14493] CometBots")
public class CometBotsIMUDemo extends LinearOpMode {
IMU imu;
@Override
public void runOpMode() throws InterruptedException {
// Retrieve and initialize the IMU.
// This sample expects the IMU to be in a REV Hub and named "imu".
imu = hardwareMap.get(IMU.class, "imu");
/* Define how the hub is mounted on the robot to get the correct Yaw, Pitch and Roll values.
*
* Two input parameters are required to fully specify the Orientation.
* The first parameter specifies the direction the printed logo on the Hub is pointing.
* The second parameter specifies the direction the USB connector on the Hub is pointing.
* All directions are relative to the robot, and left/right is as-viewed from behind the robot.
*/
/* The next two lines define Hub orientation.
* The Default Orientation (shown) is when a hub is mounted horizontally with the printed logo pointing UP and the USB port pointing FORWARD.
*
* To Do: EDIT these two lines to match YOUR mounting configuration.
*/
RevHubOrientationOnRobot.LogoFacingDirection logoDirection = RevHubOrientationOnRobot.LogoFacingDirection.UP;
RevHubOrientationOnRobot.UsbFacingDirection usbDirection = RevHubOrientationOnRobot.UsbFacingDirection.FORWARD;
RevHubOrientationOnRobot orientationOnRobot = new RevHubOrientationOnRobot(logoDirection, usbDirection);
// Now initialize the IMU with this mounting orientation
// Note: if you choose two conflicting directions, this initialization will cause a code exception.
imu.initialize(new IMU.Parameters(orientationOnRobot));
// Loop and update the dashboard
while (!isStopRequested()) {
telemetry.addData("Hub orientation", "Logo=%s USB=%s\n ", logoDirection, usbDirection);
// Check to see if heading reset is requested
if (gamepad1.y) {
telemetry.addData("Yaw", "Resetting\n");
imu.resetYaw();
} else {
telemetry.addData("Yaw", "Press Y (triangle) on Gamepad to reset\n");
}
// Retrieve Rotational Angles and Velocities
YawPitchRollAngles orientation = imu.getRobotYawPitchRollAngles();
AngularVelocity angularVelocity = imu.getRobotAngularVelocity(AngleUnit.DEGREES);
telemetry.addData("Yaw (Z)", "%.2f Deg. (Heading)", orientation.getYaw(AngleUnit.DEGREES));
telemetry.addData("Pitch (X)", "%.2f Deg.", orientation.getPitch(AngleUnit.DEGREES));
telemetry.addData("Roll (Y)", "%.2f Deg.\n", orientation.getRoll(AngleUnit.DEGREES));
telemetry.addData("Yaw (Z) velocity", "%.2f Deg/Sec", angularVelocity.zRotationRate);
telemetry.addData("Pitch (X) velocity", "%.2f Deg/Sec", angularVelocity.xRotationRate);
telemetry.addData("Roll (Y) velocity", "%.2f Deg/Sec", angularVelocity.yRotationRate);
telemetry.update();
}
}
}

View File

@ -0,0 +1,189 @@
/* Copyright (c) 2021 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.teamcode;
import static org.firstinspires.ftc.teamcode.cometbots.Constants.*;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.hardware.DcMotorSimple;
import com.qualcomm.robotcore.hardware.Gamepad;
import com.qualcomm.robotcore.util.ElapsedTime;
import org.firstinspires.ftc.teamcode.cometbots.MovementLibrary;
/*
* This file contains an example of a Linear "OpMode".
* An OpMode is a 'program' that runs in either the autonomous or the teleop period of an FTC match.
* The names of OpModes appear on the menu of the FTC Driver Station.
* When a selection is made from the menu, the corresponding OpMode is executed.
*
* This particular OpMode illustrates driving a 4-motor Omni-Directional (or Holonomic) robot.
* This code will work with either a Mecanum-Drive or an X-Drive train.
* Both of these drives are illustrated at https://gm0.org/en/latest/docs/robot-design/drivetrains/holonomic.html
* Note that a Mecanum drive must display an X roller-pattern when viewed from above.
*
* Also note that it is critical to set the correct rotation direction for each motor. See details below.
*
* Holonomic drives provide the ability for the robot to move in three axes (directions) simultaneously.
* Each motion axis is controlled by one Joystick axis.
*
* 1) Axial: Driving forward and backward Left-joystick Forward/Backward
* 2) Lateral: Strafing right and left Left-joystick Right and Left
* 3) Yaw: Rotating Clockwise and counter clockwise Right-joystick Right and Left
*
* This code is written assuming that the right-side motors need to be reversed for the robot to drive forward.
* When you first test your robot, if it moves backward when you push the left stick forward, then you must flip
* the direction of all 4 motors (see code below).
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*/
@TeleOp(name="CometBots Linear OpMode Demo", group = "[14493] CometBots")
public class CometBotsLinearOpMode extends LinearOpMode {
// Declare OpMode members for each of the 4 motors.
private ElapsedTime runtime = new ElapsedTime();
private DcMotor rhw = null;
private DcMotor rlw = null;
private DcMotor lhw = null;
private DcMotor llw = null;
@Override
public void runOpMode() {
// Speed divider, adjusted with the bumpers below (1 = full power)
int cntBtnA = 1;
// Initialize the hardware variables. Note that the strings used here must correspond
// to the names assigned during the robot configuration step on the DS or RC devices.
rhw = hardwareMap.get(DcMotor.class, RT_FRONT);
rlw = hardwareMap.get(DcMotor.class, RT_BACK);
lhw = hardwareMap.get(DcMotor.class, LT_FRONT);
llw = hardwareMap.get(DcMotor.class, LT_BACK);
/*
Put all motors in an array for easier handling
NOTE: Placement of motor in array is important especially when dealing with functions
that set direction
*/
DcMotor[] motors = {rhw, rlw, lhw, llw};
/*
Initialize all wheels forward using motors array
*/
MovementLibrary.setDirectionForward(motors);
telemetry.addData("Status", "Initialized");
telemetry.update();
waitForStart();
runtime.reset();
// run until the end of the match (driver presses STOP)
while (opModeIsActive()) {
double max;
// POV Mode uses left joystick to go forward & strafe, and right joystick to rotate.
double axial = -gamepad1.left_stick_y; // Note: pushing stick forward gives negative value
double lateral = gamepad1.left_stick_x;
double yaw = gamepad1.right_stick_x;
// Combine the joystick requests for each axis-motion to determine each wheel's power.
// Set up a variable for each drive wheel to save the power level for telemetry.
double lftFrntPwr = axial + lateral + yaw;
double rtFrntPwr = axial - lateral - yaw;
double ltBckPwr = axial - lateral + yaw;
double rtBckPwr = axial + lateral - yaw;
// Normalize the values so no wheel power exceeds 100%
// This ensures that the robot maintains the desired motion.
max = Math.max(Math.abs(lftFrntPwr), Math.abs(rtFrntPwr));
max = Math.max(max, Math.abs(ltBckPwr));
max = Math.max(max, Math.abs(rtBckPwr));
if (max > 1.0) {
lftFrntPwr /= max;
rtFrntPwr /= max;
ltBckPwr /= max;
rtBckPwr /= max;
}
// This is test code:
//
// Uncomment the following code to test your motor directions.
// Each button should make the corresponding motor run FORWARD.
// 1) First get all the motors to take to correct positions on the robot
// by adjusting your Robot Configuration if necessary.
// 2) Then make sure they run in the correct direction by modifying the
// the setDirection() calls above.
// Once the correct motors move in the correct direction re-comment this code.
/*
lftFrntPwr = gamepad1.x ? 1.0 : 0.0; // X gamepad
ltBckPwr = gamepad1.a ? 1.0 : 0.0; // A gamepad
rtFrntPwr = gamepad1.y ? 1.0 : 0.0; // Y gamepad
rtBckPwr = gamepad1.b ? 1.0 : 0.0; // B gamepad
*/
// Send calculated power to wheels, scaled down by the speed divider
lhw.setPower(lftFrntPwr/cntBtnA);
rhw.setPower(rtFrntPwr/cntBtnA);
llw.setPower(ltBckPwr/cntBtnA);
rlw.setPower(rtBckPwr/cntBtnA);
// Left bumper lowers the divider (faster), right bumper raises it (slower).
// The short sleep debounces the press; clamp at 1 so we never divide by zero.
if(gamepad1.left_bumper) {
sleep(175);
cntBtnA--;
if(cntBtnA < 1) {
cntBtnA = 1;
}
}
if(gamepad1.right_bumper) {
sleep(175);
cntBtnA++;
if(cntBtnA > 4) {
cntBtnA = 4;
}
}
// Show the elapsed game time and wheel power.
telemetry.addData("Status", "Run Time: " + runtime.toString());
telemetry.addData("Front left/Right", "%4.2f, %4.2f", lftFrntPwr, rtFrntPwr);
telemetry.addData("Back left/Right", "%4.2f, %4.2f", ltBckPwr, rtBckPwr);
telemetry.addData("Power Mode", cntBtnA);
telemetry.update();
}
}}

View File

@ -0,0 +1,132 @@
/* Copyright (c) 2021 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.teamcode;
import static org.firstinspires.ftc.teamcode.cometbots.Constants.LT_BACK;
import static org.firstinspires.ftc.teamcode.cometbots.Constants.LT_FRONT;
import static org.firstinspires.ftc.teamcode.cometbots.Constants.RT_BACK;
import static org.firstinspires.ftc.teamcode.cometbots.Constants.RT_FRONT;
import com.acmerobotics.roadrunner.Pose2d;
import com.acmerobotics.roadrunner.PoseVelocity2d;
import com.acmerobotics.roadrunner.Vector2d;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.util.ElapsedTime;
import org.firstinspires.ftc.teamcode.cometbots.MovementLibrary;
/*
* This file contains an example of a Linear "OpMode".
* An OpMode is a 'program' that runs in either the autonomous or the teleop period of an FTC match.
* The names of OpModes appear on the menu of the FTC Driver Station.
* When a selection is made from the menu, the corresponding OpMode is executed.
*
* This particular OpMode illustrates driving a 4-motor Omni-Directional (or Holonomic) robot.
* This code will work with either a Mecanum-Drive or an X-Drive train.
* Both of these drives are illustrated at https://gm0.org/en/latest/docs/robot-design/drivetrains/holonomic.html
* Note that a Mecanum drive must display an X roller-pattern when viewed from above.
*
* Also note that it is critical to set the correct rotation direction for each motor. See details below.
*
* Holonomic drives provide the ability for the robot to move in three axes (directions) simultaneously.
* Each motion axis is controlled by one Joystick axis.
*
* 1) Axial: Driving forward and backward Left-joystick Forward/Backward
* 2) Lateral: Strafing right and left Left-joystick Right and Left
* 3) Yaw: Rotating Clockwise and counter clockwise Right-joystick Right and Left
*
* This code is written assuming that the right-side motors need to be reversed for the robot to drive forward.
* When you first test your robot, if it moves backward when you push the left stick forward, then you must flip
* the direction of all 4 motors (see code below).
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list
*/
@TeleOp(name="CometBots Mecanum Drive Demo", group = "[14493] CometBots")
public class CometBotsMecanumDriveMode extends LinearOpMode {
// Declare OpMode members for each of the 4 motors.
private ElapsedTime runtime = new ElapsedTime();
private DcMotor rhw = null;
private DcMotor rlw = null;
private DcMotor lhw = null;
private DcMotor llw = null;
@Override
public void runOpMode() {
MecanumDrive drive = new MecanumDrive(hardwareMap, new Pose2d(0,0,Math.toRadians(90)));
// Button A counter to cycle speeds
int cntBtnA = 1;
// Initialize the hardware variables. Note that the strings used here must correspond
// to the names assigned during the robot configuration step on the DS or RC devices.
rhw = hardwareMap.get(DcMotor.class, RT_FRONT);
rlw = hardwareMap.get(DcMotor.class, RT_BACK);
lhw = hardwareMap.get(DcMotor.class, LT_FRONT);
llw = hardwareMap.get(DcMotor.class, LT_BACK);
/*
Put all motors in an array for easier handling
NOTE: Placement of motor in array is important especially when dealing with functions
that set direction
*/
DcMotor[] motors = {rhw, rlw, lhw, llw};
/*
Initialize all wheels forward using motors array
*/
MovementLibrary.setDirectionForward(motors);
telemetry.addData("Status", "Initialized");
telemetry.update();
waitForStart();
runtime.reset();
// run until the end of the match (driver presses STOP)
while (opModeIsActive()) {
// Left stick drives and strafes, right stick rotates (matching the comments
// above); pushing a stick forward reads negative, hence the sign flips.
drive.setDrivePowers(
new PoseVelocity2d(
new Vector2d(
-gamepad1.left_stick_y,
-gamepad1.left_stick_x
),
-gamepad1.right_stick_x
)
);
drive.updatePoseEstimate();
}
}}

View File

@ -0,0 +1,46 @@
package org.firstinspires.ftc.teamcode;
import com.qualcomm.robotcore.eventloop.opmode.Autonomous;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.openftc.easyopencv.OpenCvCamera;
import org.openftc.easyopencv.OpenCvCameraFactory;
import org.openftc.easyopencv.OpenCvCameraRotation;
@Disabled
@Autonomous(name="DevApp - OpenCV Demo", group="[14493] CometBots")
public class CometBotsOpenCVDemo extends LinearOpMode {
private OpenCvCamera controlHubCam; // Use OpenCvCamera class from FTC SDK
private static final int CAMERA_WIDTH = 1280; // width of wanted camera resolution
private static final int CAMERA_HEIGHT = 720; // height of wanted camera resolution
// Calculate the distance using the formula
public static final double objectWidthInRealWorldUnits = 3.75; // Replace with the actual width of the object in real-world units
public static final double focalLength = 728; // Replace with the focal length of the camera in pixels
private void initOpenCV() {
// Create an instance of the camera
int cameraMonitorViewId = hardwareMap.appContext.getResources().getIdentifier(
"cameraMonitorViewId", "id", hardwareMap.appContext.getPackageName());
// Use OpenCvCameraFactory class from FTC SDK to create camera instance
controlHubCam = OpenCvCameraFactory.getInstance().createWebcam(
hardwareMap.get(WebcamName.class, "watch out"), cameraMonitorViewId);
// controlHubCam.setPipeline(new YellowBlobDetectionPipeline());
controlHubCam.openCameraDevice();
controlHubCam.startStreaming(CAMERA_WIDTH, CAMERA_HEIGHT, OpenCvCameraRotation.UPRIGHT);
}
@Override
public void runOpMode() throws InterruptedException {
// Start the camera stream, then idle until the driver presses STOP.
initOpenCV();
waitForStart();
while (opModeIsActive()) {
idle();
}
controlHubCam.stopStreaming();
}
}

View File

@ -0,0 +1,100 @@
package org.firstinspires.ftc.teamcode;
import static org.firstinspires.ftc.teamcode.cometbots.Constants.*;
import com.qualcomm.robotcore.eventloop.opmode.Autonomous;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.hardware.ColorSensor;
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.hardware.DistanceSensor;
import com.qualcomm.robotcore.hardware.TouchSensor;
import org.firstinspires.ftc.robotcore.external.navigation.DistanceUnit;
import org.firstinspires.ftc.teamcode.cometbots.MovementLibrary;
@Disabled
@Autonomous(name = "Development Class", group = "[14493] CometBots")
public class CometBotsProject extends LinearOpMode {
@Override
public void runOpMode() throws InterruptedException {
DcMotor rhw = hardwareMap.get(DcMotor.class, "right hand wheel");
DcMotor rlw = hardwareMap.get(DcMotor.class, "right leg wheel");
DcMotor lhw = hardwareMap.get(DcMotor.class, "left hand wheel");
DcMotor llw = hardwareMap.get(DcMotor.class, "left leg wheel");
TouchSensor touchSensor = hardwareMap.get(TouchSensor.class, "touchsensor");
DistanceSensor distanceSensor = hardwareMap.get(DistanceSensor.class, "distancesensor");
ColorSensor colorSensor = hardwareMap.get(ColorSensor.class, "colorsensor");
/*
Put all motors in an array for easier handling
NOTE: Placement of motor in array is important
especially when dealing with functions that
set direction
*/
DcMotor[] motors = {rhw, rlw, lhw, llw};
/*
Initialize all wheels forward using motors array
*/
// MovementLibrary.setDirectionForward(motors);
MovementLibrary.setLeftSideBackward(motors);
MovementLibrary.setRightSideForward(motors);
/*
Capture color readout from color sensor
*/
telemetry.addData("Color Sensor [Red]", colorSensor.red());
telemetry.addData("Color Sensor [Green]", colorSensor.green());
telemetry.addData("Color Sensor [Blue]", colorSensor.blue());
telemetry.addData("Color Sensor [Alpha]", colorSensor.alpha());
/*
Capture distance readout from distance sensor
*/
telemetry.addData("Distance Sensor [inches]", distanceSensor.getDistance(DistanceUnit.INCH));
/*
Capture state when touch sensor is pressed
*/
if (touchSensor.isPressed()) {
telemetry.speak("Touch sensor is pressed");
telemetry.addData("Touch sensor pressed?", touchSensor.isPressed());
}
telemetry.update();
/*
Wait for the driver to hit START in the driver hub
*/
waitForStart();
while (opModeIsActive()) {
/*
Initialize wheel direction with base speed
*/
MovementLibrary.setSpeed(motors, CB_BASE_SPEED);
/*
Distance to the nearest object in inches (a range reading, not distance traveled)
*/
double distanceInInches = distanceSensor.getDistance(DistanceUnit.INCH);
telemetry.addData("Distance to object (inches)", "%.2f", distanceInInches);
/*
Speed up when going over an object that is blue-ish / green
*/
if (colorSensor.blue() > 800 && colorSensor.green() > 1100) {
telemetry.speak("Green speed boost");
MovementLibrary.setSpeed(motors, CB_BASE_SPEED * 2.0);
} else {
MovementLibrary.setSpeed(motors, CB_BASE_SPEED);
}
}
}
}

View File

@ -0,0 +1,111 @@
package org.firstinspires.ftc.teamcode;
import com.acmerobotics.dashboard.FtcDashboard;
import com.qualcomm.robotcore.eventloop.opmode.Autonomous;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.hardware.ColorSensor;
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.hardware.DistanceSensor;
import com.qualcomm.robotcore.hardware.TouchSensor;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.firstinspires.ftc.robotcore.external.navigation.DistanceUnit;
import org.firstinspires.ftc.teamcode.cometbots.MovementLibrary;
import org.openftc.easyopencv.OpenCvCamera;
import org.openftc.easyopencv.OpenCvCameraFactory;
import org.openftc.easyopencv.OpenCvCameraRotation;
@Autonomous(name = "CometBot Sensor Demo", group = "[14493] CometBots")
public class CometBotsSensorDemo extends LinearOpMode {
private OpenCvCamera controlHubCam; // Use OpenCvCamera class from FTC SDK
private static final int CAMERA_WIDTH = 1280; // width of wanted camera resolution
private static final int CAMERA_HEIGHT = 720; // height of wanted camera resolution
// Calculate the distance using the formula
public static final double objectWidthInRealWorldUnits = 3.75; // Replace with the actual width of the object in real-world units
public static final double focalLength = 728; // Replace with the focal length of the camera in pixels
private DcMotor rhw = null;
private DcMotor rlw = null;
private DcMotor lhw = null;
private DcMotor llw = null;
@Override
public void runOpMode() throws InterruptedException {
// Initialize the hardware variables. Note that the strings used here must correspond
// to the names assigned during the robot configuration step on the DS or RC devices.
rhw = hardwareMap.get(DcMotor.class, "right hand wheel");
rlw = hardwareMap.get(DcMotor.class, "right leg wheel");
lhw = hardwareMap.get(DcMotor.class, "left hand wheel");
llw = hardwareMap.get(DcMotor.class, "left leg wheel");
/*
Put all motors in an array for easier handling
NOTE: Placement of motor in array is important especially when dealing with functions
that set direction
*/
DcMotor[] motors = {rhw, rlw, lhw, llw};
// Create an instance of the camera
int cameraMonitorViewId = hardwareMap.appContext.getResources().getIdentifier(
"cameraMonitorViewId", "id", hardwareMap.appContext.getPackageName());
// Use OpenCvCameraFactory class from FTC SDK to create camera instance
controlHubCam = OpenCvCameraFactory.getInstance().createWebcam(
hardwareMap.get(WebcamName.class, "watch out"), cameraMonitorViewId);
TouchSensor touchSensor = hardwareMap.get(TouchSensor.class, "touchsensor");
DistanceSensor distanceSensor = hardwareMap.get(DistanceSensor.class, "distancesensor");
ColorSensor colorSensor = hardwareMap.get(ColorSensor.class, "colorsensor");
controlHubCam.openCameraDevice();
controlHubCam.startStreaming(CAMERA_WIDTH, CAMERA_HEIGHT, OpenCvCameraRotation.UPRIGHT);
FtcDashboard.getInstance().startCameraStream(controlHubCam, 0);
/*
Wait for the driver to hit START in the driver hub
*/
waitForStart();
while (opModeIsActive()) {
/*
Capture color readout from color sensor
*/
telemetry.addData("Color Sensor [Red]", colorSensor.red());
telemetry.addData("Color Sensor [Green]", colorSensor.green());
telemetry.addData("Color Sensor [Blue]", colorSensor.blue());
telemetry.addData("Color Sensor [Alpha]", colorSensor.alpha());
/*
Capture distance readout from distance sensor
*/
telemetry.addData("Distance Sensor [inches]", distanceSensor.getDistance(DistanceUnit.INCH));
/*
Capture state when touch sensor is pressed
*/
telemetry.addData("Touch sensor pressed?", touchSensor.isPressed());
if (touchSensor.isPressed()) {
telemetry.speak("Touch sensor is pressed");
}
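/*
Obstacle avoidance: if an object is within 3 inches, announce it, strafe right at quarter speed for 1.5 seconds, then stop
*/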
if (distanceSensor.getDistance(DistanceUnit.INCH) < 3) {
telemetry.speak("Warning, unknown object near robot");
MovementLibrary.setDirectionRight(motors);
MovementLibrary.setSpeed(motors,0.25);
sleep(1500);
MovementLibrary.setSpeed(motors,0);
}
/*
Update stats continually
*/
telemetry.update();
}
}
}

View File

@ -0,0 +1,141 @@
/* Copyright (c) 2019 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.teamcode;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import org.firstinspires.ftc.robotcore.external.hardware.camera.BuiltinCameraDirection;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.firstinspires.ftc.robotcore.external.tfod.Recognition;
import org.firstinspires.ftc.vision.VisionPortal;
import org.firstinspires.ftc.vision.tfod.TfodProcessor;
import java.util.List;
/*
* This OpMode illustrates the basics of TensorFlow Object Detection, using
* the easiest way.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*/
@TeleOp(name = "CometBots TensorFlow Demo (HDMI)", group = "[14493] CometBots")
public class CometBotsTensorFlowDemo extends LinearOpMode {
private static final boolean USE_WEBCAM = true; // true for webcam, false for phone camera
/**
* The variable to store our instance of the TensorFlow Object Detection processor.
*/
private TfodProcessor tfod;
/**
* The variable to store our instance of the vision portal.
*/
private VisionPortal visionPortal;
@Override
public void runOpMode() {
initTfod();
// Wait for the DS start button to be touched.
telemetry.addData("DS preview on/off", "3 dots, Camera Stream");
telemetry.addData(">", "Touch Play to start OpMode");
telemetry.update();
waitForStart();
if (opModeIsActive()) {
while (opModeIsActive()) {
telemetryTfod();
// Push telemetry to the Driver Station.
telemetry.update();
// Save CPU resources; can resume streaming when needed.
if (gamepad1.dpad_down) {
visionPortal.stopStreaming();
} else if (gamepad1.dpad_up) {
visionPortal.resumeStreaming();
}
// Share the CPU.
sleep(20);
}
}
// Save more CPU resources when camera is no longer needed.
visionPortal.close();
} // end runOpMode()
/**
* Initialize the TensorFlow Object Detection processor.
*/
private void initTfod() {
// Create the TensorFlow processor the easy way.
tfod = TfodProcessor.easyCreateWithDefaults();
// Create the vision portal the easy way.
if (USE_WEBCAM) {
visionPortal = VisionPortal.easyCreateWithDefaults(
hardwareMap.get(WebcamName.class, "watch out"), tfod);
} else {
visionPortal = VisionPortal.easyCreateWithDefaults(
BuiltinCameraDirection.BACK, tfod);
}
} // end method initTfod()
/**
* Add telemetry about TensorFlow Object Detection (TFOD) recognitions.
*/
private void telemetryTfod() {
List<Recognition> currentRecognitions = tfod.getRecognitions();
telemetry.addData("# Objects Detected", currentRecognitions.size());
// Step through the list of recognitions and display info for each one.
for (Recognition recognition : currentRecognitions) {
double x = (recognition.getLeft() + recognition.getRight()) / 2 ;
double y = (recognition.getTop() + recognition.getBottom()) / 2 ;
telemetry.addData(""," ");
telemetry.addData("Image", "%s (%.0f %% Conf.)", recognition.getLabel(), recognition.getConfidence() * 100);
telemetry.addData("- Position", "%.0f / %.0f", x, y);
telemetry.addData("- Size", "%.0f x %.0f", recognition.getWidth(), recognition.getHeight());
} // end for() loop
} // end method telemetryTfod()
} // end class

View File

@ -0,0 +1,22 @@
package org.firstinspires.ftc.teamcode;
import com.acmerobotics.dashboard.canvas.Canvas;
import com.acmerobotics.roadrunner.Pose2d;
import com.acmerobotics.roadrunner.Vector2d;
public final class Drawing {
private Drawing() {}
public static void drawRobot(Canvas c, Pose2d t) {
final double ROBOT_RADIUS = 9;
c.setStrokeWidth(1);
c.strokeCircle(t.position.x, t.position.y, ROBOT_RADIUS);
Vector2d halfv = t.heading.vec().times(0.5 * ROBOT_RADIUS);
Vector2d p1 = t.position.plus(halfv);
Vector2d p2 = p1.plus(halfv);
c.strokeLine(p1.x, p1.y, p2.x, p2.y);
}
}
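// Usage sketch (an assumption, not part of this commit): stream the pose to
// the FTC Dashboard field view from an OpMode loop, for example:
//
//   TelemetryPacket packet = new TelemetryPacket();
//   Drawing.drawRobot(packet.fieldOverlay(), drive.pose);
//   FtcDashboard.getInstance().sendTelemetryPacket(packet);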

View File

@ -0,0 +1,8 @@
package org.firstinspires.ftc.teamcode;
import com.acmerobotics.roadrunner.Time;
import com.acmerobotics.roadrunner.Twist2dDual;
public interface Localizer {
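// Reports the robot-relative pose increment (and its velocity) accumulated
// since the previous call; MecanumDrive integrates these twists into 'pose'.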
Twist2dDual<Time> update();
}

View File

@ -0,0 +1,502 @@
package org.firstinspires.ftc.teamcode;
import static org.firstinspires.ftc.teamcode.cometbots.Constants.*;
import androidx.annotation.NonNull;
import com.acmerobotics.dashboard.canvas.Canvas;
import com.acmerobotics.dashboard.config.Config;
import com.acmerobotics.dashboard.telemetry.TelemetryPacket;
import com.acmerobotics.roadrunner.*;
import com.acmerobotics.roadrunner.AngularVelConstraint;
import com.acmerobotics.roadrunner.DualNum;
import com.acmerobotics.roadrunner.HolonomicController;
import com.acmerobotics.roadrunner.MecanumKinematics;
import com.acmerobotics.roadrunner.MinVelConstraint;
import com.acmerobotics.roadrunner.MotorFeedforward;
import com.acmerobotics.roadrunner.Pose2d;
import com.acmerobotics.roadrunner.Pose2dDual;
import com.acmerobotics.roadrunner.ProfileAccelConstraint;
import com.acmerobotics.roadrunner.Time;
import com.acmerobotics.roadrunner.TimeTrajectory;
import com.acmerobotics.roadrunner.TimeTurn;
import com.acmerobotics.roadrunner.TrajectoryActionBuilder;
import com.acmerobotics.roadrunner.TurnConstraints;
import com.acmerobotics.roadrunner.Twist2dDual;
import com.acmerobotics.roadrunner.VelConstraint;
import com.acmerobotics.roadrunner.ftc.DownsampledWriter;
import com.acmerobotics.roadrunner.ftc.Encoder;
import com.acmerobotics.roadrunner.ftc.FlightRecorder;
import com.acmerobotics.roadrunner.ftc.LazyImu;
import com.acmerobotics.roadrunner.ftc.LynxFirmware;
import com.acmerobotics.roadrunner.ftc.OverflowEncoder;
import com.acmerobotics.roadrunner.ftc.PositionVelocityPair;
import com.acmerobotics.roadrunner.ftc.RawEncoder;
import com.qualcomm.hardware.lynx.LynxModule;
import com.qualcomm.hardware.rev.RevHubOrientationOnRobot;
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.hardware.DcMotorEx;
import com.qualcomm.robotcore.hardware.DcMotorSimple;
import com.qualcomm.robotcore.hardware.HardwareMap;
import com.qualcomm.robotcore.hardware.IMU;
import com.qualcomm.robotcore.hardware.VoltageSensor;
import org.firstinspires.ftc.robotcore.external.navigation.AngleUnit;
import org.firstinspires.ftc.robotcore.external.navigation.YawPitchRollAngles;
import org.firstinspires.ftc.teamcode.cometbots.MovementLibrary;
import org.firstinspires.ftc.teamcode.messages.DriveCommandMessage;
import org.firstinspires.ftc.teamcode.messages.MecanumCommandMessage;
import org.firstinspires.ftc.teamcode.messages.MecanumLocalizerInputsMessage;
import org.firstinspires.ftc.teamcode.messages.PoseMessage;
import java.lang.Math;
import java.util.Arrays;
import java.util.LinkedList;
import java.util.List;
@Config
public final class MecanumDrive {
public static class Params {
// IMU orientation
// TODO: fill in these values based on
// see https://ftc-docs.firstinspires.org/en/latest/programming_resources/imu/imu.html?highlight=imu#physical-hub-mounting
public RevHubOrientationOnRobot.LogoFacingDirection logoFacingDirection =
RevHubOrientationOnRobot.LogoFacingDirection.FORWARD;
public RevHubOrientationOnRobot.UsbFacingDirection usbFacingDirection =
RevHubOrientationOnRobot.UsbFacingDirection.UP;
// drive model parameters
public double inPerTick = 119/5924.5;
public double lateralInPerTick = 124/6930.5;
public double trackWidthTicks = 1131;
// feedforward parameters (in tick units)
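// (Rough sketch of how these are used: command volts ~ kS*sign(v) + kV*v + kA*a,
// with v and a in ticks/sec, then divided by the measured battery voltage.)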
public double kS = 0.7476;
public double kV = 0.0043736;
public double kA = 0.00055; // NOTE: "00055" was an octal int literal (45); a decimal fraction is assumed here
// path profile parameters (in inches)
public double maxWheelVel = 50;
public double minProfileAccel = -30;
public double maxProfileAccel = 50;
// turn profile parameters (in radians)
public double maxAngVel = Math.PI; // shared with path
public double maxAngAccel = Math.PI;
// path controller gains
public double axialGain = 4.125;
public double lateralGain = 4.125;
public double headingGain = 4.25; // shared with turn
public double axialVelGain = 0.0;
public double lateralVelGain = 0.0;
public double headingVelGain = 0.0; // shared with turn
}
public static Params PARAMS = new Params();
public final MecanumKinematics kinematics = new MecanumKinematics(
PARAMS.inPerTick * PARAMS.trackWidthTicks, PARAMS.inPerTick / PARAMS.lateralInPerTick);
public final TurnConstraints defaultTurnConstraints = new TurnConstraints(
PARAMS.maxAngVel, -PARAMS.maxAngAccel, PARAMS.maxAngAccel);
public final VelConstraint defaultVelConstraint =
new MinVelConstraint(Arrays.asList(
kinematics.new WheelVelConstraint(PARAMS.maxWheelVel),
new AngularVelConstraint(PARAMS.maxAngVel)
));
public final AccelConstraint defaultAccelConstraint =
new ProfileAccelConstraint(PARAMS.minProfileAccel, PARAMS.maxProfileAccel);
public final DcMotorEx leftFront, leftBack, rightBack, rightFront;
public final VoltageSensor voltageSensor;
public final LazyImu lazyImu;
public final Localizer localizer;
public Pose2d pose;
private final LinkedList<Pose2d> poseHistory = new LinkedList<>();
private final DownsampledWriter estimatedPoseWriter = new DownsampledWriter("ESTIMATED_POSE", 50_000_000);
private final DownsampledWriter targetPoseWriter = new DownsampledWriter("TARGET_POSE", 50_000_000);
private final DownsampledWriter driveCommandWriter = new DownsampledWriter("DRIVE_COMMAND", 50_000_000);
private final DownsampledWriter mecanumCommandWriter = new DownsampledWriter("MECANUM_COMMAND", 50_000_000);
public class DriveLocalizer implements Localizer {
public final Encoder leftFront, leftBack, rightBack, rightFront;
public final IMU imu;
private int lastLeftFrontPos, lastLeftBackPos, lastRightBackPos, lastRightFrontPos;
private Rotation2d lastHeading;
private boolean initialized;
public DriveLocalizer() {
leftFront = new OverflowEncoder(new RawEncoder(MecanumDrive.this.leftFront));
leftBack = new OverflowEncoder(new RawEncoder(MecanumDrive.this.leftBack));
rightBack = new OverflowEncoder(new RawEncoder(MecanumDrive.this.rightBack));
rightFront = new OverflowEncoder(new RawEncoder(MecanumDrive.this.rightFront));
imu = lazyImu.get();
// TODO: reverse encoders if needed
// leftFront.setDirection(DcMotorSimple.Direction.REVERSE);
}
@Override
public Twist2dDual<Time> update() {
PositionVelocityPair leftFrontPosVel = leftFront.getPositionAndVelocity();
PositionVelocityPair leftBackPosVel = leftBack.getPositionAndVelocity();
PositionVelocityPair rightBackPosVel = rightBack.getPositionAndVelocity();
PositionVelocityPair rightFrontPosVel = rightFront.getPositionAndVelocity();
YawPitchRollAngles angles = imu.getRobotYawPitchRollAngles();
FlightRecorder.write("MECANUM_LOCALIZER_INPUTS", new MecanumLocalizerInputsMessage(
leftFrontPosVel, leftBackPosVel, rightBackPosVel, rightFrontPosVel, angles));
Rotation2d heading = Rotation2d.exp(angles.getYaw(AngleUnit.RADIANS));
if (!initialized) {
initialized = true;
lastLeftFrontPos = leftFrontPosVel.position;
lastLeftBackPos = leftBackPosVel.position;
lastRightBackPos = rightBackPosVel.position;
lastRightFrontPos = rightFrontPosVel.position;
lastHeading = heading;
return new Twist2dDual<>(
Vector2dDual.constant(new Vector2d(0.0, 0.0), 2),
DualNum.constant(0.0, 2)
);
}
double headingDelta = heading.minus(lastHeading);
Twist2dDual<Time> twist = kinematics.forward(new MecanumKinematics.WheelIncrements<>(
new DualNum<Time>(new double[]{
(leftFrontPosVel.position - lastLeftFrontPos),
leftFrontPosVel.velocity,
}).times(PARAMS.inPerTick),
new DualNum<Time>(new double[]{
(leftBackPosVel.position - lastLeftBackPos),
leftBackPosVel.velocity,
}).times(PARAMS.inPerTick),
new DualNum<Time>(new double[]{
(rightBackPosVel.position - lastRightBackPos),
rightBackPosVel.velocity,
}).times(PARAMS.inPerTick),
new DualNum<Time>(new double[]{
(rightFrontPosVel.position - lastRightFrontPos),
rightFrontPosVel.velocity,
}).times(PARAMS.inPerTick)
));
lastLeftFrontPos = leftFrontPosVel.position;
lastLeftBackPos = leftBackPosVel.position;
lastRightBackPos = rightBackPosVel.position;
lastRightFrontPos = rightFrontPosVel.position;
lastHeading = heading;
return new Twist2dDual<>(
twist.line,
DualNum.cons(headingDelta, twist.angle.drop(1))
);
}
}
public MecanumDrive(HardwareMap hardwareMap, Pose2d pose) {
this.pose = pose;
LynxFirmware.throwIfModulesAreOutdated(hardwareMap);
for (LynxModule module : hardwareMap.getAll(LynxModule.class)) {
module.setBulkCachingMode(LynxModule.BulkCachingMode.AUTO);
}
// TODO: make sure your config has motors with these names (or change them)
// see https://ftc-docs.firstinspires.org/en/latest/hardware_and_software_configuration/configuring/index.html
leftFront = hardwareMap.get(DcMotorEx.class, LT_FRONT);
leftBack = hardwareMap.get(DcMotorEx.class, LT_BACK);
rightBack = hardwareMap.get(DcMotorEx.class, RT_BACK);
rightFront = hardwareMap.get(DcMotorEx.class, RT_FRONT);
leftFront.setZeroPowerBehavior(DcMotor.ZeroPowerBehavior.BRAKE);
leftBack.setZeroPowerBehavior(DcMotor.ZeroPowerBehavior.BRAKE);
rightBack.setZeroPowerBehavior(DcMotor.ZeroPowerBehavior.BRAKE);
rightFront.setZeroPowerBehavior(DcMotor.ZeroPowerBehavior.BRAKE);
/*
Put all motors in an array for easier handling
NOTE: Placement of motor in array is important especially when dealing with functions
that set direction
*/
DcMotor[] motors = {rightFront, rightBack, leftFront, leftBack};
/*
Set all wheels to the forward direction using the motors array
*/
MovementLibrary.setDirectionForward(motors);
// TODO: make sure your config has an IMU with this name (can be BNO or BHI)
// see https://ftc-docs.firstinspires.org/en/latest/hardware_and_software_configuration/configuring/index.html
lazyImu = new LazyImu(hardwareMap, "imu", new RevHubOrientationOnRobot(
PARAMS.logoFacingDirection, PARAMS.usbFacingDirection));
voltageSensor = hardwareMap.voltageSensor.iterator().next();
localizer = new DriveLocalizer();
FlightRecorder.write("MECANUM_PARAMS", PARAMS);
}
public void setDrivePowers(PoseVelocity2d powers) {
MecanumKinematics.WheelVelocities<Time> wheelVels = new MecanumKinematics(1).inverse(
PoseVelocity2dDual.constant(powers, 1));
double maxPowerMag = 1;
for (DualNum<Time> power : wheelVels.all()) {
maxPowerMag = Math.max(maxPowerMag, power.value());
}
leftFront.setPower(wheelVels.leftFront.get(0) / maxPowerMag);
leftBack.setPower(wheelVels.leftBack.get(0) / maxPowerMag);
rightBack.setPower(wheelVels.rightBack.get(0) / maxPowerMag);
rightFront.setPower(wheelVels.rightFront.get(0) / maxPowerMag);
}
public final class FollowTrajectoryAction implements Action {
public final TimeTrajectory timeTrajectory;
private double beginTs = -1;
private final double[] xPoints, yPoints;
public FollowTrajectoryAction(TimeTrajectory t) {
timeTrajectory = t;
List<Double> disps = com.acmerobotics.roadrunner.Math.range(
0, t.path.length(),
Math.max(2, (int) Math.ceil(t.path.length() / 2)));
xPoints = new double[disps.size()];
yPoints = new double[disps.size()];
for (int i = 0; i < disps.size(); i++) {
Pose2d p = t.path.get(disps.get(i), 1).value();
xPoints[i] = p.position.x;
yPoints[i] = p.position.y;
}
}
@Override
public boolean run(@NonNull TelemetryPacket p) {
double t;
if (beginTs < 0) {
beginTs = Actions.now();
t = 0;
} else {
t = Actions.now() - beginTs;
}
if (t >= timeTrajectory.duration) {
leftFront.setPower(0);
leftBack.setPower(0);
rightBack.setPower(0);
rightFront.setPower(0);
return false;
}
Pose2dDual<Time> txWorldTarget = timeTrajectory.get(t);
targetPoseWriter.write(new PoseMessage(txWorldTarget.value()));
PoseVelocity2d robotVelRobot = updatePoseEstimate();
PoseVelocity2dDual<Time> command = new HolonomicController(
PARAMS.axialGain, PARAMS.lateralGain, PARAMS.headingGain,
PARAMS.axialVelGain, PARAMS.lateralVelGain, PARAMS.headingVelGain
)
.compute(txWorldTarget, pose, robotVelRobot);
driveCommandWriter.write(new DriveCommandMessage(command));
MecanumKinematics.WheelVelocities<Time> wheelVels = kinematics.inverse(command);
double voltage = voltageSensor.getVoltage();
final MotorFeedforward feedforward = new MotorFeedforward(PARAMS.kS,
PARAMS.kV / PARAMS.inPerTick, PARAMS.kA / PARAMS.inPerTick);
double leftFrontPower = feedforward.compute(wheelVels.leftFront) / voltage;
double leftBackPower = feedforward.compute(wheelVels.leftBack) / voltage;
double rightBackPower = feedforward.compute(wheelVels.rightBack) / voltage;
double rightFrontPower = feedforward.compute(wheelVels.rightFront) / voltage;
mecanumCommandWriter.write(new MecanumCommandMessage(
voltage, leftFrontPower, leftBackPower, rightBackPower, rightFrontPower
));
leftFront.setPower(leftFrontPower);
leftBack.setPower(leftBackPower);
rightBack.setPower(rightBackPower);
rightFront.setPower(rightFrontPower);
p.put("x", pose.position.x);
p.put("y", pose.position.y);
p.put("heading (deg)", Math.toDegrees(pose.heading.toDouble()));
Pose2d error = txWorldTarget.value().minusExp(pose);
p.put("xError", error.position.x);
p.put("yError", error.position.y);
p.put("headingError (deg)", Math.toDegrees(error.heading.toDouble()));
// only draw when active; only one drive action should be active at a time
Canvas c = p.fieldOverlay();
drawPoseHistory(c);
c.setStroke("#4CAF50");
Drawing.drawRobot(c, txWorldTarget.value());
c.setStroke("#3F51B5");
Drawing.drawRobot(c, pose);
c.setStroke("#4CAF50FF");
c.setStrokeWidth(1);
c.strokePolyline(xPoints, yPoints);
return true;
}
@Override
public void preview(Canvas c) {
c.setStroke("#4CAF507A");
c.setStrokeWidth(1);
c.strokePolyline(xPoints, yPoints);
}
}
public final class TurnAction implements Action {
private final TimeTurn turn;
private double beginTs = -1;
public TurnAction(TimeTurn turn) {
this.turn = turn;
}
@Override
public boolean run(@NonNull TelemetryPacket p) {
double t;
if (beginTs < 0) {
beginTs = Actions.now();
t = 0;
} else {
t = Actions.now() - beginTs;
}
if (t >= turn.duration) {
leftFront.setPower(0);
leftBack.setPower(0);
rightBack.setPower(0);
rightFront.setPower(0);
return false;
}
Pose2dDual<Time> txWorldTarget = turn.get(t);
targetPoseWriter.write(new PoseMessage(txWorldTarget.value()));
PoseVelocity2d robotVelRobot = updatePoseEstimate();
PoseVelocity2dDual<Time> command = new HolonomicController(
PARAMS.axialGain, PARAMS.lateralGain, PARAMS.headingGain,
PARAMS.axialVelGain, PARAMS.lateralVelGain, PARAMS.headingVelGain
)
.compute(txWorldTarget, pose, robotVelRobot);
driveCommandWriter.write(new DriveCommandMessage(command));
MecanumKinematics.WheelVelocities<Time> wheelVels = kinematics.inverse(command);
double voltage = voltageSensor.getVoltage();
final MotorFeedforward feedforward = new MotorFeedforward(PARAMS.kS,
PARAMS.kV / PARAMS.inPerTick, PARAMS.kA / PARAMS.inPerTick);
double leftFrontPower = feedforward.compute(wheelVels.leftFront) / voltage;
double leftBackPower = feedforward.compute(wheelVels.leftBack) / voltage;
double rightBackPower = feedforward.compute(wheelVels.rightBack) / voltage;
double rightFrontPower = feedforward.compute(wheelVels.rightFront) / voltage;
mecanumCommandWriter.write(new MecanumCommandMessage(
voltage, leftFrontPower, leftBackPower, rightBackPower, rightFrontPower
));
leftFront.setPower(leftFrontPower);
leftBack.setPower(leftBackPower);
rightBack.setPower(rightBackPower);
rightFront.setPower(rightFrontPower);
Canvas c = p.fieldOverlay();
drawPoseHistory(c);
c.setStroke("#4CAF50");
Drawing.drawRobot(c, txWorldTarget.value());
c.setStroke("#3F51B5");
Drawing.drawRobot(c, pose);
c.setStroke("#7C4DFFFF");
c.fillCircle(turn.beginPose.position.x, turn.beginPose.position.y, 2);
return true;
}
@Override
public void preview(Canvas c) {
c.setStroke("#7C4DFF7A");
c.fillCircle(turn.beginPose.position.x, turn.beginPose.position.y, 2);
}
}
public PoseVelocity2d updatePoseEstimate() {
Twist2dDual<Time> twist = localizer.update();
pose = pose.plus(twist.value());
poseHistory.add(pose);
while (poseHistory.size() > 100) {
poseHistory.removeFirst();
}
estimatedPoseWriter.write(new PoseMessage(pose));
return twist.velocity().value();
}
private void drawPoseHistory(Canvas c) {
double[] xPoints = new double[poseHistory.size()];
double[] yPoints = new double[poseHistory.size()];
int i = 0;
for (Pose2d t : poseHistory) {
xPoints[i] = t.position.x;
yPoints[i] = t.position.y;
i++;
}
c.setStrokeWidth(1);
c.setStroke("#3F51B5");
c.strokePolyline(xPoints, yPoints);
}
public TrajectoryActionBuilder actionBuilder(Pose2d beginPose) {
return new TrajectoryActionBuilder(
TurnAction::new,
FollowTrajectoryAction::new,
new TrajectoryBuilderParams(
1e-6,
new ProfileParams(
0.25, 0.1, 1e-2
)
),
beginPose, 0.0,
defaultTurnConstraints,
defaultVelConstraint, defaultAccelConstraint
);
}
}
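For reference, here is a minimal TeleOp sketch showing how setDrivePowers and updatePoseEstimate are typically called from a LinearOpMode. The class name SampleTeleOpDrive, the stick mapping, and the starting pose are illustrative assumptions, not part of this diff; the sketch also assumes MecanumDrive sits in org.firstinspires.ftc.teamcode, as in the Road Runner quickstart.

package org.firstinspires.ftc.teamcode;

import com.acmerobotics.roadrunner.Pose2d;
import com.acmerobotics.roadrunner.PoseVelocity2d;
import com.acmerobotics.roadrunner.Vector2d;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;

@TeleOp(name = "SampleTeleOpDrive")
public class SampleTeleOpDrive extends LinearOpMode {
    @Override
    public void runOpMode() {
        MecanumDrive drive = new MecanumDrive(hardwareMap, new Pose2d(0, 0, 0));
        waitForStart();
        while (opModeIsActive()) {
            // Sticks command a robot-relative velocity; setDrivePowers
            // normalizes the wheel commands so no wheel power exceeds 1.
            drive.setDrivePowers(new PoseVelocity2d(
                    new Vector2d(-gamepad1.left_stick_y, -gamepad1.left_stick_x),
                    -gamepad1.right_stick_x));
            drive.updatePoseEstimate();
            telemetry.addData("heading (deg)", Math.toDegrees(drive.pose.heading.toDouble()));
            telemetry.update();
        }
    }
}

An autonomous sketch of the actionBuilder pipeline follows, under the same assumptions; Actions.runBlocking is taken to come from the quickstart's com.acmerobotics.roadrunner.ftc module, and the pose and distances are placeholders.

package org.firstinspires.ftc.teamcode;

import com.acmerobotics.roadrunner.Pose2d;
import com.acmerobotics.roadrunner.ftc.Actions;
import com.qualcomm.robotcore.eventloop.opmode.Autonomous;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;

@Autonomous(name = "SampleAuto")
public class SampleAuto extends LinearOpMode {
    @Override
    public void runOpMode() {
        Pose2d beginPose = new Pose2d(0, 0, 0); // replace with the real starting pose
        MecanumDrive drive = new MecanumDrive(hardwareMap, beginPose);
        waitForStart();
        // Drive 24 in along +x, then turn 90 degrees counterclockwise.
        Actions.runBlocking(
                drive.actionBuilder(beginPose)
                        .lineToX(24)
                        .turn(Math.toRadians(90))
                        .build());
    }
}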

Constants.java

@@ -0,0 +1,12 @@
package org.firstinspires.ftc.teamcode.cometbots;
public class Constants {
public static double CB_BASE_SPEED = 0.2;
public static String RT_FRONT = "right hand wheel";
public static String LT_FRONT = "left hand wheel";
public static String RT_BACK = "right leg wheel";
public static String LT_BACK = "left leg wheel";
}

MovementLibrary.java

@@ -0,0 +1,58 @@
package org.firstinspires.ftc.teamcode.cometbots;
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.hardware.DcMotorSimple;
public class MovementLibrary {
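/*
Every method below assumes the motor array ordering established in
MecanumDrive: [0] rightFront, [1] rightBack, [2] leftFront, [3] leftBack.
*/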
public static void setDirectionForward(DcMotor[] motors) {
setLeftSideForward(motors);
setRightSideForward(motors);
}
public static void setLeftSideForward(DcMotor[] motors) {
motors[2].setDirection(DcMotor.Direction.FORWARD);
motors[3].setDirection(DcMotor.Direction.FORWARD);
}
public static void setRightSideForward(DcMotor[] motors) {
motors[0].setDirection(DcMotor.Direction.FORWARD);
motors[1].setDirection(DcMotor.Direction.FORWARD);
}
public static void setLeftSideBackward(DcMotor[] motors) {
motors[2].setDirection(DcMotor.Direction.REVERSE);
motors[3].setDirection(DcMotor.Direction.REVERSE);
}
public static void setRightSideBackward(DcMotor[] motors) {
motors[0].setDirection(DcMotor.Direction.REVERSE);
motors[1].setDirection(DcMotor.Direction.REVERSE);
}
public static void setDirectionBackward(DcMotor[] motors) {
setLeftSideBackward(motors);
setRightSideBackward(motors);
}
public static void setSpeed(DcMotor[] motors, double speedValue) {
for (DcMotor motor : motors) {
motor.setPower(speedValue);
}
}
public static void setDirectionLeft(DcMotor[] motors) {
motors[0].setDirection(DcMotor.Direction.REVERSE);
motors[1].setDirection(DcMotor.Direction.REVERSE);
motors[2].setDirection(DcMotor.Direction.FORWARD);
motors[3].setDirection(DcMotor.Direction.FORWARD);
}
public static void setDirectionRight(DcMotor[] motors) {
motors[0].setDirection(DcMotor.Direction.FORWARD);
motors[1].setDirection(DcMotor.Direction.FORWARD);
motors[2].setDirection(DcMotor.Direction.REVERSE);
motors[3].setDirection(DcMotor.Direction.REVERSE);
}
}
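A minimal sketch of how MovementLibrary and the Constants names combine in a TeleOp; the class name MovementLibraryDemo and the button mapping are illustrative assumptions, not part of this diff.

package org.firstinspires.ftc.teamcode.cometbots;

import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.DcMotor;

@TeleOp(name = "MovementLibraryDemo")
public class MovementLibraryDemo extends LinearOpMode {
    @Override
    public void runOpMode() {
        // Order matters: MovementLibrary indexes this array as
        // [0] rightFront, [1] rightBack, [2] leftFront, [3] leftBack.
        DcMotor[] motors = {
                hardwareMap.get(DcMotor.class, Constants.RT_FRONT),
                hardwareMap.get(DcMotor.class, Constants.RT_BACK),
                hardwareMap.get(DcMotor.class, Constants.LT_FRONT),
                hardwareMap.get(DcMotor.class, Constants.LT_BACK)
        };
        MovementLibrary.setDirectionForward(motors);
        waitForStart();
        while (opModeIsActive()) {
            // Hold A to creep forward at the base speed; release to stop.
            MovementLibrary.setSpeed(motors, gamepad1.a ? Constants.CB_BASE_SPEED : 0);
        }
    }
}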

DriveCommandMessage.java

@@ -0,0 +1,24 @@
package org.firstinspires.ftc.teamcode.messages;
import com.acmerobotics.roadrunner.PoseVelocity2dDual;
import com.acmerobotics.roadrunner.Time;
public final class DriveCommandMessage {
public long timestamp;
public double forwardVelocity;
public double forwardAcceleration;
public double lateralVelocity;
public double lateralAcceleration;
public double angularVelocity;
public double angularAcceleration;
public DriveCommandMessage(PoseVelocity2dDual<Time> poseVelocity) {
this.timestamp = System.nanoTime();
this.forwardVelocity = poseVelocity.linearVel.x.get(0);
this.forwardAcceleration = poseVelocity.linearVel.x.get(1);
this.lateralVelocity = poseVelocity.linearVel.y.get(0);
this.lateralAcceleration = poseVelocity.linearVel.y.get(1);
this.angularVelocity = poseVelocity.angVel.get(0);
this.angularAcceleration = poseVelocity.angVel.get(1);
}
}

MecanumCommandMessage.java

@@ -0,0 +1,19 @@
package org.firstinspires.ftc.teamcode.messages;
public final class MecanumCommandMessage {
public long timestamp;
public double voltage;
public double leftFrontPower;
public double leftBackPower;
public double rightBackPower;
public double rightFrontPower;
public MecanumCommandMessage(double voltage, double leftFrontPower, double leftBackPower, double rightBackPower, double rightFrontPower) {
this.timestamp = System.nanoTime();
this.voltage = voltage;
this.leftFrontPower = leftFrontPower;
this.leftBackPower = leftBackPower;
this.rightBackPower = rightBackPower;
this.rightFrontPower = rightFrontPower;
}
}

MecanumLocalizerInputsMessage.java

@@ -0,0 +1,30 @@
package org.firstinspires.ftc.teamcode.messages;
import com.acmerobotics.roadrunner.ftc.PositionVelocityPair;
import org.firstinspires.ftc.robotcore.external.navigation.AngleUnit;
import org.firstinspires.ftc.robotcore.external.navigation.YawPitchRollAngles;
public final class MecanumLocalizerInputsMessage {
public long timestamp;
public PositionVelocityPair leftFront;
public PositionVelocityPair leftBack;
public PositionVelocityPair rightBack;
public PositionVelocityPair rightFront;
public double yaw;
public double pitch;
public double roll;
public MecanumLocalizerInputsMessage(PositionVelocityPair leftFront, PositionVelocityPair leftBack, PositionVelocityPair rightBack, PositionVelocityPair rightFront, YawPitchRollAngles angles) {
this.timestamp = System.nanoTime();
this.leftFront = leftFront;
this.leftBack = leftBack;
this.rightBack = rightBack;
this.rightFront = rightFront;
this.yaw = angles.getYaw(AngleUnit.RADIANS);
this.pitch = angles.getPitch(AngleUnit.RADIANS);
this.roll = angles.getRoll(AngleUnit.RADIANS);
}
}

PoseMessage.java

@@ -0,0 +1,17 @@
package org.firstinspires.ftc.teamcode.messages;
import com.acmerobotics.roadrunner.Pose2d;
public final class PoseMessage {
public long timestamp;
public double x;
public double y;
public double heading;
public PoseMessage(Pose2d pose) {
this.timestamp = System.nanoTime();
this.x = pose.position.x;
this.y = pose.position.y;
this.heading = pose.heading.toDouble();
}
}
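Each message class above is a plain data holder stamped with System.nanoTime(); MecanumDrive pairs them with DownsampledWriter to cap how often records reach the FlightRecorder log. A minimal sketch of that pattern, reusing the same com.acmerobotics.roadrunner.ftc API seen above (the class name PoseLoggingExample is hypothetical):

package org.firstinspires.ftc.teamcode.messages;

import com.acmerobotics.roadrunner.Pose2d;
import com.acmerobotics.roadrunner.ftc.DownsampledWriter;

public final class PoseLoggingExample {
    // Write at most one ESTIMATED_POSE record per 50 ms (50_000_000 ns),
    // the same interval MecanumDrive uses for its writers.
    private final DownsampledWriter poseWriter =
            new DownsampledWriter("ESTIMATED_POSE", 50_000_000);

    public void log(Pose2d pose) {
        poseWriter.write(new PoseMessage(pose));
    }
}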

@@ -0,0 +1,4 @@
https://www.youtube.com/watch?v=aeMWWvteF2U
https://www.youtube.com/watch?v=1xdo8QP6_mI

@@ -54,7 +54,7 @@ android {
defaultConfig {
signingConfig signingConfigs.debug
applicationId 'com.qualcomm.ftcrobotcontroller'
- minSdkVersion 23
+ minSdkVersion 24
//noinspection ExpiredTargetSdkVersion
targetSdkVersion 28
@@ -87,7 +87,6 @@ android {
versionName vName
}
// Advanced user code might just want to use Vuforia directly, so we set up the libs as needed
// http://google.github.io/android-gradle-dsl/current/com.android.build.gradle.internal.dsl.BuildType.html
buildTypes {
release {
@@ -122,8 +121,5 @@
}
repositories {
- flatDir {
- dirs rootProject.file('libs')
- }
}

@@ -1,22 +1,23 @@
repositories {
mavenCentral()
google() // Needed for androidx
flatDir {
dirs rootProject.file('libs')
}
+ maven { url = 'https://maven.brott.dev/' }
}
dependencies {
- implementation 'org.firstinspires.ftc:Inspection:8.1.0'
- implementation 'org.firstinspires.ftc:Blocks:8.1.0'
- implementation 'org.firstinspires.ftc:Tfod:8.1.0'
- implementation 'org.firstinspires.ftc:RobotCore:8.1.0'
- implementation 'org.firstinspires.ftc:RobotServer:8.1.0'
- implementation 'org.firstinspires.ftc:OnBotJava:8.1.0'
- implementation 'org.firstinspires.ftc:Hardware:8.1.0'
- implementation 'org.firstinspires.ftc:FtcCommon:8.1.0'
- implementation 'org.tensorflow:tensorflow-lite-task-vision:0.2.0'
+ implementation 'org.firstinspires.ftc:Inspection:9.1.0'
+ implementation 'org.firstinspires.ftc:Blocks:9.1.0'
+ implementation 'org.firstinspires.ftc:Tfod:9.1.0'
+ implementation 'org.firstinspires.ftc:RobotCore:9.1.0'
+ implementation 'org.firstinspires.ftc:RobotServer:9.1.0'
+ implementation 'org.firstinspires.ftc:OnBotJava:9.1.0'
+ implementation 'org.firstinspires.ftc:Hardware:9.1.0'
+ implementation 'org.firstinspires.ftc:FtcCommon:9.1.0'
+ implementation 'org.firstinspires.ftc:Vision:9.1.0'
+ implementation 'org.firstinspires.ftc:gameAssets-CenterStage:1.0.0'
+ implementation 'org.tensorflow:tensorflow-lite-task-vision:0.4.3'
+ runtimeOnly 'org.tensorflow:tensorflow-lite:2.12.0'
implementation 'androidx.appcompat:appcompat:1.2.0'
- implementation 'org.firstinspires.ftc:gameAssets-PowerPlay:1.0.0'
+ implementation 'com.acmerobotics.dashboard:dashboard:0.4.15' // FTC Dashboard
}

@@ -10,6 +10,7 @@ buildscript {
google()
}
dependencies {
+ // Note for FTC Teams: Do not modify this yourself.
classpath 'com.android.tools.build:gradle:7.2.0'
}
}
@@ -25,8 +26,4 @@ allprojects {
repositories {
mavenCentral()
- flatDir {
- dirs '../libs'
- }
}

Binary file not shown. (Before: 797 KiB)

Some files were not shown because too many files have changed in this diff.